Error Importing Azure Bacpac File to Local DB: Incorrect Syntax Near 'EXTERNAL'

Error Importing DB from SQL Azure to a localDB

Can you copy that Azure SQL database under a new name using the Azure portal, as explained here?

Remove the external data source from the new database and then export it as a bacpac. After that, delete the database created by the copy operation and import the bacpac into your LocalDB instance.
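If you prefer T-SQL to the portal, the copy-and-clean-up steps above can be sketched as follows (all object names are placeholders):

```sql
-- Run in the master database of the Azure SQL logical server.
CREATE DATABASE MyDb_Copy AS COPY OF MyDb;

-- Then, connected to MyDb_Copy, drop the external objects before exporting.
-- External tables must go first, since they depend on the data source.
DROP EXTERNAL TABLE dbo.MyExternalTable;
DROP EXTERNAL DATA SOURCE MyExternalSource;
```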

Importing Azure .bacpac to my local 2014 SQL Server no longer working

You need to:

  • Install SQL Server 2016 and import into that, because the database scoped credential feature is only supported in SQL Server 2016 and Azure SQL Database. Otherwise you might see:

Error SQL72014: .Net SqlClient Data Provider: Msg 195, Level 15, State
10, Procedure spX, Line 18 'TRIM' is not a recognized built-in
function name.

  • Update DAC\bin\SqlPackage.exe, because there is a known compatibility issue; otherwise you will see:

Unable to connect to master or target server 'copyX'. You must have a user with the same password in master or target server 'copyX'.

  • Remove the SQL credentials from the bacpac's model.xml and update the hash in Origin.xml, either the long way (by hand) or the short way (with a PowerShell script). Otherwise you might see:

Error SQL72014: .Net SqlClient Data Provider: Msg 102, Level 15, State
1, Line 1 Incorrect syntax near ';'. Error SQL72045: Script execution
error. The executed script: CREATE MASTER KEY;
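The patching step above can be sketched in Python as well, since a .bacpac is an ordinary zip archive: drop the credential and master-key elements from model.xml, then rewrite the model.xml checksum recorded in Origin.xml. The element names and file layout here are assumptions based on typical DacFx packages, not taken from any official tool.

```python
import hashlib
import re
import zipfile

def patch_bacpac(src, dst, element_types):
    """Copy the bacpac at src to dst, removing the given <Element Type="...">
    entries from model.xml and updating the SHA-256 in Origin.xml."""
    with zipfile.ZipFile(src) as zin:
        model = zin.read("model.xml").decode("utf-8")
        origin = zin.read("Origin.xml").decode("utf-8")

        # Remove each matching element, whether self-closing ( ... />) or
        # paired ( ... </Element>). Nested elements are not handled;
        # this is a sketch, not a general XML rewrite.
        for t in element_types:
            pattern = (r'<Element Type="%s"(?:[^>]*/>|.*?</Element>)\s*'
                       % re.escape(t))
            model = re.sub(pattern, "", model, flags=re.DOTALL)

        # Origin.xml records an uppercase SHA-256 of model.xml; update it
        # so the import tool accepts the patched model.
        new_hash = hashlib.sha256(model.encode("utf-8")).hexdigest().upper()
        origin = re.sub(
            r'(<Checksum Uri="/model.xml">)[0-9A-Fa-f]+(</Checksum>)',
            r"\g<1>%s\g<2>" % new_hash, origin)

        with zipfile.ZipFile(dst, "w", zipfile.ZIP_DEFLATED) as zout:
            for name in zin.namelist():
                if name not in ("model.xml", "Origin.xml"):
                    zout.writestr(name, zin.read(name))
            zout.writestr("model.xml", model)
            zout.writestr("Origin.xml", origin)

# e.g. patch_bacpac("MyBackUp.bacpac", "MyBackUp-patched.bacpac",
#                   ["SqlDatabaseCredential", "SqlMasterKey"])
```

The patched file can then be fed to SqlPackage.exe as shown below.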

Open a normal command prompt window and type:

cd C:\Program Files\Microsoft SQL Server\150\DAC\bin

Then type:

.\SqlPackage.exe /Action:Import /SourceFile:"C:\path\MyBackUp-patched.bacpac" /TargetConnectionString:"Data Source=ONO2012-LAPTOP\MSSQLSERVER01;Initial Catalog=copyX; Integrated Security=true;"

Cannot import SQL Azure bacpac to 2016 CTP

Since this question was also asked and answered on MSDN, I will share the answer here.
https://social.msdn.microsoft.com/Forums/azure/en-US/0b025206-5ea4-4ecb-b475-c7fabdb6df64/cannot-import-sql-azure-bacpac-to-2016-ctp?forum=ssdsgetstarted

Text from linked answer:

I suspect what's going wrong here is that the export operation was performed using a DB instance that was changing while the export was on-going. This can cause the exported table data to be inconsistent because, unlike SQL Server's physical backup/restore, exports do not guarantee transactional consistency. Instead, they're essentially performed by connecting to each table in the database in turn and running select *. When a foreign key relationship exists between two tables and the read table data is inconsistent, it results in an error during import after the data is written to the database and the import code attempts to re-enable the foreign key. We suggest using the database copy mechanism (create database copyDb as copy of originalDb), which guarantees a copy with transactional consistency, and then exporting from the non-changing database copy.
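The copy mechanism the answer refers to is plain T-SQL (database names here are placeholders):

```sql
-- Run in the master database of the Azure SQL logical server.
CREATE DATABASE copyDb AS COPY OF originalDb;

-- The copy completes asynchronously; wait until it is ONLINE before exporting.
SELECT name, state_desc FROM sys.databases WHERE name = 'copyDb';
```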

Error when exporting bacpac file from local database

It is best to use the latest GA version of SSMS, especially when targeting Azure SQL Database. Newer SSMS versions can be installed side-by-side with older ones, but that is generally needed only when you must support very old SQL Server versions such as SQL Server 2005. The latest GA version of SSMS (as of this writing) supports SQL Server 2008 through SQL Server 2017, and the current preview version also supports SQL Server 2019.

Restoring database from Azure BACPAC gives an error

It's always best to use the latest SSMS version with Azure SQL Database to keep up with its incremental changes. The problem is that you were using an older version of SSMS (17.3).

You can get the latest SSMS version from the menu: Tools > Check for Updates.

Azure Import Error: The internal target platform type SqlAzureV12DatabaseSchemaProvider does not support schema file version '3.1'

Finally figured out what happened. It's a specific case, but maybe it helps someone else.
We tried to use elastic query to run queries across databases, which requires creating database scoped credentials. When the package was imported, it tried to do the same locally and failed executing this:

CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';

Since we decided to use a different approach, I dropped the scoped credential and the external data source (the credential couldn't be dropped without first dropping the data source):

DROP EXTERNAL DATA SOURCE Source;
DROP DATABASE SCOPED CREDENTIAL Admin;

Now everything is working again. Just be aware that you cannot import a database from Azure into a local server if it has database scoped credentials created.
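Before exporting, you can check for leftovers with the catalog views (available in Azure SQL Database and SQL Server 2016 and later):

```sql
-- Both queries should return no rows before you export the bacpac
-- for an older on-premises target.
SELECT name FROM sys.database_scoped_credentials;
SELECT name, type_desc FROM sys.external_data_sources;
```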


