How To Convert .dmp File Into .sql File

Posted by admin on 08.08.2019

How to read the contents of a SQL Server dump file: I have an issue where my SQL Server database behaved unexpectedly, the services stopped, and a SQL core dump was logged. I tried to open this core dump with the Windows debugging tools: I set the symbol path and tried to open the core dump, but I got the message 'This dump file has an...'.

I am required to provide a script file that will recreate an entire database, including schema, users, objects and privileges. All of this is contained in my full export. Is it possible to create a SQL script file that provides the same results as a full import from the .dmp file?

MySQL dump file: the OraDump-to-MySQL converter allows users to make an indirect conversion and get more control over the process. Following this approach, the program converts the Oracle dump data into a local MySQL dump file instead of moving it to the MySQL server directly.

By: Nat Sundar | Last Updated: 2018-03-01 | Related Tips: Import and Export

Problem

Most web applications are designed to exchange data in JSON format, and application logs are often available as JSON as well. So it is evident that we need to load JSON files into the database for analysis and reporting. In this tip, I will load sample JSON files into SQL Server.

Solution

Following are a couple of examples of how to load JSON files into SQL Server.

Importing simple JSON file into SQL Server

In this example, the data file contains the order details such as 'OrderID', 'CustomerID' and 'OrderStatus' for two orders. The below image represents the supplied JSON data file used to load data into SQL Server.
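For illustration, an Orders.json file with the fields described above might look something like this (the two orders and their values are purely hypothetical):

[
  { "OrderID": 10248, "CustomerID": "VINET", "OrderStatus": "Shipped" },
  { "OrderID": 10249, "CustomerID": "TOMSP", "OrderStatus": "Pending" }
]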

Importing files using OPENROWSET

OPENROWSET is a table-valued function that can read data from a file. Used this way, the function returns a table with a single column (column name: BulkColumn). With a simple syntax, this function will load the entire file content as a single text value. Using this method, the entire file content can be loaded into a table, or it can be stored in a variable for further processing.

The below script will load the contents of the Orders JSON file.
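A minimal sketch of such a script, assuming the file has been saved to the hypothetical path C:\JSON\Orders.json (adjust the path to your environment):

-- Load the whole Orders.json file as a single text value.
SELECT BulkColumn
FROM OPENROWSET (BULK 'C:\JSON\Orders.json', SINGLE_CLOB) AS j;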

The actual contents of the JSON file can be read via the 'BulkColumn' column and stored in a variable. It is always recommended to validate the JSON with the ISJSON function before trying to use the JSON data. This function returns 1 if the input is valid JSON.

The below script will read the Orders.JSON file and store its contents in a variable with the help of the 'BulkColumn' column. In addition, this script will also validate the JSON using the ISJSON function.
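One way this script might look, again assuming the hypothetical path C:\JSON\Orders.json:

-- Read the file into a variable via the BulkColumn column, then validate it.
DECLARE @JSON NVARCHAR(MAX);

SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\JSON\Orders.json', SINGLE_CLOB) AS j;

IF ISJSON(@JSON) = 1
    PRINT 'Valid JSON';
ELSE
    PRINT 'Invalid JSON';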

Using OpenJSON to query JSON

The OpenJSON function will help us parse the JSON content and transform it into a relational result set. Once the JSON content has been transformed by OpenJSON, the result set can be used for further processing.


The OpenJSON function accepts JSON as a parameter and it returns a dataset in two different formats:

  • Format #1: It returns the key:value pairs of the first level elements in the JSON.
  • Format #2: It returns all the elements with their indexes.

The OPENJSON function can be used in two ways to query a JSON dataset.

Method 1 - OPENJSON with the default output

The OpenJSON function can be called without passing any additional arguments. In this case, it will return three columns: the first column represents the index (key) of the element, the second column holds the value of the element, and the third column indicates the data type of the value.

Method 2 - OPENJSON output with an explicit structure

The WITH clause of the OpenJSON function allows you to define the schema for the output result set. The WITH clause is optional. In it, you can define a list of output columns, their types, and the paths of the JSON source properties for each output value.

OpenJSON iterates through the JSON object, reads the value on the specified path for each defined column, and generates the result set.

For each element in the JSON, OpenJSON generates a new row in the output table. If there are two elements in the JSON, then they will be converted into two rows in the returned result set.

In addition, OpenJSON uses the column_name type 'json_path' syntax to read each value and convert it into the specified type.

Compatibility Level for OPENJSON

The OpenJSON function is available only when the database compatibility level is 130 or higher. If the database compatibility level is lower than 130, SQL Server can't use the OpenJSON function.
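The compatibility level can be checked and, if necessary, raised with a couple of standard statements (the database name below is a placeholder):

-- Check the current compatibility level (replace SampleDB with your database name).
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'SampleDB';

-- Raise it to 130 (the SQL Server 2016 level) if required.
ALTER DATABASE SampleDB SET COMPATIBILITY_LEVEL = 130;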

In the below example, the OpenJSON function is simply parsing the JSON content and returning two rows.
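A sketch of such an example, using an inline JSON string with the two hypothetical orders from earlier:

DECLARE @JSON NVARCHAR(MAX) =
N'[
   { "OrderID": 10248, "CustomerID": "VINET", "OrderStatus": "Shipped" },
   { "OrderID": 10249, "CustomerID": "TOMSP", "OrderStatus": "Pending" }
 ]';

-- Default output: one row per array element, with key (the index), value and type columns.
SELECT [key], [value], [type]
FROM OPENJSON(@JSON);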

The above script can be expanded further by defining the column names with their data types to generate a result set. The column definitions are supplied in the WITH clause.
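A sketch of the expanded version, reusing the same hypothetical orders:

DECLARE @JSON NVARCHAR(MAX) =
N'[
   { "OrderID": 10248, "CustomerID": "VINET", "OrderStatus": "Shipped" },
   { "OrderID": 10249, "CustomerID": "TOMSP", "OrderStatus": "Pending" }
 ]';

-- Explicit schema: each column is defined as column_name type 'json_path'.
SELECT OrderID, CustomerID, OrderStatus
FROM OPENJSON(@JSON)
WITH
(
    OrderID     INT         '$.OrderID',
    CustomerID  VARCHAR(10) '$.CustomerID',
    OrderStatus VARCHAR(20) '$.OrderStatus'
);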

Now we are able to read and convert the JSON content into a meaningful result set. This result set can be inserted into a table or temporary table for further processing.

Now, after evaluating this simple example, you should have a basic understanding of loading and parsing a JSON file into SQL Server. Let's challenge ourselves to learn more about JSON support in SQL Server 2016 by loading a common JSON file.

Loading UK Petition Details in JSON Format

In the UK, Parliament allows the general public to create an online petition and lets everyone vote for (sign) it. The details of the petitions are available here to download in the form of a JSON file.

In this example, I have downloaded a sample petition JSON file and renamed it 'AllState.JSON'. Our aim is to load the content of the file and generate a result set for analysis and reporting.

Loading Petition JSON file

The below mentioned script will help you to load the downloaded JSON file into SQL Server.
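A minimal sketch of such a script, assuming the downloaded file has been saved to the hypothetical path C:\JSON\AllState.json:

-- Load the petition file into a variable and look at its top-level elements.
DECLARE @JSON NVARCHAR(MAX);

SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\JSON\AllState.json', SINGLE_CLOB) AS j;

SELECT [key], [value], [type]
FROM OPENJSON(@JSON);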

It appears that the JSON file has two elements, 'links' and 'data', at the top level. We need to query the 'links' element to understand the actual data within.

This can be achieved by passing an additional path parameter to the OpenJSON function. The path has to have the prefix '$.' followed by the name of the element you want to navigate to.

The below script will help you to navigate the subset 'Links' and will display the contents.
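One way this might look, continuing with the same hypothetical file path:

DECLARE @JSON NVARCHAR(MAX);
SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\JSON\AllState.json', SINGLE_CLOB) AS j;

-- Pass '$.links' as the path to drill into the 'links' element.
SELECT [key], [value], [type]
FROM OPENJSON(@JSON, '$.links');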

Now we can navigate the subset 'data' by passing '$.data' as a path to the OpenJSON function as mentioned in the below script.
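The same pattern, again as a sketch:

DECLARE @JSON NVARCHAR(MAX);
SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\JSON\AllState.json', SINGLE_CLOB) AS j;

-- Each row of the default output now holds one petition as a JSON object.
SELECT [key], [value], [type]
FROM OPENJSON(@JSON, '$.data');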

It is observed that the subset 'data' has the details of every petition that has been made. However, the details of a single petition are available as nested elements. So, to get the individual columns, we need to provide the column names and data types.
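A sketch of this first attempt follows; note that the scalar columns come through, while the nested elements do not (column sizes are illustrative):

DECLARE @JSON NVARCHAR(MAX);
SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\JSON\AllState.json', SINGLE_CLOB) AS j;

SELECT [type], [id], [links], [attributes]
FROM OPENJSON(@JSON, '$.data')
WITH
(
    [type]       VARCHAR(20)   '$.type',
    [id]         INT           '$.id',
    [links]      NVARCHAR(100) '$.links',      -- returns NULL: 'links' is a nested object
    [attributes] NVARCHAR(100) '$.attributes'  -- returns NULL: 'attributes' is a nested object
);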

From the above script, it is observed that the column details such as 'type' and 'id' have been listed successfully. However, the script hasn't provided the column details for 'links' and 'attributes'.

Let's have a quick look at the JSON file content to understand how the elements 'links' and 'attributes' have been represented.

The below image represents the data and attributes subset of a single petition in the supplied JSON file.
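In other words, a single element of the 'data' array looks roughly like the following (values are hypothetical; only the fields discussed in this tip are shown):

{
  "type": "petition",
  "id": 12345,
  "links": { "self": "https://petition.parliament.uk/petitions/12345.json" },
  "attributes": {
    "action": "Example petition action",
    "background": "Example background text",
    "state": "open",
    "signature_count": 1000,
    "created_at": "2018-01-01T00:00:00Z"
  }
}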

It is noted that the elements 'links' and 'attributes' are further nested in the JSON. Hence it is mandatory to provide the complete path in the WITH clause of the OpenJSON function.

The below script will allow us to generate a result set for all the listed columns in the JSON.
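A sketch of such a script, with illustrative column sizes; created_at is kept as text here and can be converted to a date/time type as needed:

DECLARE @JSON NVARCHAR(MAX);
SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\JSON\AllState.json', SINGLE_CLOB) AS j;

SELECT [type], [id], [self], [action], [background], [state], [signature_count], [created_at]
FROM OPENJSON(@JSON, '$.data')
WITH
(
    [type]            VARCHAR(20)   '$.type',
    [id]              INT           '$.id',
    [self]            VARCHAR(200)  '$.links.self',
    [action]          NVARCHAR(500) '$.attributes.action',
    [background]      NVARCHAR(MAX) '$.attributes.background',
    [state]           VARCHAR(20)   '$.attributes.state',
    [signature_count] INT           '$.attributes.signature_count',
    [created_at]      VARCHAR(30)   '$.attributes.created_at'
);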

From the above, it is observed that root-level elements can be referenced with the prefix '$.'. For example, the columns 'type' and 'id' are represented at the root of each data element, so they can be queried directly or, optionally, represented with the prefix '$.'. As the column 'self' is a subset of 'links', it has to be represented as '$.links.self'. In the same way, the columns action, background, state, signature_count and created_at are subsets of the 'attributes' element; hence all these columns are represented with the prefix '$.attributes.'.

Summary

In this tip, we have learned about loading JSON files into SQL Server. In these examples, we have also learned about using OpenRowset and OpenJSON functions to manage JSON data effectively.

Next Steps
  • Read more about OPENJSON here
  • Learn JSON basics with this tip
  • Challenge your JSON knowledge with this tip




About the author
Nat Sundar is working as an independent SQL BI consultant in the UK with a Bachelor's degree in Engineering.


I want to ask about converting Oracle dump files (.dmp) into SQL Server files (.bak) where during conversion I don't have to be connected to any database server.

I've searched for related technologies, such as Oradump to SQL Server. Do you have another suggestion to solve this? Open source ones I mean.

Thanks to both of you for your responses. I see how difficult it will be, but is there any other possible way to convert an Oracle dump file? All of the converter tools I have found require a connection to a database server. I would be thankful if you could suggest another tool. Thanks anyway.

Jack Douglas
syaloom

migrated from stackoverflow.com Apr 16 '13 at 6:13


Dump File To Sql

6 Answers

'is there any possibility to use another way in converting oracle dump file?'

Import the dump file into Oracle, then transfer the data from Oracle to SQL Server using any number of tools (flat files, SSIS, linked servers, replication).
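For what it's worth, a rough sketch of that two-step route, where every server name, credential and table name is a placeholder:

-- Step 1 (Oracle side, from a shell): import the dump into an Oracle instance first, e.g.
--   impdp system@XE directory=DATA_PUMP_DIR dumpfile=export.dmp full=y
--   (use the older imp utility instead if the dump was created with exp).
-- Step 2 (SQL Server side): pull the data across a linked server.
EXEC sp_addlinkedserver
     @server     = 'ORA_LINK',
     @srvproduct = 'Oracle',
     @provider   = 'OraOLEDB.Oracle',
     @datasrc    = 'XE';
-- (Credentials for the link would be configured with sp_addlinkedsrvlogin.)

SELECT *
INTO   dbo.SomeTable
FROM   OPENQUERY(ORA_LINK, 'SELECT * FROM SOME_SCHEMA.SOME_TABLE');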


Forget about processing a dump file. The only thing that can process a dump file is Oracle. Or you can spend a year reverse engineering it then have to start over again when the format changes. If you want to convert Oracle data to SQL, you aren't going to be able to use a dump file unless you import into Oracle first.

Nick.McDermaid

How complex are your DB structures? Do you have basic tables with fields and primary keys? Do you have unique keys, foreign keys or composites? What about views? Do you have deeper logic objects like stored procedures and triggers? Oracle doesn't have that nice auto-increment/identity property; instead it has those nasty sequences. Apologies to Oracle fans.

In any case, the more complex your database, the more problematic your migration. I did find SQL Server Migration Assistant, http://blogs.msdn.com/b/ssma/ - it might be something to look at.

Redgate has a webinar that might help you hit all the high points and provide some kind of solution: http://www.red-gate.com/products/oracle-development/deployment-suite-for-oracle/education/webinars/webinar-20-april-2011 The webinar covers SQLWays: http://www.ispirer.com/products.

I have not used these products personally and only skimmed the webinar, but I have been where you are and I know how touch-and-go this can be. Hopefully others will provide additional solutions for you. Good luck!

Frank Tudor

Although Oracle doesn't publish the format of its dump files, the more recent ones created by the Data Pump utility (expdp, versus the older exp command) are in XML format. But I agree with the earlier responses: it's probably not worth the R&D effort. Depending on the size, you may be able to import into the free version of Oracle, called Express Edition (XE). It has an 11 GB maximum limit but is a free download. Once it's imported, you can use SQL Server SSIS to connect to Oracle and transfer the data.

Bob Watkins

You can simply use OraDump to MSSQL, which is a powerful paid tool. It will simply read your dump file, create the database structure on SQL Server (an empty database, or an existing one with the same structure), and then import all the data into it. It was very helpful for me, in fact.

Ahmad Dekmak

I consider this impossible, because:

  • Both formats have no public documentation available AFAIK.
  • The Oracle dump is a logical format: all objects (tables, stored procedures, rows, etc.) are stored separately and can be imported individually via import options, e.g. only tables, but no data.
  • The MS SQL BAK basically contains just the used blocks of the data files, not separated at all into the different types of objects it contains; you can only restore the database as a whole (including all tables, all data, everything), not individual elements.

So a tool capable of doing what is requested would not only have to deal with the 'usual stuff' like different data types and SQL features with all their subtle differences, it would also have to transform an apple into an orange, format-wise.

user22610

I realize this post is over a year old, but there is a tool called OraDump-to-MSSQL by Intelligent Converters that will at least take an Oracle dump file and import it into an MSSQL database. You do have to have the MSSQL server up and running, but it will import into a SQL Server Express database, which is free to download and install. You can get the tool here: http://www.convert-in.com/ord2mss.htm. It's $79, which is cheap.

Gary