MySQL Database import into 5.7 hangs with CLI or with Workbench 6.3 CE
I have a SQL dump file that I exported using Workbench from a server that is being decommissioned. The host and target OS is Microsoft Windows Server 2008 R2 (so x64 natively). There seems to be no way in hell to import this SQL dump file; it hangs no matter what I try. The file size is less than 500 MB. I did look at the database schema itself, and the tables appear to have overwritten the initially created DB of the same name, but the import does nothing except hang. I'm not a DBA, just a network/sysadmin, so I'm very frustrated trying to get this thing to import. Any suggestions?
mysql mysqldump import mysql-5.7
asked Mar 7 '16 at 18:10 by Brian
First, deep breath... :-). 500MB is small these days. Second, have you "eye-balled" your data? Does the script look OK? Can you split it into separate tables? How was the "dump" taken? Answer these and we'll take it from there. p.s. welcome to the forum.
– Vérace
Mar 7 '16 at 18:59
Try looking in the MySQL error log for a possible cause.
– PullupSkrr
Dec 28 '18 at 6:38
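To find where that log is, you can ask the server itself from any MySQL client session (a generic check, nothing specific to this setup):
SHOW VARIABLES LIKE 'log_error';
-- On a default Windows install of MySQL 5.7 this usually points to a .err file in the data directory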
3 Answers
Load it in using the mysql command-line tool instead of Workbench.
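On a Windows host that could look roughly like this (the install path, credentials, dump location, and database name below are placeholders, not values taken from the question):
cd "C:\Program Files\MySQL\MySQL Server 5.7\bin"
mysqladmin -u root -p create target_db
mysql -u root -p target_db < C:\dumps\export.sql
If the dump already contains CREATE DATABASE and USE statements, the mysqladmin step and the database name on the mysql line can be omitted.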
answered Mar 7 '16 at 20:30 by Rick James
I assume the actual problem is that you don't know the progress of the import, which is why it appears to be hanging while it is in fact progressing in the background.
The following details were copied from the accepted answer to how-can-i-monitor-the-progress-of-an-import-of-a-large-sql-file.
If you're just importing from a dump file from the CLI on *nix, e.g.
mysql -uxxx -pxxx dbname < /sqlfile.sql
then first install pipe viewer (pv) on your OS and try something like this:
pv sqlfile.sql | mysql -uxxx -pxxxx dbname
which will show a progress bar as the program runs. It's very useful, and you can also use it to get an estimate of mysqldump progress.
pv reads sqlfile.sql and passes it to mysql through the pipe, showing the progress as it goes. The nice thing is that mysql takes the data only as fast as it can process it, so pv can show the progress of the import. I don't have any proof, but it seems so; I guess there is some buffering, but at some point mysql stops reading more data while it is still busy processing.
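pv is a *nix tool, though, and the server in the question runs Windows, so there you would instead have to check from a second client session whether the import connection is still doing work. A couple of standard statements can tell you that (nothing here is specific to this particular dump):
SHOW FULL PROCESSLIST;
-- shows whether the importing connection is executing a statement or just sleeping
SHOW ENGINE INNODB STATUS\G
-- the row-operation counters keep increasing while the import is actually making progress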
answered Jan 20 '17 at 7:52 by The Coder (edited Apr 13 '17 at 12:42 by Community♦)
It does seem to hang from time to time with large exported SQL files, though. I have the same problem, and it's easy to tell that it hangs doing nothing because the actual DB files that get created stop increasing in size on the larger tables that need to be imported.
– georaldc
Feb 16 '17 at 19:53
I ran into a similar problem importing a 1.2 GB SQL file; it took about 10 hours on my laptop (Core i5, x64, 8 GB RAM), although of course I was multitasking.
To find out the progress, I'd run this query every hour:
select table_name as "Table",
       round(((data_length + index_length) / 1024 / 1024), 2) as "Size (MB)"
from information_schema.TABLES
where table_schema = "Name_of_Database_You_Import";
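If you'd rather not run it by hand, a small batch file on the Windows host can poll it for you; a rough sketch (credentials, schema name, and the 5-minute interval are all placeholders):
REM save as import_progress.bat and run it while the import is going
:loop
mysql -u root -p"your_password" -e "SELECT table_name, ROUND((data_length + index_length) / 1024 / 1024, 2) AS size_mb FROM information_schema.TABLES WHERE table_schema = 'Name_of_Database_You_Import';"
timeout /t 300
goto loop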
Thanks
answered 28 mins ago by Chetan (new contributor)