Partial copy of huge database
I have started working on a project that has a big database, around 300 GB. For security reasons I cannot access the database from my local web app, so I need to copy the last 100,000 rows from each table.
To copy from one table to another, I know I can do:
INSERT INTO table2
SELECT * FROM table1
WHERE condition;
But how can I handle connecting to the other database?
One idea I have is to create a table with the same structure, use the query above to move the records, and then dump those tables.
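For example, a minimal sketch of that idea for one table, assuming an auto-increment primary key named id (a placeholder; the real column varies per table):
CREATE TABLE table1_copy LIKE table1;
INSERT INTO table1_copy
SELECT * FROM table1
ORDER BY id DESC
LIMIT 100000;
The _copy tables could then be dumped and imported on the other side.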
Is there a better way?
Tags: mysql, migration, export
asked Feb 5 '18 at 12:55 by Eduardo
2 Answers
Copy the CREATE script for those tables and copy only the rows that you want. Note that mysqldump has no --limit option; you can get the same effect by appending a LIMIT through --where:
mysqldump --opt --user=username --password=password --where="1 LIMIT 100000" database table > file.sql
On the new server, create the database and load the dump (the load is done with mysql, not mysqldump):
mysql --user=username --password=password new_database < file.sql
I think this is the best way for your case.
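Since the question asks for the last 100,000 rows of each table, here is a sketch of looping over every table in the schema; it assumes each table has an auto-increment primary key named id (adjust per table) and relies on mysqldump appending the --where text verbatim to its internal SELECT, which is worth verifying on your version:
#!/bin/sh
# Hypothetical credentials and database name; replace with your own.
DB=database
for TBL in $(mysql --user=username --password=password -N -B \
    -e "SELECT table_name FROM information_schema.tables WHERE table_schema='$DB'"); do
  # Dump only the newest 100,000 rows of each table to its own file.
  mysqldump --opt --user=username --password=password \
    --where="1 ORDER BY id DESC LIMIT 100000" \
    "$DB" "$TBL" > "$TBL.sql"
done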
answered Feb 5 '18 at 13:49 by Krismorte
mysqldump -h source_server ... --order-by-primary --where='id > ...' src_db tbl |
mysql -h dest_server ...
But that assumes:
- id is the PRIMARY KEY.
- You can get the id that sits 100K (or so) rows from the end: SELECT id FROM tbl ORDER BY id DESC LIMIT 100000, 1
- You can access both servers from wherever you run the pipeline.
Since you need 2 connections, there is no 'simple' way to do it from the mysql command-line tool or from client code (PHP, Java, etc). Copying one row at a time would be quite slow. Employing LOAD DATA would be no better than mysqldump + mysql.
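For one table, the whole pipeline could look like this sketch (host names, credentials, db/table names, and the id column are hypothetical placeholders):
#!/bin/sh
# Find the id sitting 100,000 rows from the end; fall back to 0 when
# the table holds fewer rows, so small tables are copied whole
# (assumes positive auto-increment ids).
CUTOFF=$(mysql -h source_server -u user -psecret -N -B \
  -e "SELECT id FROM tbl ORDER BY id DESC LIMIT 100000, 1" src_db)
CUTOFF=${CUTOFF:-0}
# Stream the tail of the table straight into the destination server.
mysqldump -h source_server -u user -psecret \
  --order-by-primary --where="id > $CUTOFF" src_db tbl |
mysql -h dest_server -u user -psecret dest_db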
answered Feb 12 '18 at 21:30 by Rick James