Move new data records automatically to another database in oracle xe
What's the best way to move all the data from a table in my local Oracle database to an identical table on another PC? The data has to be transferred to the second database as soon as it is written to the local one. The local table should not contain any data unless it cannot be inserted into the remote database: if the connection is lost, all new records must be stored in the local table.
I tried to solve this with a trigger, but it didn't work as expected. It works fine as long as the remote database connection is valid, but when the connection is lost it performs an entire rollback, including the original insert, so the data isn't even inserted into the local database. Another big problem is that it takes about 40 seconds each time before ORA-12170 (connection timeout) is returned. Is there any way to set a much shorter timeout, or to abort the statement if it takes that long?
create or replace TRIGGER DATA_TO_SERVER
AFTER INSERT ON LOCAL_TABLE
DECLARE
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  SAVEPOINT sp;

  INSERT INTO SERVER.SERVER_TABLE@SERVER_LINK
  SELECT *
  FROM   LOCAL_TABLE;

  DELETE FROM LOCAL_TABLE
  WHERE  ID IS NOT NULL;

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK TO sp;
    RAISE;
END;
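Regarding the 40-second wait before ORA-12170: the connect timeout is an Oracle Net (client-side) setting, not something the trigger itself can shorten. It can usually be reduced either with SQLNET.OUTBOUND_CONNECT_TIMEOUT in the sqlnet.ora of the database that owns the link, or directly in the link's connect descriptor. A minimal sketch of the second option, assuming the link can be dropped and recreated; the host, service name and credentials are placeholders:
-- assumes the existing private link can be recreated with an inline
-- connect descriptor; a 3-second connect timeout makes the trigger's
-- exception handler fire quickly instead of after the TCP default
DROP DATABASE LINK server_link;
CREATE DATABASE LINK server_link
  CONNECT TO server IDENTIFIED BY "server_password"   -- placeholder credentials
  USING '(DESCRIPTION=
            (CONNECT_TIMEOUT=3)(TRANSPORT_CONNECT_TIMEOUT=3)
            (ADDRESS=(PROTOCOL=TCP)(HOST=remote-host)(PORT=1521))
            (CONNECT_DATA=(SERVICE_NAME=remote_service)))';
Alternatively, setting SQLNET.OUTBOUND_CONNECT_TIMEOUT in the local instance's sqlnet.ora limits all outbound connection attempts without recreating the link.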
oracle trigger
asked Feb 19 '18 at 12:38 by Gandalf The Gay, edited Feb 19 '18 at 13:17
Use DataPump
– a_horse_with_no_name
Feb 19 '18 at 12:39
The best way is to use export and import; you can also customize according to object type, schema and many other options.
– Husam Mohamed
Feb 19 '18 at 12:41
What do you want to achieve with this complicated mechanism? Do you want the data to be stored in a local table if the remote database cannot be reached?
– miracle173
Feb 19 '18 at 15:32
@miracle173 Yes, that's exactly what I want to achieve. I also thought of using a C# application, but I wasn't able to delete the data records that had already been moved to the server.
– Gandalf The Gay
Feb 20 '18 at 6:10
3 Answers
On the source database, create a directory object:
SQL> Create Directory EXPDIR as 'PATH';
The export command:
expdp directory=expdir dumpfile=newExport.dmp full=y logfile=newExport.log
On the target database, also create a directory:
Create Directory EXPDIR as 'PATH';
The import command:
impdp directory=expdir dumpfile=newExport.dmp logfile=import_Database.log full=y
By the way, this is a full export and import; you can customize what exactly to export or import according to your preference.
https://docs.oracle.com/cloud/latest/db112/SUTIL/dp_export.htm#SUTIL200
answered Feb 19 '18 at 12:46 by Husam Mohamed
Wouldn't using Data Pump require me to export and import the entire database manually? Both databases have only one table in common, and I only need to move a few newly inserted data records per second from one database to the other automatically.
– Gandalf The Gay
Feb 19 '18 at 13:09
That's true; if your data isn't much it wouldn't take long, but in your case, as a repetitive task, it might not be very practical. Perhaps your current way using a trigger and a DB link is fine, but it needs some tuning to perform better.
– Husam Mohamed
Feb 20 '18 at 7:37
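As the comments note, only one table is shared, and Data Pump can also run in table mode instead of a full export. A sketch, assuming the table lives in a schema here called LOCAL_USER; the schema, dump file and log file names are placeholders:
expdp directory=expdir tables=LOCAL_USER.LOCAL_TABLE dumpfile=local_table.dmp logfile=local_table_exp.log
impdp directory=expdir tables=LOCAL_USER.LOCAL_TABLE dumpfile=local_table.dmp logfile=local_table_imp.log table_exists_action=append
Even in table mode this remains a batch-style copy: it would have to be scheduled and would not react to individual inserts the way the trigger does.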
The only solution I see (I'm very myopic) is to create a view on that table, have inserts go through that view, and handle them with "instead of" triggers. If inserting into the remote table fails, insert locally.
answered Feb 19 '18 at 14:35 by Gerard H. Pille
I would use a view with an INSTEAD OF trigger, as Gerard H. Pille proposed. In my environment I cannot use database links, so I use a local table called remotetab; replace it with remotetab@remotedb if you have a remote database remotedb. In my example the lengths of the second column of the local and the remote table differ, but this is only to simulate an error when inserting into the remote table.
drop table localtab;
drop table remotetab;
drop view localview;

create table localtab(
  id   number primary key,
  name varchar2(30) not null)
/
-- the deliberately shorter NAME column simulates a failing remote insert
create table remotetab(
  id   number primary key,
  name varchar2(3) not null)
/
create view localview(id, name)
as
select id, name
  from localtab
/
-- inserts against the view go to the "remote" table first
-- and fall back to the local table if that insert fails
create trigger localtrig
  instead of insert on localview
begin
  insert into remotetab(id, name) values(:new.id, :new.name);
exception
  when others then
    insert into localtab(id, name) values(:new.id, :new.name);
end;
/
insert into localview(id, name) values(1, 'a');
insert into localview(id, name) values(2, 'hallo');
insert into localview(id, name) values('x', 'y');

select id, name remote from remotetab;
select id, name local from localtab;
The inserts return
1 row(s) inserted.
1 row(s) inserted.
ORA-01722: invalid number
ORA-06512: at "SYS.DBMS_SQL", line 1721
and the selects return
ID  REMOTE
 1  'a'
ID  LOCAL
 2  'hallo'
as expected: the short value reached remotetab, the value that was too long for the remote column stayed in localtab, and the third insert failed because 'x' cannot be converted to the numeric id.
answered Feb 20 '18 at 7:44 by miracle173
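One gap this example leaves open is how rows that land in localtab during an outage eventually reach the remote table. A common approach is a small background job that periodically retries the transfer; a sketch using DBMS_SCHEDULER, assuming a database link remotedb exists and that the job name and interval are arbitrary choices:
begin
  dbms_scheduler.create_job(
    job_name        => 'forward_localtab',
    job_type        => 'PLSQL_BLOCK',
    job_action      => q'[
      begin
        -- push any locally buffered rows to the remote table ...
        insert into remotetab@remotedb(id, name)
        select id, name from localtab;
        -- ... and clear the local buffer (a production job would delete
        -- only the ids it actually copied)
        delete from localtab;
        commit;
      exception
        when others then
          rollback;   -- remote side unreachable: keep the rows, retry next run
      end;]',
    repeat_interval => 'FREQ=MINUTELY;INTERVAL=1',
    enabled         => true);
end;
/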