How do I convert JSON data to BYTEA so that I can store it in a BYTEA column of a table in PostgreSQL?

I came across a function script that converts JSON data to BYTEA and then inserts it as a record into a BYTEA column of a table (at least, that is what I assume the code is doing).

In Oracle, the function utl_raw.cast_to_raw converts the data to a BLOB and records it in a BLOB column of the table, giving the output message "anonymous block completed".

The following is the code:

CREATE OR REPLACE FUNCTION INS_BLOB() RETURNS VOID AS $$
DECLARE
    v1 "TBL1"."COL1"%TYPE;
    v2 "TBL1"."COL2"%TYPE;
BEGIN
    v1 := utl_raw.cast_to_raw('{
        "APPLICATION": {
            "MEMORY": {
                "OPTIONS": {
                    "SOMETHING": "SOMETHING",
                    "format": "SOMETHING",
                    "System": "",
                    "IP": "",
                    "Port": "",
                    "template": "",
                    "Path": "",
                    "Name": "QUEUE",
                    "URL": ""
                }');

    v2 := utl_raw.cast_to_raw('{
        "APPLICATION": {
            "MEMORY": {
                "OPTIONS": {
                    "SOMETHING": "SOMETHING",
                    "format": "SOMETHING",
                    "System": "",
                    "IP": "",
                    "Port": "",
                    "template": "",
                    "Path": "",
                    "Name": "QUEUE",
                    "URL": ""
                }');

    INSERT INTO "TBL1" ("SN", "COL1", "COL2") VALUES (1, v1, v2);
END;
$$
LANGUAGE 'plpgsql';
COMMIT;
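
For context, here is a hypothetical definition of "TBL1" that would match the %TYPE references above; the actual DDL is not shown in the question:

-- Hypothetical table; the real definition is not given in the question.
CREATE TABLE "TBL1" (
    "SN"   integer,
    "COL1" bytea,
    "COL2" bytea
);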









Tags: postgresql json






asked 1 hour ago by devilboy477 (edited 5 mins ago)

  • Unrelated, but: you should really avoid those dreaded quoted identifiers ("COL1") - they are much more trouble than they are worth.

    – a_horse_with_no_name
    1 hour ago











  • I agree, but I already made a leap on that one by making all of it case sensitive. :(

    – devilboy477
    59 mins ago



















2 Answers


















There is no reason to convert anything; just insert the JSON strings (after making them valid JSON):

CREATE OR REPLACE FUNCTION INS_BLOB() RETURNS VOID AS $$
BEGIN
    INSERT INTO "TBL1" ("SN", "COL1", "COL2")
    VALUES (1,
        '{
            "APPLICATION": {
                "MEMORY": {
                    "OPTIONS": {
                        "SOMETHING": "SOMETHING",
                        "format": "SOMETHING",
                        "System": "",
                        "IP": "",
                        "Port": "",
                        "template": "",
                        "Path": "",
                        "Name": "QUEUE",
                        "URL": ""
                    }}}}',  --<< add the missing curly braces!
        '{
            "APPLICATION": {
                "MEMORY": {
                    "OPTIONS": {
                        "SOMETHING": "SOMETHING",
                        "format": "SOMETHING",
                        "System": "",
                        "IP": "",
                        "Port": "",
                        "template": "",
                        "Path": "",
                        "Name": "QUEUE",
                        "URL": ""
                    }}}}');  --<< add the missing curly braces!
END;
$$
LANGUAGE plpgsql;
COMMIT;

You also don't need PL/pgSQL for this. A LANGUAGE sql function would be enough (you would need to remove the BEGIN and END, though).
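
A minimal sketch of that SQL-language variant (the JSON payloads are shortened here for brevity; the full documents from the question work the same way):

-- Same effect as the function above, but in plain SQL: no DECLARE/BEGIN/END.
CREATE OR REPLACE FUNCTION INS_BLOB() RETURNS VOID AS $$
    INSERT INTO "TBL1" ("SN", "COL1", "COL2")
    VALUES (1, '{"Name": "QUEUE"}', '{"Name": "QUEUE"}');
$$ LANGUAGE sql;

SELECT INS_BLOB();  -- invoking the function performs the insert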






answered 1 hour ago by a_horse_with_no_name (edited 50 mins ago)

There is no reason to ever do that. PostgreSQL has a binary JSON type, jsonb. Just store your data as JSONB.

CREATE TABLE foo (
    jsondata jsonb
);

INSERT INTO foo (jsondata) VALUES ( $${"foo": "bar"}$$ );

This will give you a ton of operators and functions that work natively with this type. JSONB values are varlena (just like bytea) under the hood anyway.

See also:

• JSONB Functions and Operators
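
For instance, a small sketch against the foo table above (the nested path mirrors the document shape from the question; -> extracts a field as jsonb, ->> extracts it as text, and @> tests containment):

-- Pull a nested value out as text (NULL if the path is absent).
SELECT jsondata -> 'APPLICATION' -> 'MEMORY' -> 'OPTIONS' ->> 'Name'
FROM foo;

-- Find how many rows contain a given document fragment.
SELECT count(*)
FROM foo
WHERE jsondata @> '{"foo": "bar"}';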






answered 1 hour ago by Evan Carroll (edited 1 hour ago)

  • Hi, I would appreciate it if the answer were geared towards my code, which I am struggling to debug.

    – devilboy477
    1 hour ago

  • @devilboy477 I have no idea what you're even trying to do. PostgreSQL doesn't have utl_raw.cast_to_raw. It has a cast from text to binary JSON, and it's implicit (it happens in my answer above).

    – Evan Carroll
    1 hour ago

  • But Evan, how should I enter those values one at a time? I need something which compiles them together (squeezes them in as a single data record) and inserts them into the table column. PS: I am super new to Postgres.

    – devilboy477
    1 hour ago

  • Hi, I actually just inserted the values and it worked. But is there any way I can store it as a raw file in the records?

    – devilboy477
    1 hour ago

  • @devilboy477: JSONB is a binary format (which compresses large values), so why would you want anything else? What do you mean by "raw file in the record"? You are not using record variables there, and I don't see any "files" either.

    – a_horse_with_no_name
    57 mins ago
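
If a bytea column really is required, as the question title asks, the closest built-in analogue to Oracle's utl_raw.cast_to_raw is probably convert_to(text, encoding), which returns bytea (convert_from decodes it back to text). A minimal sketch against the question's "TBL1":

-- convert_to encodes text into bytea using the given character encoding.
INSERT INTO "TBL1" ("SN", "COL1", "COL2")
VALUES (2,
        convert_to('{"Name": "QUEUE"}', 'UTF8'),
        convert_to('{"URL": ""}', 'UTF8'));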










