Issue: Unable to import data into an XML datatype column
This issue applies to all database servers that have an XML datatype (for example, SQL Server). The steps to reproduce below were written for DB2 LUW.
Hi Tom, Asif,
We are able to reproduce the issue. However, regarding the NOTE in the issue: it suggests that this works fine through the Table Editor functionality, but we are getting the same error there as well.
Could you please take a look and suggest a workaround for that?
Thanks,
Kunal
Hi Kunal,
Juhi is correct that this works from the table data editor. I don't think her XML file will work, so I attached a new XML file to test with - xmltest1.csv. You have to use the new load-from-file option to import the XML and save it to a table. The import should work in a similar manner. You will have to cast the XML to VARCHAR to confirm that the load worked. Here are some of the statements so you don't have to retype them. This is for DB2; not sure about the other databases.
Look at the batch logic used for the load from file. I suspect that the table data editor's load from file is importing the file as a single chunk rather than line by line. That feature was added to handle large BLOB/CLOB data.
Thanks,
Tom
DELETE FROM TOM.TOMXML
GO
SELECT XMLCAST((SELECT "c1" FROM TOM.TOMXML) AS VARCHAR(1000)) FROM SYSIBM.SYSDUMMY1
GO
INSERT INTO "TOM"."TOMXML"("c1") VALUES('<customerinfo Cid="1015"> <name>Christine Haas</name> <addr country="Canada"> <street>12 Topgrove</street> <city>Toronto</city> <prov-state>Ontario</prov-state> <pcode-zip>N8X-7F8</pcode-zip> </addr> <phone type="work">905-555-5238</phone> <phone type="home">416-555-2934</phone> </customerinfo>')
GO
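A minimal sketch (class and method names are hypothetical, not from the actual codebase) of the chunk-versus-line distinction described above: reading the file as one chunk keeps an XML document with embedded newlines as a single column value, while a line-by-line importer would split it into several rows.

```java
public class LoadModes {
    // Chunk mode: the whole file becomes one column value (XML/CLOB/BLOB case).
    static String[] asChunk(String fileContents) {
        return new String[] { fileContents };
    }

    // Line mode: each line becomes a separate value (plain delimited data case).
    static String[] lineByLine(String fileContents) {
        return fileContents.split("\\R");
    }

    public static void main(String[] args) {
        String xml = "<customerinfo Cid=\"1015\">\n  <name>Christine Haas</name>\n</customerinfo>";
        System.out.println(asChunk(xml).length);    // one value: the XML survives intact
        System.out.println(lineByLine(xml).length); // three values: the XML is broken apart
    }
}
```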
Hi Kunal,
Ok. Be careful when modifying the Excel handling, as that goes down a whole different code path. What about the other databases that support XML, like Oracle and PostgreSQL?
I did not see your changes checked in, so I can't comment on the code fixes.
Thanks,
Tom
Hi Tom,
The changes are specific to the DB2 database in the code. I have tested the functionality for the other databases that support XML, and they are working fine, so no changes are required for them.
Thanks,
Kunal
Fixed at Revision: 58943
FIXED:
Resolved import of the XML datatype for .csv and Excel file types when Transaction Type='Batch'
Ok, thanks. Please also update and test CoreLibImportThread.java; it is essentially a copy of the import-thread class that is used for FluidShell, so the changes should be the same. Also, shouldn't you verify that the quote is present at the beginning and end of the statement before doing a substring?
Hi Tom,
Thanks. Sure, I'll add the same changes to CoreLibImportThread.java as well.
No, we don't need to check for quotes before the substring, because the method that builds the string (AFResultValueUtil.prepareXML()) will always add quotes for DB2. But if you prefer, we can add a check as well, to be safer.
Thanks
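A defensive version of the substring step under discussion. Even if AFResultValueUtil.prepareXML() is guaranteed to add the surrounding single quotes for DB2, checking before stripping costs nothing. This is an illustrative sketch; the class and method names are hypothetical.

```java
public class QuoteCheck {
    // Strip surrounding single quotes only when both are actually present;
    // otherwise return the input unchanged rather than corrupting it.
    static String stripOuterQuotes(String s) {
        if (s != null && s.length() >= 2
                && s.charAt(0) == '\'' && s.charAt(s.length() - 1) == '\'') {
            return s.substring(1, s.length() - 1);
        }
        return s; // not quoted as expected: pass through
    }
}
```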
As long as it is guaranteed to always add the quotes, we should be ok without the check.
FIXED at Revision: 58951
FIXED:
Resolved Import of XML datatype for FluidShell
QA Team.
Remember to also test FluidShell sqlimport.
Thanks,
Tom
Hi Kunal,
I tested this with Import and with the FluidShell import, and I still get the errors using the attached .csv file. Please provide your test case so I can see what I am doing wrong.
Thanks,
Tom
Hi Tom,
The attached file is not in the correct format. I have attached a csv file in the correct format (test_xml_data.csv) and an Excel file (test_xml_data.xls).
To import this file in FluidShell, use this command:
sqlimport -SQ single-quote <TABLE_NAME> <Path to CSV File>
Please let me know if you face any problems.
Thanks,
Kunal
Hi Tom,
Sorry, I missed adding the -CN option (which specifies whether the first row is a header; it defaults to true, and we set it to false) in the above comment. You can see the working command in the attached screenshot here.
Thanks,
Kunal
Hi Kunal,
Only csv files with single quotes are working; I am getting the same error without quotes. Apart from this, .xls files are also throwing errors. Hence, reopening the ticket.
Thanks
Navjot
At Revision: 58970
FIXED:
Hi Kunal,
Looks good. Please show screenshots of the settings for both Import and sqlimport, such as delimiter, string quote identifier, etc. They will help the QA team, and we can use them to extend our documentation.
Hi QA Team,
Please test this with different xml sample files in addition to the sample provided. Also, please regression test this across several different databases using .csv and .txt files to make sure we haven't created any other issues.
Thanks,
Tom
Hi Nadine,
I forgot to mention that this issue should be documented in the main documentation for import and fluidshell sqlimport. It is a small update. We can discuss.
Thanks,
Tom
Hi Tom,
Attaching the working screenshots for the ADS Import and FluidShell import functionality:
1. SQL Import when String Quote Identifier=null (the delimiter and all other settings work as expected; the only change is that when String Quote Identifier=null and the file type is .xml, we read the file as a whole).
2. SQL Import for the same file when we set String Quote Identifier='
3. For sqlimport, the command we use to import XML data is:
sqlimport -SQ none -CN false -s TOM TOMXML xml_data.xml
This imports data with Transaction Type set to Full; to test it for Batch, add the flag
-TE Batch
Working screenshot here.
Thanks,
Kunal
Hi Tom,
FIXED at Revision: 58973
Resolved an issue with importing data containing escape characters when Transaction Type='Batch' for DB2, in both ADS Import and FluidShell.
This edge case was left over: when the data contains characters like the single quote ('), we need to add escape characters in order to insert it, but when Transaction Type='Batch' those characters must not be added.
For example, data like this:
'<customerinfo Cid="1015"> <name>Christine Haas</name> <addr country="Canada"> <street>12 Topgrove</street> <city>Toronto</city> <prov-state>Ontario</prov-state> <pcode-zip>N8X-7F8</pcode-zip> </addr> <phone type="work">905-555-5238</phone> <phone type="home">416-555-2934</phone> </customerinfo>'
imports fine, but if we swap the single quotes and double quotes, the code adds escape characters in order to insert the single quote into the database (which is not required during batch processing), so I made the required changes.
Thanks,
Kunal
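A minimal sketch of the escaping rule described above (class and method names are hypothetical): when the value is spliced into an INSERT statement as a literal (Transaction Type='Full'), embedded single quotes must be doubled; when it is sent as a bound parameter in a batch (Transaction Type='Batch'), the raw value must be used unchanged, otherwise the escape characters end up stored in the data.

```java
public class EscapePrep {
    // batchMode=true: value is bound as a parameter, so no literal escaping.
    // batchMode=false: value is embedded in a SQL literal, so double the quotes.
    static String prepareValue(String raw, boolean batchMode) {
        if (batchMode) {
            return raw; // bound directly; escaping here would corrupt the data
        }
        return raw.replace("'", "''"); // standard SQL single-quote doubling
    }
}
```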
I also tested with a multi-column .csv, attached as TestXMLWithOtherColumns.csv. It looks like it is importing correctly.
The attached file xmltest.xml can be imported but must be tab-delimited...
Verified by QA:
Different types of .csv, .txt, and .xml files have been tested, including importing them through FluidShell, on a DB2 LUW database server. Please find the samples and scenarios tested here: https://docs.google.com/spreadsheets/d/1AyKAbvNnc8rhIVJcvzXyywWly0YMLFdjNCCQ1iHe4ds/edit?pli=1#gid=0
Verified by QA :
Different types of CSV / .TXT and .XML files have been tested along with importing these files through fluidshell on DB2-LUW database server .Please find the samples tested and scenarios tested here-- https://docs.google.com/spreadsheets/d/1AyKAbvNnc8rhIVJcvzXyywWly0YMLFdjNCCQ1iHe4ds/edit?pli=1#gid=0
Issue #15783 | Verified | Fixed | Resolved | Completion: no due date, no time estimate | Fixed Build: DEV BUILD 20