Aqua Data Studio / nhilam
IDE for Relational Databases
vaibhavladdha271 reported Jul 10, 2015 · SachinPrakash last modified Apr 10, 2017

Error importing the LONG VARBINARY datatype in Vertica for Transaction type: Batch & Threshold


Priority Minor
Complexity Unknown
Component Tools - Import Tool
Version 18.0
Product: Aqua Data Studio
Version: 17.0.0-dev-74
Build #: 44661
Build Date: 2015-Jul-09 05:29:58 PM
 
Operating Environment: Linux (3.13.0-57-generic, amd64) / UTF-8 / en / IN / Oracle Corporation 1.8.0_40-b26
Memory: Max=704,643,072;  Total=296,222,720;  Free=126,323,648;  CPUs=8

Steps to reproduce:
1) Create a table using the script below:
CREATE TABLE "public"."longvarbinary_datatype"  ( 
"c1" long varbinary(80) 
)
GO
2) Insert a record into the table using the script below:
INSERT INTO "public"."longvarbinary_datatype"("c1") 
VALUES('vaibhav123')
GO
3) Right-click the created table & select Tools -> Export Data.
Click Next.
4) Enter a file name.
Select "Delimited data" from the Format option.
Click Next.

The data is exported to the file successfully.

Please refer to test.csv for the exported data.
5) Right-click the created table in the tree node & select Tools -> Import Data.
Select the previously saved file.
Keep the default settings as they are & click Next.
Click Next & click Next again.
6) Select Batch from the Transaction type field.
Click Next.

The data is not imported & the following error message is displayed:
"Error: Row: 1 -- [Vertica][JDBC](10120) Error converting data, invalid type for parameter: 1.
     SQL: INSERT INTO "public"."longvarbinary_datatype"("c1")  VALUES(?)"

Works fine for Transaction type: Full.
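The driver error suggests that in Batch mode the import tool binds the CSV field as a character value where the Vertica JDBC driver expects binary data for the LONG VARBINARY parameter. A minimal sketch of the kind of coercion a fix might apply before the batch bind (the function and flag names here are hypothetical illustrations, not ADS internals):

```python
# Hypothetical sketch: CSV fields arrive as text, but a LONG VARBINARY
# parameter must be bound as raw bytes. Coercing the value before the
# batch bind would avoid the "invalid type for parameter" conversion error.

def coerce_for_binary_column(value, column_is_binary):
    """Return a value suitable for binding to the target column.

    `column_is_binary` would come from table metadata (e.g. the column
    type is LONG VARBINARY); this helper is illustrative only.
    """
    if column_is_binary and isinstance(value, str):
        return value.encode("utf-8")  # bind as bytes, not as a string
    return value

# The row exported to test.csv in step 4:
row = ["vaibhav123"]
bound = [coerce_for_binary_column(v, True) for v in row]
print(bound)  # [b'vaibhav123']
```

This mirrors why the Full transaction path can succeed while Batch fails: a path that sends values as SQL literals lets the server convert the text, while a batched prepared statement requires the client to supply the parameter in the type the driver expects.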
5 attachments

Issue #13456

Closed
Fixed
Resolved Aug 23, 2016
Completion
No due date
Fixed Build ADS 18.0.0-devi-208
No time estimate

About AquaClusters Privacy Policy Support Version - 19.0.2-4 AquaFold, Inc Copyright © 2007-2017