On to Bulk Insert issues

Ted
OK, I tried this:

USE Alert_db;

BULK INSERT funds FROM 'C:\\data\\myData.dat'
WITH (FIELDTERMINATOR='\t',
KEEPNULLS,
ROWTERMINATOR='\r\n');
And I got the following errors.
Msg 4864, Level 16, State 1, Line 3
Bulk load data conversion error (type mismatch or invalid character for
the specified codepage) for row 1, column 4 (f_asset_classes_id).
Msg 4866, Level 16, State 8, Line 3
The bulk load failed. The column is too long in the data file for row
1, column 6. Verify that the field terminator and row terminator are
specified correctly.
Msg 7399, Level 16, State 1, Line 3
The OLE DB provider "BULK" for linked server "(null)" reported an
error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 3
Cannot fetch a row from OLE DB provider "BULK" for linked server
"(null)".
One limitation I encountered is that there doesn't seem to be a way to
tell MS SQL Server that the fields are optionally enclosed by quotes.
That is, text fields are enclosed by quotes while, e.g., numeric
fields, are not, and that these optional quotes are NOT to be included
in the data in the fields.

I do not know what "State 1" vs "State 8" is supposed to mean.

The table in question in this example allows nulls in several columns,
and in the flat file, nulls are represented by consecutive tabs. Might
this be causing trouble for the Bulk Insert statement?

Other cases where I have to use something like bulk insert involve
several columns containing dates. I know MS SQL supports the format
used in the file (from reading the documentation for cast operations),
but is there an easy way to tell MS SQL which of the supported date
formats to use when reading this data? I've read that bcp should be
useful for this, but I have yet to figure that out.
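
A fallback I am considering, if there is no way to specify the format
directly, is to bulk load the date fields as plain text into a staging
table and convert them explicitly with a style code. A rough sketch only,
with hypothetical table and column names, assuming the file uses
dd/mm/yyyy dates (style 103):

-- Staging table receives the raw text exactly as it appears in the file.
CREATE TABLE funds_staging (
    fund_code  varchar(20) NULL,
    trade_date varchar(20) NULL,   -- e.g. '23/02/2003' in the file
    amount     varchar(30) NULL
);

BULK INSERT funds_staging FROM 'C:\data\myData.dat'
WITH (FIELDTERMINATOR='\t', ROWTERMINATOR='\r\n', KEEPNULLS);

-- Convert with an explicit style so the date format is never ambiguous.
INSERT INTO funds (fund_code, trade_date, amount)
SELECT fund_code,
       CONVERT(datetime, trade_date, 103),    -- 103 = dd/mm/yyyy
       CONVERT(decimal(18,4), amount)
FROM funds_staging;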

In about half of the cases where I load data from a file, the data is
loaded once when the database is first created, and in the rest, there
is new data to be loaded every business day; so I need to be able to
submit the required command from the command line, and thus invoke it
using a perl script.

BTW: I have ordered a couple books on T-SQL, but they have yet to
arrive.

Thanks

Ted

Aug 7 '06 #1
Ted (r.*********@rogers.com) writes:
One limitation I encountered is that there doesn't seem to be a way to
tell MS SQL Server that the fields are optionally enclosed by quotes.
That is, text fields are enclosed by quotes while, e.g., numeric
fields, are not, and that these optional quotes are NOT to be included
in the data in the fields.
That's correct, if optionally means just optionally, so that you
could have:

9;Some unquoted data;12;9.234;2004-12-12
19;"Some quoted data";-12;31.4;2003-02-23

But if a text column is consistently quoted, you can handle this with a
format file where you specify each field. A format file that fits the
second row in the example above could look like:

8.0
5
1 SQLCHAR 0 0 ";\""   1 col1 ""
2 SQLCHAR 0 0 "\";"   2 col2 ""
3 SQLCHAR 0 0 ";"     3 col3 ""
4 SQLCHAR 0 0 ";"     4 col4 ""
5 SQLCHAR 0 0 "\r\n"  5 col5 ""

The first row is the version of the file format. Next is the number of
fields in the file. Following lines describe one field each.

First column is the field number in the data file. Second column is the
data type of the field in the file. For a text file this is always SQLCHAR,
or always SQLNCHAR for a Unicode file. Other data types are only used with
binary formats.

The third column is the prefix length, used only for binary files. The
fourth column is the length, and is used for fixed-length fields. The fifth
column is the terminator, and it is here you specify the quotes.

The sixth column is the database column, with 1 denoting the first column;
0 means that the field is not to be imported. The seventh column is the
column name, but it is informational only; BCP/BULK INSERT does not use it.
The last column is the collation for the data in the file.

Overall, keep in mind that BCP/BULK INSERT reads the file as a binary
stream, and a row terminator is really only the terminator for the last field.
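
Once such a format file is saved, BULK INSERT can be pointed at it instead
of inline terminators; a minimal sketch with hypothetical paths:

BULK INSERT funds FROM 'C:\data\myData.dat'
WITH (FORMATFILE = 'C:\data\funds.fmt', KEEPNULLS);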
I do not know what "State 1" vs "State 8" is supposed to mean.
You can consider it white noise. The state number may tell the
SQL Server developers something, but the states are not documented.
The table in question in this example allows nulls in several columns,
and in the flat file, nulls are represented by consecutive tabs. Might
this be causing trouble for the Bulk Insert statement?
That should work fine. However, if fields are missing, so that you
have six fields on one line, and eight on the next, you lose.
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Aug 7 '06 #2
hi all again...
it is a lucky chance for me that such a topic has been opened :))
so I can ask something that I need to know...

well, I am using the BCP command to load a text file into SQL Server 2000,
but I am getting an error message like this one:
"string data, right truncation"

and I have no idea how I am going to get over this problem!!
In what situations does SQL Server 2000 return such an error?

What should I do to get over this problem?

thanks a lot

Tunc Ovacik

Aug 8 '06 #3
panic attack (tu*********@gmail.com) writes:
it is a lucky chance for me that such a topic has been opened :))
so I can ask something that I need to know...

well, I am using the BCP command to load a text file into SQL Server 2000,
but I am getting an error message like this one:
"string data, right truncation"

and I have no idea how I am going to get over this problem!!
In what situations does SQL Server 2000 return such an error?

What should I do to get over this problem?
First start a new thread, so we can keep different problems apart.

You can try the -e option to write errors to a file; you will then see which
records in the file provoke this error. I need, however, to add the caveat
that not all errors get listed in the error file, and I don't remember
whether this particular error does.

The two most plausible reasons for the error are:
1) the file has data that does not fit the table columns.
2) there is an error with your delimiters, so that BCP gets out of sync.

If that does not help, post the CREATE TABLE statement for the table and the
exact command line for BCP you are using. If you use a format file, please
also include the format file. Finally, include a sample of the input file;
best is if the sample reproduces the error message. If the lines in the
input file exceed 80 characters, include the file as an attachment, so it
does not get wrecked in news transport.
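
For the BULK INSERT statement used earlier in this thread, SQL Server 2005
offers a similar facility through the ERRORFILE option; a minimal sketch
with hypothetical paths:

BULK INSERT funds FROM 'C:\data\myData.dat'
WITH (FIELDTERMINATOR='\t', ROWTERMINATOR='\r\n',
      ERRORFILE='C:\data\myData_errors.log',   -- rejected rows are written here
      MAXERRORS=10);                           -- keep going past a few bad rows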

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Aug 8 '06 #4
Ted
Thanks Erland,

To deal with the quotes (because your example doesn't show what happens
if the first column in the file is quoted text), I tried the data
import wizard from within SQL Server Management Studio. For most of
the data files/table combinations, it worked well. However, there are
issues, especially related to nulls.

1) With one file, some of which goes into one table and some goes into
another, the last column contains null values, and consequently the
load fails. The key error seems to be:

Error 0xc020901c: Data Flow Task: There was an error with input column
"Dist_Unit" (67) on input "Destinatio n Input" (51). The column status
returned was: "The value could not be converted because of a potential
loss of data.".
(SQL Server Import and Export Wizard)

Dist_Unit is the last column and more often than not contains null
values.

This is especially puzzling since the wizard, when asked to show a
preview, properly displays the data without complaint.

2) With a different data file/table, there are two fields containing
integers, both of which could contain nulls, and while the wizard will
import the data without complaint, it silently converts the nulls to
zero. Is there a way to tell the wizard to keep the nulls as nulls?
If so, might this fix the problem in item #1?

3) I am trying to populate a lookup table from data used in item #1.
Of course, in that file, there will be multiple occurrences of most
supplier code/supplier name pairs (one for each product supplied by the
supplier). This leads to the wizard complaining about violating the
primary key. Is there a way to tell the wizard to ignore duplicate
records? (A possible workaround is sketched after this list.)

4) Each time I tried the wizard, I told it to store a package on the
server. However, I can't seem to find these packages. Where should I
be looking for them, and can I tell SQL Server Management Studio to
export the packages as scripts I can invoke from the command line?
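
For item #3, I suppose one workaround would be to load the raw file into a
staging table and then copy only distinct, not-yet-present pairs into the
lookup table. A rough sketch with hypothetical table and column names,
assuming each supplier code maps to a single name:

-- Permissive staging table for the raw file.
CREATE TABLE suppliers_staging (
    supplier_code varchar(20)  NULL,
    supplier_name varchar(100) NULL
    -- remaining columns from the data file would go here
);

-- Copy one row per supplier into the lookup table.
INSERT INTO suppliers (supplier_code, supplier_name)
SELECT DISTINCT s.supplier_code, s.supplier_name
FROM suppliers_staging s
WHERE s.supplier_code IS NOT NULL
  AND NOT EXISTS (SELECT 1 FROM suppliers t
                  WHERE t.supplier_code = s.supplier_code);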

Thanks

Ted

Aug 8 '06 #5
Ted
BTW: I examined the problematic data files using Open Office's Writer,
configured to show non-printable characters, and invariably the number
of fields is correct, with the right number of tabs.

Ted

Aug 8 '06 #6
Ted
I solved the problem with errors (item #1 in my previous post) by more
carefully specifying the data type of the input file columns. But this
leads to an equally serious problem. All the nulls in that column are
silently converted into zero. This represents a major distortion of
the meaning of the column. For this column, zero carries a very
different meaning from null. Worse, since the column had many records
in which the value was zero, it is not possible after the insert to
recover the nulls!

How can I tell the data import wizard to preserve my nulls?

Ted

Aug 8 '06 #7
Ted (r.*********@rogers.com) writes:
To deal with the quotes (because your example doesn't show what happens
if the first column in the file is quoted text), I tried the data
import wizard from within SQL Server Management Studio. For most of
the data files/table combinations, it worked well. However, there are
issues, especially related to nulls.
Wait a minute, last night you were using BULK INSERT, now you are using
the Import Wizard which uses SQL Integration Services that I know next
to nothing about. So I cannot assist with that part.
4) Each time I tried the wizard, I told it to store a package on the
server. However, I can't seem to find these packages. Where should I
be looking for them, and can I tell SQL Server Management Studio to
export the packages as scripts I can invoke from the command line?
As I understand it, Mgmt Studio does not offer any interface to
Integration Services. To this end you should use Business Intelligence
Development Studio.
I solved the problem with errors (item #1 in my previous post) by more
carefully specifying the data type of the input file columns. But this
leads to an equally serious problem. All the nulls in that column are
silently converted into zero.
Just a stupid check: you don't happen to have a default of 0 on those
columns?
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Aug 8 '06 #8
Ted

Erland Sommarskog wrote:
Ted (r.*********@rogers.com) writes:
To deal with the quotes (because your example doesn't show what happens
if the first column in the file is quoted text), I tried the data
import wizard from within SQL Server Management Studio. For most of
the data files/table combinations, it worked well. However, there are
issues, especially related to nulls.

Wait a minute, last night you were using BULK INSERT, now you are using
the Import Wizard which uses SQL Integration Services that I know next
to nothing about. So I cannot assist with that part.
Actually, I have been experimenting with BULK INSERT, bcp, and the
import wizard simultaneously. I find some inconsistency in how they
work. With BULK INSERT, it seems I can tell it to preserve my nulls,
and I haven't found out how to do that with the wizard (which is
accessible in SQL Server Management Studio by selecting the database
and then, from the popup menu, selecting either import data or export
data).

OTOH, the wizard lets me specify in the second dialog that the text
fields are enclosed by quotes, while it seems I may only be able to do
that for bcp or BULK INSERT by creating a format file.
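
A sketch of what such a format file might look like when the first field is
itself quoted (hypothetical column names; assuming a tab-delimited file
whose first and third fields are quoted text and whose second is numeric):
declare an extra leading field whose terminator is the opening quote and
map it to table column 0 so it is skipped.

8.0
4
1 SQLCHAR 0 0 "\""      0 dummy ""
2 SQLCHAR 0 0 "\"\t"    1 col1  ""
3 SQLCHAR 0 0 "\t\""    2 col2  ""
4 SQLCHAR 0 0 "\"\r\n"  3 col3  ""

The first field then consumes nothing but the opening quote, and the real
first column's terminator becomes closing-quote-plus-tab.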
4) Each time I tried the wizard, I told it to store a package on the
server. However, I can't seem to find these packages. Where should I
be looking for them, and can I tell SQL Server Management Studio to
export the packages as scripts I can invoke from the command line?

As I understand it, Mgmt Studio does not offer any interface to
Integration Services. To this end you should use Business Intelligence
Development Studio.
It is accessible in Mgmt Studio through the popup menu on each database
on the server.
I solved the problem with errors (item #1 in my previous post) by more
carefully specifying the data type of the input file columns. But this
leads to an equally serious problem. All the nulls in that column are
silently converted into zero.

Just a stupid check: you don't happen to have a default of 0 on those
columns?
No. On that particular table, there are no default values.

Thanks

Ted

Aug 8 '06 #9
many thanks...
my problem has been solved.

tunc

Aug 9 '06 #10
