Bytes IT Community

TransferText dropping fractions

I've got a csv that I am trying to import into a SQL Server (2000)
table through an adp (Access 2000). The form used gives the user the
ability to browse for a file, then will import the file into a temp
table for processing using this vba code:

DoCmd.TransferText acImportDelim, , "tmpPaTimeClock2", Me!TimeFileName,
False

When I run the import through the code, the import is dropping any
values less than one (apparently). I've got a couple of columns that
have .5 for the value, but these are brought in as 0. Other decimal
places are imported correctly (e.g. 532.28 comes in as 532.28).

I've tried changing the datatype in the temp table, but that doesn't
seem to make a difference.
I've tried letting the import create a new table, but that doesn't fix
the problem either.

If I import the file directly through the adp (right click - import)
all of the values are brought in correctly.

Anyone know how to fix this, or an alternative? I did some reading on
bcp and Bulk Copy, but I couldn't see an easy way to allow for the
variable path name to the file.

Dec 20 '05 #1
12 Replies


Have you tried using a saved import spec with the data types defined?

--
Danny J. Lesandrini
dl*********@hotmail.com
http://amazecreations.com/datafast
<ck****@mindspring.com> wrote ...
I've got a csv that I am trying to import into a SQL Server (2000)
table through an adp (Access 2000). The form used gives the user the
ability to browse for a file, then will import the file into a temp
table for processing using this vba code:

DoCmd.TransferText acImportDelim, , "tmpPaTimeClock2", Me!TimeFileName, False

Dec 20 '05 #2

Danny's probably more on the ball than I am here, but after trying
his suggestion I would consider changing some of your .5 values to 0.5
and seeing how they import.

Dec 20 '05 #3

I don't seem to get the option to save the import spec when I define
one. I assumed that import specs can only be saved when dealing with
an .mdb. I could be wrong on that though.

Dec 20 '05 #4

ck****@mindspring.com wrote:
I don't seem to get the option to save the import spec when I define
one. I assumed that import specs can only be saved when dealing with
an .mdb. I could be wrong on that though.

Click on your Advanced Button when defining specs.
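If the wizard in your ADP does offer Save As under the Advanced dialog
(the spec name below is made up for illustration), the saved spec's name
then goes in the second argument of TransferText:

```vba
' Import using a saved spec named "TimeClockImportSpec" (hypothetical),
' which carries the column data types, so .5 should survive the import.
DoCmd.TransferText acImportDelim, "TimeClockImportSpec", _
    "tmpPaTimeClock2", Me!TimeFileName, False
```

This is just a sketch of how the spec argument is wired up; whether the
spec can actually be saved in an ADP is the open question in this thread.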
Dec 21 '05 #5

(ck****@mindspring.com) writes:
Anyone know how to fix this, or an alternative? I did some reading on
bcp and Bulk Copy, but I couldn't see an easy way to allow for the
variable path name to the file.


For BCP the file name is just part of the command line, so it should be
no problem to build that command line with a variable path.
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Dec 22 '05 #6

After a bit of reading on bcp, it appears that Bulk Insert is a good
fit for what I am trying to accomplish (and I can run it through
T-SQL!!)
But I'm getting an error that seems to indicate that the OS can't find
the file.

Here's the code I'm using:

BULK INSERT dbo.tmpPaTimeClock
FROM 'C:\Program Files\Traverse CWIN\PER_HRS1.CSV'

Error:

Server: Msg 4861, Level 16, State 1, Line 1
Could not bulk insert because file 'C:\Program Files\Traverse
CWIN\PER_HRS1.CSV' could not be opened. Operating system error code
3 (The system cannot find the path specified.).

That path is on my local machine. Does Bulk Insert operate from the
server box, so the path I enter has to be a path to a folder on the
machine where the SQL Server is running?

Dec 27 '05 #7

(ck****@mindspring.com) writes:
After a bit of reading on bcp, it appears that Bulk Insert is a good
fit for what I am trying to accomplish (and I can run it through
T-SQL!!)
But I'm getting an error that seems to indicate that the OS can't find
the file.

Here's the code I'm using:
BULK INSERT dbo.tmpPaTimeClock FROM 'C:\Program Files\Traverse
CWIN\PER_HRS1.CSV'
Error :
Server: Msg 4861, Level 16, State 1, Line 1
Could not bulk insert because file 'C:\Program Files\Traverse
CWIN\PER_HRS1.CSV' could not be opened. Operating system error code
3(The system cannot find the path specified.).
That path is on my local machine.. Does Bulk Insert operate from the
Server box, so the path I enter has to be a path to a folder on the
machine where the SQL Server is running?


Yes, that's it. SQL Server is a server, and all file specifications are
in the context of the server. If your disk can be reached through a UNC
path from the server, it can still read from your box. (Provided that
SQL Server is running under a domain user, I should add.)
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Dec 27 '05 #8

The file is actually stored on the server, so the path shouldn't be a
problem.

However, I'm getting an error returned from this Bulk Insert command:

BULK INSERT dbo.tmpPaTimeClock
FROM '\\Csi\c\Tmp\PER_HRS1.CSV'
WITH (DATAFILETYPE = 'char', FIRSTROW = 1)

Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1,
column 1. Make sure the field terminator and row terminator are
specified correctly.
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'STREAM' reported an error. The provider did not give
any information about the error.
OLE DB error trace [OLE/DB Provider 'STREAM' IRowset::GetNextRows
returned 0x80004005: The provider did not give any information about
the error.].
The statement has been terminated.

Here's the first couple of rows from the import file:

Employee,Range of timeclock poll,,Earning
Codes,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
96,11/29/2005,12/5/2005,REGULAR,OVERTIME,DBL-TIME,SICK,HOLIDAY,VACATION,SAT
OT,MON OT,JURY,HOL
OT,BRVMNT.,PD12,PD13,PD14,PD15,PD16,PD17,PD18,PD19 ,PD20,PD21,PD22,PD23,PD24,PD25,PD26,PD27,PD28,PD29 ,PD30
,6790,6790,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0 ,0,0,0,0,0,0,0,0,0,0
"ACEVEDO, JULIO
",1388,3107,40,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0 ,0,0,0,0,0,0,0,0,0,0,0
"AGULAR, MARTHA
",1401,4401,40,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0 ,0,0,0,0,0,0,0,0,0,0,0

When I alter the command to:

BULK INSERT dbo.tmpPaTimeClock
FROM '\\Csi\c\Tmp\PER_HRS1.CSV'
WITH (DATAFILETYPE = 'char', FIRSTROW = 3)

the error goes away (skipping the first 2 rows), but no records are
imported???

Dec 27 '05 #9

(ck****@mindspring.com) writes:
The file is actually stored on the server, so the path shouldn't be a
problem.

However, I'm getting an error returned from this Bulk Insert command:
BULK INSERT dbo.tmpPaTimeClock
FROM '\\Csi\c\Tmp\PER_HRS1.CSV' WITH (DATAFILETYPE = 'char', FIRSTROW
=1)
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1,
column 1. Make sure the field terminator and row terminator are
specified correctly.


How about reading the error message? :-) It says: "Make sure the field
terminator and row terminator are specified correctly." Did you do
that? The default field terminator is tab, not comma.

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Dec 27 '05 #10

Thanks for the 'tough love'. I modified the BULK INSERT to:

BULK INSERT dbo.tmpPaTimeClock
FROM Path...
WITH (DATAFILETYPE = 'char'
, FIRSTROW = 3
, FIELDTERMINATOR = ','
, ROWTERMINATOR = '\n')

and that did the trick. It splits the first column in two because of the
comma separating the names, but that's really not a problem for me,
since I'm not using those fields. But if anyone knows how to prevent
that, I'd like to learn.
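To get back to the original requirement of a variable path from the
form: BULK INSERT does not accept a variable for the file name, so one
approach (a sketch only; the procedure name and sizes here are
assumptions, and concatenating a path into dynamic SQL carries the
usual quoting caveats) is to wrap it in a stored procedure that builds
the statement as a string:

```sql
CREATE PROCEDURE dbo.usp_ImportTimeClock
    @FilePath varchar(255)
AS
DECLARE @sql varchar(1000)
-- BULK INSERT cannot take @FilePath directly, so build the
-- statement as a string and execute it (SQL 2000 syntax).
SET @sql = 'BULK INSERT dbo.tmpPaTimeClock ' +
           'FROM ''' + @FilePath + ''' ' +
           'WITH (DATAFILETYPE = ''char'', FIRSTROW = 3, ' +
           'FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
EXEC (@sql)
```

From the ADP form you could then run something like
CurrentProject.Connection.Execute "EXEC dbo.usp_ImportTimeClock '" &
Me!TimeFileName & "'" to pass the browsed-for path through.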

Dec 28 '05 #11

I spoke too soon. The name field isn't uniform: some records have
double quotes around it, with a comma splitting the name, and others
don't, so those records throw the columns off by one.

Dec 28 '05 #12

(ck****@mindspring.com) writes:
I spoke too soon. The name field isn't uniform, some records have the
double quotes around it, with a comma splitting the name, others don't,
so those are throwing the columns off by one.


That's when you need a format file. The good news is that the format file
will drop your column headers as well.

The format file for the full file is kind of boring, so I'll only give an
example for a file with 5 fields, where the first is quoted and the rest
are not. (Indentation is for formatting of the post only. All text starts
in column 1.)

8.0
6
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\"," 1 col1 ""
3 SQLCHAR 0 0 "," 2 col2 ""
4 SQLCHAR 0 0 "," 3 col3 ""
5 SQLCHAR 0 0 "," 4 col4 ""
6 SQLCHAR 0 0 "\r\n" 5 col5 ""

The trick here is that we define the file as having six fields - there is
an empty field before the first quote. SQLCHAR is the datatype in the
file, and since this is a text file, everything is SQLCHAR. (Or SQLNCHAR
if it is a Unicode file.) The next two columns are for binary files
and fixed-length fields. Then comes the field terminator. Notice that
BCP does not really have a notion of a row terminator; the row terminator
is just the field terminator for the last field.

The column that reads 0 1 2 3 4 5 is the mapping to the table columns.
0 means that that field is not imported. 1 is the first field etc. Next
column is for table-column names, but that's informational only. Finally,
the last column is where you can specify collation.

Note that the first record, as far as BCP is concerned, consists of the
column headers plus the first data row. The first field of that record
is everything up to the first ".
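Putting this together: assuming the format file above is saved
somewhere the server can read (the .fmt path below is hypothetical),
the BULK INSERT would reference it with the FORMATFILE option in place
of the terminator options:

```sql
BULK INSERT dbo.tmpPaTimeClock
FROM '\\Csi\c\Tmp\PER_HRS1.CSV'
WITH (FORMATFILE = '\\Csi\c\Tmp\PER_HRS1.fmt')
```

FIRSTROW should no longer be needed here, since the format file's first
skipped field swallows the header row along with the leading quote.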

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Dec 28 '05 #13
