Bytes | Software Development & Data Engineering Community

TransferText dropping fractions

I've got a csv that I am trying to import into a SQL Server (2000)
table through an adp (Access 2000). The form gives the user the
ability to browse for a file, then imports the file into a temp
table for processing using this VBA code:

DoCmd.TransferText acImportDelim, , "tmpPaTimeClock2", Me!TimeFileName, False

When I run the import through the code, the import apparently drops
any values less than one. I've got a couple of columns that
have .5 for the value, but these are brought in as 0. Decimal places
on larger values are imported correctly (e.g. 532.28 comes in as 532.28).

I've tried changing the datatype in the temp table, but that doesn't
seem to make a difference.
I've tried letting the import create a new table, but that doesn't fix
the problem either.

If I import the file directly through the adp (right click - import)
all of the values are brought in correctly.

Anyone know how to fix this, or an alternative? I did some reading on
bcp and Bulk Copy, but I couldn't see an easy way to allow for the
variable path name to the file.

Dec 20 '05 #1
Have you tried using a saved import spec with the data types defined?

--
Danny J. Lesandrini
dl*********@hotmail.com
http://amazecreations.com/datafast
<ck****@mindspring.com> wrote ...
DoCmd.TransferText acImportDelim, , "tmpPaTimeClock2", Me!TimeFileName, False

Dec 20 '05 #2
Danny's probably more on the ball than I am here, but after trying
his suggestion I would consider changing some of your .5 values to 0.5
and seeing how they import.

Dec 20 '05 #3
I don't seem to get the option to save the import spec when I define
one. I assumed that import specs can only be saved when dealing with
an .mdb. I could be wrong on that though.

Dec 20 '05 #4
ck****@mindspring.com wrote:
I don't seem to get the option to save the import spec when I define
one. I assumed that import specs can only be saved when dealing with
an .mdb. I could be wrong on that though.

Click on your Advanced Button when defining specs.
Dec 21 '05 #5
(ck****@mindspring.com) writes:
Anyone know how to fix this, or an alternative? I did some reading on
bcp and Bulk Copy, but I couldn't see an easy way to allow for the
variable path name to the file.


For BCP this is just part of the command line, so it should be no
problem to build that command line.
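As a minimal sketch (in Python, with hypothetical table and server names), assembling such a bcp command line from a variable file path might look like this:

```python
def build_bcp_command(csv_path, table="dbo.tmpPaTimeClock", server="MYSERVER"):
    # "in" loads the file into the table; -c = character data,
    # -t, = comma field terminator, -T = trusted (Windows) connection.
    return ["bcp", table, "in", csv_path, "-S", server, "-c", "-t,", "-T"]

cmd = build_bcp_command(r"C:\Tmp\PER_HRS1.CSV")
# Pass cmd to e.g. subprocess.run(cmd, check=True) on a machine where
# the SQL Server client tools are installed.
```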
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Dec 22 '05 #6
After a bit of reading on bcp, it appears that Bulk Insert is a good
fit for what I am trying to accomplish (and I can run it through
T-SQL!!)
But I'm getting an error that seems to indicate that the OS can't find
the file.

Here's the code I'm using:
BULK INSERT dbo.tmpPaTimeClock FROM 'C:\Program Files\Traverse CWIN\PER_HRS1.CSV'
Error:
Server: Msg 4861, Level 16, State 1, Line 1
Could not bulk insert because file 'C:\Program Files\Traverse CWIN\PER_HRS1.CSV' could not be opened. Operating system error code 3 (The system cannot find the path specified.).
That path is on my local machine. Does Bulk Insert operate from the
server box, so that the path I enter has to be a folder on the
machine where SQL Server is running?

Dec 27 '05 #7
(ck****@mindspring.com) writes:
That path is on my local machine. Does Bulk Insert operate from the
server box, so that the path I enter has to be a folder on the
machine where SQL Server is running?


Yes, that's it. SQL Server is a server, and all file specifications are
in the context of the server. If your disk can be reached through a UNC
path from the server, it can still read from your box. (Provided that
SQL Server is running under a domain user, I should add.)
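As a rough illustration (machine name hypothetical), a local drive path can be mapped to an administrative-share UNC path that the server could try to read:

```python
def to_unc(local_path, machine):
    # C:\Tmp\x.csv -> \\machine\c$\Tmp\x.csv. Assumes the administrative
    # share is reachable and the SQL Server service account may read it.
    drive, rest = local_path.split(":", 1)
    return "\\\\{0}\\{1}${2}".format(machine, drive.lower(), rest)

print(to_unc(r"C:\Tmp\PER_HRS1.CSV", "MYBOX"))  # \\MYBOX\c$\Tmp\PER_HRS1.CSV
```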
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Dec 27 '05 #8
The file is actually stored on the server, so the path shouldn't be a
problem.

However, I'm getting an error returned from this Bulk Insert command:
BULK INSERT dbo.tmpPaTimeClock
FROM '\\Csi\c\Tmp\PER_HRS1.CSV' WITH (DATAFILETYPE = 'char', FIRSTROW = 1)
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'STREAM' reported an error. The provider did not give
any information about the error.
OLE DB error trace [OLE/DB Provider 'STREAM' IRowset::GetNextRows
returned 0x80004005: The provider did not give any information about
the error.].
The statement has been terminated.

Here's the first couple of rows from the import file:

Employee,Range of timeclock poll,,Earning Codes,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
96,11/29/2005,12/5/2005,REGULAR,OVERTIME,DBL-TIME,SICK,HOLIDAY,VACATION,SAT OT,MON OT,JURY,HOL OT,BRVMNT.,PD12,PD13,PD14,PD15,PD16,PD17,PD18,PD19,PD20,PD21,PD22,PD23,PD24,PD25,PD26,PD27,PD28,PD29,PD30
,6790,6790,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
"ACEVEDO, JULIO",1388,3107,40,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
"AGULAR, MARTHA",1401,4401,40,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0

When I alter the command to:
BULK INSERT dbo.tmpPaTimeClock
FROM '\\Csi\c\Tmp\PER_HRS1.CSV' WITH (DATAFILETYPE = 'char', FIRSTROW = 3)
the error goes away (skipping the first 2 rows), but no records are
imported???

Dec 27 '05 #9
(ck****@mindspring.com) writes:

However, I'm getting an error returned from this Bulk Insert command:
BULK INSERT dbo.tmpPaTimeClock
FROM '\\Csi\c\Tmp\PER_HRS1.CSV' WITH (DATAFILETYPE = 'char', FIRSTROW = 1)
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.


How about reading the error message? :-) It says: "Make sure the field
terminator and row terminator are specified correctly." Did you do
that? No: the default field terminator is tab, so a comma-separated
file comes through as one long field.

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Dec 27 '05 #10
thanks for the 'Tough Love'.. I modified the BULK INSERT to:

BULK INSERT dbo.tmpPaTimeClock
FROM Path... WITH (DATAFILETYPE = 'char'
, FIRSTROW = 3
, FIELDTERMINATOR = ','
, ROWTERMINATOR = '\n')

and that did the trick. It splits the first column in two because of the
comma separating the names, but that's really not a problem for me,
since I'm not using those fields. But if anyone knows how to prevent
that, I'd like to learn.
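As an aside (an illustrative sketch, not part of the import itself), Python's csv module shows the difference between a naive comma split and quote-aware parsing of such a row:

```python
import csv
import io

row = '"ACEVEDO, JULIO",1388,3107,40\r\n'

naive = row.strip().split(",")               # quote-unaware: the name breaks in two
proper = next(csv.reader(io.StringIO(row)))  # quote-aware: the name stays whole

print(naive)   # ['"ACEVEDO', ' JULIO"', '1388', '3107', '40']
print(proper)  # ['ACEVEDO, JULIO', '1388', '3107', '40']
```

BULK INSERT's plain FIELDTERMINATOR behaves like the naive split, which is why the quoted name ends up in two columns.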

Dec 28 '05 #11
I spoke too soon. The name field isn't uniform: some records have
double quotes around it, with a comma splitting the name, and others
don't, so those rows throw the columns off by one.

Dec 28 '05 #12
(ck****@mindspring.com) writes:
I spoke too soon. The name field isn't uniform, some records have the
double quotes around it, with a comma splitting the name, others don't,
so those are throwing the columns off by one.


That's when you need a format file. The good news is that the format file
will drop your column headers as well.

The format file for the full file is kind of boring, so I only give an
example for a file with 5 fields, where the first is quoted and the rest
are not. (Indentation is for formatting of the post only; all text starts
in column 1.)

8.0
6
1  SQLCHAR  0  0  "\""    0  ""    ""
2  SQLCHAR  0  0  "\","   1  col1  ""
3  SQLCHAR  0  0  ","     2  col2  ""
4  SQLCHAR  0  0  ","     3  col3  ""
5  SQLCHAR  0  0  ","     4  col4  ""
6  SQLCHAR  0  0  "\r\n"  5  col5  ""

The trick here is that we define the file as having six fields: there is
an empty field before the first quote. SQLCHAR is the datatype in the
file, and since this is a text file, everything is SQLCHAR (or SQLNCHAR
if it is a Unicode file). The next two columns are for binary files
and fixed-length fields. Then comes the field terminator. Notice that
BCP does not really have a notion of a row terminator; the row terminator
is just the field terminator for the last field.

The column that reads 0 1 2 3 4 5 is the mapping to the table columns:
0 means that the field is not imported, 1 is the first table column, and
so on. The next column is for table-column names, but that's informational
only. Finally, the last column is where you can specify collation.

Note that the first record, as far as BCP is concerned, consists of the
column headers plus the first data row. The first field of that first
record is everything up to the first ".
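The field-by-field effect of those terminators can be sketched in Python (illustrative only; this is not how BCP is implemented):

```python
def split_with_terminators(record, terminators):
    # Consume the record left to right, cutting a field at each
    # terminator in turn, just as the format file describes.
    fields, pos = [], 0
    for term in terminators:
        end = record.index(term, pos)
        fields.append(record[pos:end])
        pos = end + len(term)
    return fields

terminators = ['"', '",', ',', ',', ',', '\r\n']   # from the format file above
record = '"ACEVEDO, JULIO",1388,3107,40,0\r\n'

fields = split_with_terminators(record, terminators)
# fields[0] is the empty pre-quote field that maps to table column 0 (skipped)
print(fields)  # ['', 'ACEVEDO, JULIO', '1388', '3107', '40', '0']
```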

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Dec 28 '05 #13
