Bytes | Software Development & Data Engineering Community

The connection is dead

cj
I left a program running Thursday when I left for the holiday. It was
filling a dataset with just over 3 million records from a table across a
VPN connection. Estimates I made from other large, but not this large,
tables suggested it would take approximately 18 hours to run. I just
checked in on it, and it had failed in the Fill command. The error
message it gave me was "the connection is dead". Given the length of
time it was to take, I don't doubt something happened, but nothing
should have. Is there anything I can do to make OdbcDataAdapter.Fill
less likely to have these problems?
Dec 24 '05 #1

Maybe a glitch in the line?
I can't imagine why you would want to transfer such a huge amount of
data. I also work with databases that contain millions of records, but
I've never encountered a situation where I needed to transfer that much
into a dataset.

regards

Michel Posseth [MCP]



Dec 24 '05 #2
cj,

I don't think this is abnormal. In this rare case I would set the
connection timeout to zero, which means infinite. (Do not do that normally.)

http://msdn.microsoft.com/library/de...meouttopic.asp
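A minimal sketch of what this looks like in VB.NET. Note that the connection timeout generally governs only how long opening the connection may take; for a Fill that runs for hours, it is usually the command timeout on the adapter's SelectCommand that expires. The DSN and query below are placeholders, not the poster's actual settings:

```vbnet
' Sketch only: "MyInformixDsn" and the query are placeholder names.
Dim conn As New System.Data.Odbc.OdbcConnection("DSN=MyInformixDsn")
Dim da As New System.Data.Odbc.OdbcDataAdapter("SELECT * FROM bigtable", conn)

' CommandTimeout = 0 means "wait indefinitely" for the query to finish.
' Do not leave it at 0 in normal code.
da.SelectCommand.CommandTimeout = 0

Dim dt As New DataTable()
da.Fill(dt)   ' Fill opens and closes the connection itself
```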

Cor
Dec 24 '05 #3
cj,

I don't think I would want to load 3 million records at once; I would
load them in blocks. The data adapter's Fill method has an overload
which allows you to specify the start record and the number of records
to load. Here is a simple example that loads data in blocks from an
Access database; it will work with the OdbcDataAdapter also. I did not
include it in the sample, but you could also show a progress bar to
indicate status. Try something like this.

Dim da As OleDbDataAdapter
Dim conn As OleDbConnection
Dim ds As New DataSet
Dim strConn As String
Dim cmd As OleDbCommand
Dim x, numRec As Long

strConn = "Provider = Microsoft.Jet.OLEDB.4.0;"
strConn &= "Data Source = c:\Northwind.mdb;"

conn = New OleDbConnection(strConn)
        cmd = New OleDbCommand("Select count(ProductName) From Products", conn)
Try
da = New OleDbDataAdapter("Select * from Products", conn)

conn.Open()

numRec = CLng(cmd.ExecuteScalar)
conn.Close()

For x = 0 To numRec Step 10
da.Fill(ds, x, 10, "Products")
Next
Catch ex As Exception
Trace.WriteLine(ex.ToString)
End Try

http://msdn.microsoft.com/library/de...filltopic6.asp

Ken
-------------------------

Dec 24 '05 #4
cj
I'm moving all the tables in a remote Informix database to a local SQL
Server database. Fill via the OdbcDataAdapter and update with the
SqlDataAdapter was the best way I could come up with. Don't even go
there. You were about to tell me to use the database's migration tools,
weren't you? :) Can't. Don't have time to explain.

Still, there's a nice feeling knowing that the computer is spending a
lot of time on this project, especially after all the work it's put me
through. Let it work over Christmas. Otherwise it'd be wasted processing
potential anyway.


Dec 24 '05 #5
cj
Will give it a try. Thanks!

Dec 24 '05 #6
cj
You've got a valid point, one I considered, but I decided to try all
records at once anyway. Your suggestion is my backup plan. I do wish
the DataTable allowed me to specify a start record and number of records
to fill like the DataSet does, but it doesn't. I'm using a DataTable at
the moment, but it's not hard to change it to use a DataSet. That is the
backup plan. Thanks for the input.

Dec 24 '05 #7
> You were about to tell me to use the database's migration tools,
> weren't you :) -- can't

Nope...

I have been there. However, in my case it was a 7.5-gigabyte MySQL dump
database that I needed to import into SQL Server 2000. I tried
everything to automate the process, but in the end I succeeded with a
processing time of about 30 minutes.

My solution was to dump the database to a MySQL dump, which is text
based, then write a VB.NET program that ran through this file, converted
the MySQL dialect to MSSQL dialect, ran the DDL SQL, and batch-inserted
a few hundred records at a time into MSSQL.

This was much faster than an ODBC connection from MySQL to MSSQL; that
approach took a whole weekend to complete.

Maybe this gives you some ideas :-)

regards

Michel Posseth [MCP]

Dec 24 '05 #8
Hi,

The Informix ADO.NET provider might give better performance than the
ODBC provider.

http://www-128.ibm.com/developerwork...ity/index.html

Ken
-------------------------

Dec 24 '05 #9
cj
I added MyOdbcConnection.ConnectionTimeout = 0.

4 hours, 53 minutes, 54 seconds and 488,556 records later, the fill
aborted with "connection dead".

Any ideas before I do as Ken Tucker recommended and rewrite it to fill
in increments?

Dec 25 '05 #10
Yes,

At least put your handling in a Try block and log what happens when it
stops. In my opinion, more should be shown than just "connection dead".

Cor
Dec 25 '05 #11
cj,

Are you sure the server doesn't have something like a "close all
connections" step built in for backups or something like that?

Cor
Dec 25 '05 #12
cj
It might, but I don't see a pattern that would suggest that.

Here's what I'm doing when the error occurs. I don't know how to collect
any more data on the event.

    Try
        MyOdbcAdapter.Fill(MyDt)
    Catch ex As Exception
        stopTime = Now
        StopTLbl.Text = Format(stopTime, "hh:mm:ss:ffff tt")
        ElapsedTLbl.Text = DateDiff(DateInterval.Second, tStartTime, stopTime) & " seconds"
        ReadLbl.Text = MyDt.Rows.Count()
        MessageBox.Show("Fill error: " & ex.Message, "Copy Aborted.")
        Exit Sub
    End Try

I think it's time for me to rewrite the program to copy the records like
Ken Tucker suggested. I'll probably have a better chance loading maybe
1,000 records at a time.

I wonder if my computer can even handle building a dataset of 3 million
records. Is it trying to keep them all in memory? That might be the
problem. Of course, if it is, I would have hoped it would tell me "out
of memory" or something like that instead of "connection dead".

I don't know if you celebrate Christmas or not. If so, Merry Christmas;
if not, Happy Holidays. I'll probably wait till I get back to the office
on Tuesday to rewrite the copy. Thanks for your help.

Dec 25 '05 #13
Hi

From your description, it seems you are using an ODBC database (e.g.
Access). Access is commonly used in small-scale environments, and it is
not good practice to return such a large number of records in one query.

You might try SQL Server or another large-scale database solution and
redesign your application so it does not return so many records from one
query. The 3 million records will probably not all be used at the same
time. If you need to do further operations on the returned records, I
suggest putting that logic on the SQL Server side as a stored procedure.

Thanks!

Best regards,

Peter Huang
Microsoft Online Partner Support

Get Secure! - www.microsoft.com/security
This posting is provided "AS IS" with no warranties, and confers no rights.

Dec 26 '05 #14
>I wonder if my computer can even handle building a dataset of 3 million
records. Is it trying to keep them all in memory? That might be the
problem. Of course if it is I would have hoped it would tell me out of
memory or something like that instead of connection dead.
Probably not...

A while ago we had a nice thread about the maximum size of the String
data type:

http://groups.google.com/group/micro...17429665522020

I don't know how many characters your rows contain, but I guess you are
probably hitting the ceiling, unless you are running this on an x64 or
IA64 system with gigabytes of memory.

regards

Michel Posseth [MCP]


Dec 26 '05 #15
cj
I'm moving all the tables in a remote Informix database to a local SQL
Server database. Fill via the OdbcDataAdapter and update with the
SqlDataAdapter was the best way I could come up with.


Dec 27 '05 #16
cj
I rewrote the program to copy in blocks of 50 records, with a progress
bar to look pretty. My only problem is that copying the 750-record test
table went from 27 seconds to 245 seconds! It might even get worse. I
had opened the connection to the ODBC database before checking the
record count and left it open during all the fill iterations, rather
than closing the connection after checking the record count and then
allowing Fill to reopen and close it each time. I fear I need to let it
close after each fill; after all, having the connection go "dead" would
seem to be related to how long it stayed open. I'll do some adjusting.
I'll go from 50 records per block to 1,000 and see how that responds.
Most of my tables only have 10 or so records, but one has almost a
million, another about 1.5 million, and the biggest 3 million. Any
suggestion would be appreciated.

Dec 27 '05 #17
Hi

Thanks for your quick update!

If you can migrate all the data into SQL Server, you can use the SQL
adapter. Also, as I said before, it would be better to redesign your
program; it is not good practice to return such a large number of
records in one query.

ADO.NET 2.0 also introduces some new features which may help you, such
as asynchronous commands.
ADO.NET 2.0 Feature Matrix
http://msdn.microsoft.com/library/de...us/dnvs05/html/ado2featurematrix.asp

Best regards,

Peter Huang
Microsoft Online Partner Support


Dec 28 '05 #18
CJ,

Why are you so stuck on ODBC? Don't you have an OLE DB provider for
that database?

Cor
Dec 28 '05 #19
cj
I'm sorry, I think you're getting confused. THIS PROGRAM IS BEING USED
TO MIGRATE THE DATA from Informix to SQL Server. And before you suggest
it, I can NOT use any Informix DB tools or any other tool that would
require loading or running something on the Informix server, and I don't
have time to explain why.


Dec 28 '05 #20
cj
Configuring OLE DB access to the Informix server would require
additional setup on the Informix server. Corporate politics is the short
answer as to why that will not happen.

If I can read 10 records, I should be able to read 3 million. I tried
last night reading 1,000 at a time, with the connection closed between
each fill. At 17,000 records I got "Connection Dead".

Dec 28 '05 #21
cj
I've added my own looping around the Try/Catch structure. I give it 50
tries with a 4-second break between tries. To keep speed high, I batch
50,000 records at a time. I am hopeful that by tomorrow at this time the
entire 3 million will have been moved.
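A sketch of the retry loop described above: 50,000-record blocks, up to 50 attempts per block, with a 4-second pause between attempts. MyOdbcAdapter, MyDs, and currentTable are the names from the earlier posts; everything else is assumed:

```vbnet
' Sketch of the batching/retry approach; block size and retry counts
' match the description above.
Const BlockSize As Integer = 50000
Const MaxTries As Integer = 50

Dim start As Integer = 0
Dim fetched As Integer

Do
    Dim tries As Integer = 0
    Do
        Try
            ' Fill appends up to BlockSize rows starting at offset 'start'
            ' and returns the number of rows it actually added.
            fetched = MyOdbcAdapter.Fill(MyDs, start, BlockSize, currentTable)
            Exit Do
        Catch ex As Exception
            tries += 1
            If tries >= MaxTries Then Throw        ' give up after 50 attempts
            System.Threading.Thread.Sleep(4000)    ' 4-second break between tries
        End Try
    Loop
    start += fetched
Loop While fetched = BlockSize   ' a short block means we reached the end
```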

Dec 28 '05 #22
Hi

If you just want to migrate the data from Informix to SQL Server, you
can use SQL Server's DTS feature. The DTS package runs on the SQL Server
side.
DTS Import/Export Wizard
http://msdn.microsoft.com/library/de...us/dtssql/dts_tools_wiz_8vsj.asp

For detailed information about SQL DTS, you can post in the newsgroup
below:
microsoft.public.sqlserver.dts

Thanks for your understanding!

Best regards,

Peter Huang
Microsoft Online Partner Support


Dec 29 '05 #23



Dump the Informix data to an ASCII text file. Use QBasic to rewrite that
file into one compatible with SQL Server. Import it.

I've made money many times doing just that. You'll save more time using
QBasic than by trying to get a working program running in VB.

Dec 30 '05 #24



Dump 15000 at a time. Repeat.

Dec 30 '05 #25
cj
I do not have access to the informix db software to do anything like that.


Dec 30 '05 #26
cj
Interesting. I'll look into it.


Dec 30 '05 #27



You only need access to the data, not the code. If you have no access to the
data, how can you migrate it?

Dec 31 '05 #28
Hi ,

I appreciate your prompt response. :)

I understand your situation; please take your time to perform the steps
I have provided. If anything is unclear, please feel free to post back.
I am always happy to be of assistance.

Happy New Year!

Best regards,

Peter Huang
Microsoft Online Partner Support


Jan 3 '06 #29
cj
Perhaps I'm missing something; how do I dump the tables into an ASCII file?

Jan 3 '06 #30
cj
Peter,

is DTS part of SQL Server 2000?

Jan 3 '06 #31
cj
I didn't get your message until now. I left it Friday requesting 10,000
records at a time. Given my loop code, if any .Fill fails, the program
pauses for 10 seconds and then runs .Fill again. If it tries more than
500 times without success, the program aborts. I keep a cumulative
counter of the times the program had to retry a fill. This morning I had
only gotten 320,000 records, and the retry loop had executed 20 times.
This is ultra-slow retrieval, and it has led me to wonder about
something I'd seen in a test Friday. Take right now, for instance: the
program is requesting .Fill(MyDs, 320000, 10000, currentTable). It seems
to take much, much longer than if it requested .Fill(MyDs, 0, 10000,
currentTable). Do you suppose it is reading through the database
sequentially up to 320,000 in order to start? Is the fill getting
incrementally slower each time? I'm at a loss to explain how I can get
the first 60,000 records in one fill statement in 20 minutes, but it's
taken almost 4 days to retrieve 320,000 records with my 10,000-at-a-time
filling (even given that it had to retry blocks of 10,000 some 20 times
over that time).
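That is in fact how the Fill(startRecord, maxRecords) overload behaves: it pages on the client, so the query still returns every row and the adapter merely discards the first startRecord of them, which makes each later block cost more than the last. If the table has a unique, ordered key, paging in the WHERE clause avoids the re-read. A hedged sketch; the column "id", table "bigtable", and existing connection conn are hypothetical, and FIRST n is Informix's row-limit syntax:

```vbnet
' Keyset-paging sketch: each query asks the server only for the next
' block, so later blocks do not re-read everything before them.
Dim lastKey As Long = 0
Dim dt As New DataTable()
Dim fetched As Integer

Do
    Dim sql As String = "SELECT FIRST 10000 * FROM bigtable " & _
                        "WHERE id > " & lastKey & " ORDER BY id"
    Dim da As New System.Data.Odbc.OdbcDataAdapter(sql, conn)
    fetched = da.Fill(dt)   ' appends this block's rows to dt
    If fetched > 0 Then
        ' Remember the highest key seen so the next query starts after it.
        lastKey = CLng(dt.Rows(dt.Rows.Count - 1)("id"))
    End If
Loop While fetched = 10000
```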


Jan 3 '06 #32
Hi

Yes, it is shipped with SQL Server 2000 but not with the MSDE version.

Best regards,

Peter Huang
Microsoft Online Partner Support


Jan 4 '06 #33
cj
Peter,

I have the MSDN CD that says it has SQL Server 2000 Developer Edition
and SQL Server 2000 SP4 on it. Is MSDE the same as the Developer Edition?
Do you think the SQL Server 2000 SP4 is just a patched Developer Edition,
or is it the full SQL Server?

I currently have SQL Express installed on my PC but I am sending this
data to a SQL Server machine that someone else here administers.
Peter Huang [MSFT] wrote:
Hi

Yes, it is shipped with SQL Server 2000 but not with the MSDE version.


Jan 4 '06 #34
Hi

Based on my knowledge, SQL Server 2000 Developer Edition is not the same
as MSDE; the Developer Edition should include DTS.
And the SQL Server SP4 is just a patch.
Because I am not an expert on SQL Server, I suggest posting in a SQL Server
newsgroup, so that the SQL experts there can help you,
e.g.
microsoft.public.sqlserver.dts

Thanks for your understanding!

Best regards,

Peter Huang
Microsoft Online Partner Support

Get Secure! - www.microsoft.com/security
This posting is provided "AS IS" with no warranties, and confers no rights.

Jan 5 '06 #35

This thread has been closed and replies have been disabled. Please start a new discussion.


By using Bytes.com and it's services, you agree to our Privacy Policy and Terms of Use.

To disable or enable advertisements and analytics tracking please visit the manage ads & tracking page.