
V9 Load from Cursor and multiple DB's

I am looking for comments on experience using a Load from Cursor
across multiple databases.
I have a multi-terabyte database across many partitions that includes
a large table (1 TB+). The system also contains UTF-8 and LOB data.
I am about to refresh the existing platform - going from V8.2 to V9
and leveraging new hardware. I am staying with 64-bit SuSE Linux.
Does anyone have experience using this?

My initial plan was to export from OLD_DB in parallel to a local file,
then ftp to NEW_DB, followed by load. I am thinking now that the new
capabilities of Load from CURSOR DATABASE will provide reasonable
performance and ease of implementation. Not sure about bogging down
the coordinator on OLD_DB.

Are there any known limitations that I should be aware of? How about
the relative performance of these two methodologies?
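
For concreteness, the two approaches look roughly like this - just a sketch,
with placeholder schema, table, user, and path names rather than our real ones:

  -- Approach 1: export on OLD_DB, ftp the file, then load on NEW_DB
  EXPORT TO /stage/big_table.del OF DEL
    LOBS TO /stage/lobs/ MODIFIED BY lobsinfile
    SELECT * FROM myschema.big_table;
  -- ...ftp /stage/big_table.del and the LOB files to the NEW_DB server...
  LOAD FROM /stage/big_table.del OF DEL
    LOBS FROM /stage/lobs/ MODIFIED BY lobsinfile
    INSERT INTO myschema.big_table NONRECOVERABLE;

  -- Approach 2: V9 load from cursor with remote fetch (run from NEW_DB)
  DECLARE remcurs CURSOR DATABASE OLD_DB USER loaduser USING pwd
    FOR SELECT * FROM myschema.big_table;
  LOAD FROM remcurs OF CURSOR
    INSERT INTO myschema.big_table NONRECOVERABLE;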

Is it ok that NEW_DB will be V9 and the OLD_DB will be V8.2?

Thanks for your input.

Mar 6 '07 #1
On Mar 6, 12:18 pm, "mike_dba" <michaelaaldr...@yahoo.com> wrote:
> I am looking for comments on experience using a Load from Cursor
> across multiple databases.
> [rest of original post snipped]
Mike:

Yes, we use Cross Load a lot and are very pleased with it. Not to
mention the DASD saved by not having to store staging files, and the
effort saved by not having to transfer them. If the load fails, recovery
is simple: just declare the cursor again and restart the load.
The version of the DB does not matter; a V9 database can reference a V8 database.
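
For example, something along these lines - a rough sketch only; table, user,
and path names are placeholders:

  -- on NEW_DB: clear the failed load, then simply re-run it
  -- (the input file is ignored for TERMINATE; /dev/null is a common placeholder)
  LOAD FROM /dev/null OF DEL TERMINATE INTO myschema.big_table;

  -- REPLACE starts the table from scratch rather than appending to a partial load
  DECLARE remcurs CURSOR DATABASE OLD_DB USER loaduser USING pwd
    FOR SELECT * FROM myschema.big_table;
  LOAD FROM remcurs OF CURSOR
    REPLACE INTO myschema.big_table NONRECOVERABLE;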

If your network is not an issue, expect a 10-15% impact on elapsed time
vs. loading locally.

P.S. It is extremely useful when you use Cross Load with Information
Integrator against a third-party relational DB or mainframe DB2.

Regards,
Eric

Mar 7 '07 #2
On 3/6/2007 at 7:06 PM, in message
<11**********************@h3g2000cwc.googlegroups.com>, Eric
K <rd****@yahoo.com> wrote:
> [mike_dba's original post and Eric's reply quoted in full - snipped]
Is this referring to the item in the V9 "What's New" manual under "Load
from cursor with remote fetch"? Looks interesting!

My V9 database is not up at the moment, so I can't test it. Anyway, I found
the reference to it in the Data Movement Utilities Guide and Reference, but
it doesn't look like SQL Reference, Vol. 2 has been updated with the new
DECLARE CURSOR options. Or am I just blind?

Thanks,
Frank
---
Frank Swarbrick
Senior Developer/Analyst - Mainframe Applications
FirstBank Data Corporation - Lakewood, CO USA
Mar 7 '07 #3
On Mar 7, 11:56 am, "Frank Swarbrick" <Frank.Swarbr...@efirstbank.com>
wrote:
> [earlier quoted text snipped]
> Is this referring to the item in the V9 "What's New" manual under "Load
> from cursor with remote fetch"? Looks interesting!
> [rest of Frank's reply snipped]
Yes, that is where we saw it as well - "What's New". We have done
some testing and it works nicely, but we tested only with small
files. Our big tables and the LOB and UTF-8 data give us pause. Perhaps
a refined cursor declaration (say, by time period), performing the load
iteratively, might provide better restartability for us. Network
latency is always a variable, as Eric pointed out.
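
Something like this, for instance - a rough sketch only, with a hypothetical
date column and placeholder object names:

  -- load one time slice at a time; each slice is an independent unit of work
  DECLARE c_200701 CURSOR DATABASE OLD_DB USER loaduser USING pwd
    FOR SELECT * FROM myschema.big_table
        WHERE txn_date >= '2007-01-01' AND txn_date < '2007-02-01';
  LOAD FROM c_200701 OF CURSOR
    INSERT INTO myschema.big_table NONRECOVERABLE;

  -- repeat for the next slice (c_200702, c_200703, ...); a failed slice can
  -- be terminated and re-run without redoing the slices already loaded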

Mar 7 '07 #4
Ian
mike_dba wrote:
> I am looking for comments on experience using a Load from Cursor
> across multiple databases.
> [rest of original post snipped]
Sorry if this is a silly question, but why don't you just back up the
database and then restore it on the new servers? RESTORE will do the
migration for you automatically.

Obviously this requires that you have enough space to hold all of the
backup images, but it will likely be faster than export/load.
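
Roughly like this - a sketch only, with a placeholder path and a made-up
timestamp, and in a partitioned database you would do this for each partition:

  -- on the old V8.2 server: take a backup (offline here, for simplicity)
  BACKUP DATABASE OLD_DB TO /backup COMPRESS;

  -- copy the image(s) to the new server, then on the new V9 instance:
  -- restoring into the V9 instance migrates the database automatically
  RESTORE DATABASE OLD_DB FROM /backup TAKEN AT 20070307120000 INTO NEW_DB;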

Mar 7 '07 #5
On Mar 7, 4:44 pm, Ian <ianb...@mobileaudio.com> wrote:
> mike_dba wrote:
> [original post snipped]
> Sorry if this is a silly question, but why don't you just back up the
> database and then restore it on the new servers? RESTORE will do the
> migration for you automatically.
> [rest of reply snipped]
Not a silly question at all. The target system has numerous changes -
the number of partitions, DDL changes for things like the number of
containers, MDC, etc.

Mar 7 '07 #6
