
Memory Issues in VB

P: 46
Hi all,

I am stuck in VB once again. I think VB is one of those things you can never say for sure you have completely mastered. Anyway, my issue is stated below:

I am making a program in VB which will
  • Fetch the records from a table of an Oracle 9i database.
  • The table has a column called amount and another called product_code. I have to add up the amounts for different products from every record, depending on some condition (business logic).
  • Display the results in an MSHFlexGrid

I am actually making a balance calculation program for my bank (of which I am an employee). The table from which I am fetching records contains at least 50-60 million records with, of course, properly indexed fields. The program sums up the amounts one by one, allocating each to credit or debit.

The problem lies in the second step above. When the program has fetched, say, 2-2.5 lakh records, it makes my system extremely slow. After some time, I get a message saying "Not enough memory to carry out this operation". Debugging the operation it points to reveals that it is an rs.MoveNext statement. And that's where I am stuck.

I know that I can write a complex query to fetch the results directly in the format I want, instead of fetching all the relevant records into VB. But that has two disadvantages:

  • The query takes exceedingly long (not less than 10 minutes) to complete, because it is a huge database that is live in nature. This results in an annoyed user.
  • The program itself becomes unstable and unresponsive while that complex query is executing on the database. DoEvents cannot help here, because it is a single query statement where it gets stuck, not a loop.

I hope you have understood what the problem actually is.

Thanks in anticipation.
Jan 14 '08 #1


QVeen72
Expert 100+
P: 1,445
Hi,

With such huge databases, what I have found is that stored procedures are the answer. Instead of passing ad-hoc queries to the server, just create a stored procedure, pass the parameters, and have it return the Recordset.
Since stored procedures are compiled, the results come back faster.
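
For illustration, a minimal ADO sketch of that approach (the procedure name GET_BALANCES, its parameter, and the connection are all hypothetical; with Oracle, returning a Recordset from a stored procedure typically means a REF CURSOR, and with the OraOLEDB provider you may also need PLSQLRSet=1 in the connection string):

' Assumes an open ADODB.Connection named cn (needs a reference to the
' "Microsoft ActiveX Data Objects 2.x Library").
Dim cmd As New ADODB.Command
Dim rs As ADODB.Recordset

With cmd
    Set .ActiveConnection = cn
    .CommandType = adCmdStoredProc
    .CommandText = "GET_BALANCES"                      ' hypothetical SP name
    .Parameters.Append .CreateParameter("p_branch", adVarChar, _
                                        adParamInput, 10, "001")
End With

Set rs = cmd.Execute   ' the procedure's ref cursor comes back as a Recordset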

Regards
Veena
Jan 14 '08 #2

P: 46
Hi,

With such huge databases, what I have found is that stored procedures are the answer. Instead of passing ad-hoc queries to the server, just create a stored procedure, pass the parameters, and have it return the Recordset.
Since stored procedures are compiled, the results come back faster.

Regards
Veena

I have tried it this way as well, and it has the same problem. While the stored procedure is executing, the user at the front end gets no idea of what is going on in the background, or what percentage of the job is completed and how much is remaining.
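
(One thing I may still experiment with is ADO's asynchronous execution, so the form at least stays alive and can show a "please wait" message while the server works. A minimal sketch, assuming an open ADODB.Connection named cn; the query text is only a placeholder:)

Dim rs As New ADODB.Recordset

' adAsyncExecute returns control immediately instead of blocking the UI.
rs.Open "SELECT ...", cn, adOpenForwardOnly, adLockReadOnly, _
        adCmdText Or adAsyncExecute

Do While (rs.State And adStateExecuting) = adStateExecuting
    DoEvents    ' keep the form responsive; update a progress label here
Loop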
Jan 14 '08 #3

QVeen72
Expert 100+
P: 1,445
Hi,

It certainly sounds like a memory-related problem.
Check by increasing the virtual memory of the system.
Do you get the same memory problem with the SP as well...?

Regards
Veena
Jan 14 '08 #4

Expert 5K+
P: 8,434
A few thoughts come to mind. Excuse my rambling, but perhaps it will spark something useful...
  • When you do the "complex query" that takes too long, is the long-running processing being done on your PC or the server? Or are they the same machine? Can you hand off the processing to the server? It may be faster, and it certainly has access to the data without transmitting it all across the network.
  • Could you reduce the size of the recordset by eliminating unnecessary fields from the query?
  • Could you perhaps do some sort of aggregate query to return a bunch of categories, then do a separate query to pull the full data for each category?
  • How far is it getting before the memory error? You told us roughly where it slows down, but not where it crashes.
  • What version of VB are you using, and how are you talking to the database? I am using ten-year-old VB6 to read millions of records without major memory problems, though I have had to change tactics at times to avoid them.
  • Perhaps you can simply process the records in ranges. For example, if you know a unique key, include a WHERE clause to return only the first "lakh" (I had to go and look that up). Put this inside a loop, and process each chunk of records until you reach the end, releasing the memory from each recordset before opening the next (see the sketch after this list).
  • Why is your user not annoyed when the long wait happens in your code rather than during a single database call? If it's because you keep them entertained with some sort of "processing - please wait" display, then maybe there's a way you could provide a similar feature but still leave the server to do the work.
  • How much RAM do you have? Could the answer possibly be as simple as a minor PC upgrade? I doubt it (more likely that would just delay the problem), but you never know.
  • And finally - I'd recommend you ask the same question in the Access forum, as they deal with this kind of thing more. I don't think the issue is anything specific to VB, but is about finding a more efficient (in time and/or memory) method of accessing the required data.
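
To flesh out that chunking idea, a rough sketch only (the table name, key column, and chunk size are all hypothetical, and it assumes an open ADODB.Connection named cn; the nested query is the usual Oracle top-N pattern, since ROWNUM is applied before ORDER BY):

Dim rs As ADODB.Recordset
Dim lastKey As Long
Dim done As Boolean

lastKey = 0
Do Until done
    Set rs = New ADODB.Recordset
    rs.Open "SELECT * FROM (SELECT key_col, product_code, amount " & _
            "FROM txn_table WHERE key_col > " & lastKey & _
            " ORDER BY key_col) WHERE ROWNUM <= 100000", _
            cn, adOpenForwardOnly, adLockReadOnly

    done = rs.EOF                      ' no rows left means we're finished
    Do While Not rs.EOF
        lastKey = rs!key_col
        ' ... apply the credit/debit logic to rs!amount here ...
        rs.MoveNext
    Loop
    rs.Close                           ' release this chunk's memory
    Set rs = Nothing
Loop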
Jan 15 '08 #5

P: 46
Thanks, Killer, for the thought you have put in. But unfortunately, nothing sparked here. I will go through your points one by one:
  • When you do the "complex query" that takes too long, is the long-running processing being done on your PC or the server? Or are they the same machine? Can you hand off the processing to the server? It may be faster, and it certainly has access to the data without transmitting it all across the network.
    I am using an Oracle 9i server which is installed on a separate and very powerful database server meant exclusively for this purpose. All the queries and database-related operations are carried out on that server only. It's not my PC where they get done.

  • Could you reduce the size of the recordset by eliminating unnecessary fields from the query?
    I am taking only the fields which are truly necessary.

  • Could you perhaps do some sort of aggregate query to return a bunch of categories, then do a separate query to pull the full data for each category?
    This will take even longer.
  • How far is it getting before the memory error? You told us roughly where it slows down, but not where it crashes.
    It crashes after around 2.9 lakh records.

  • What version of VB are you using, and how are you talking to the database? I am using ten-year-old VB6 to read millions of records without major memory problems, though I have had to change tactics at times to avoid them.
    I am using Microsoft Visual Basic 6.0. What are the tactics you switch to in order to avoid such problems? That's exactly what I am interested in.

  • Perhaps you can simply process the records in ranges. For example, if you know a unique key, include a WHERE clause to return only the first "lakh" (I had to go and look that up). Put this inside a loop, and process each chunk of records until you reach the end, releasing the memory from each recordset before opening the next.
    First of all, I am not doing any sort of processing. I am simply reading the amounts and adding/subtracting them depending on some business logic. At most, I can partition the recordset on the date of transaction, which is included in the unique key but sits in 5th place in a seven-field unique key.

    Here, I would like to tell you that I am not supposed/allowed to make any changes to the structure of the table or its indexes. The database is a very sensitive one and is live in nature. I feel that is where the difference lies between working with small PC-based databases, which you can change at will, and working with live, huge, industry-scale databases.

  • Why is your user not annoyed when the long wait happens in your code rather than during a single database call? If it's because you keep them entertained with some sort of "processing - please wait" display, then maybe there's a way you could provide a similar feature but still leave the server to do the work.
    Yes, I do keep the user entertained while balances are being calculated, by showing them the effect of every record being scanned. It's part of the requirements. Also, as I told you earlier, I am not doing any major work in my program. It simply fetches a record, does an addition or subtraction, updates the balance shown in the MSHFlexGrid and then moves on to the next record.

    I don't think that is too huge a processing burden.

  • How much RAM do you have? Could the answer possibly be as simple as a minor PC upgrade? I doubt it (more likely that would just delay the problem), but you never know.
    512 MB of RAM with a 3 GHz P4 processor. I think the configuration is more than enough.

  • And finally - I'd recommend you ask the same question in the Access forum, as they deal with this kind of thing more. I don't think the issue is anything specific to VB, but is about finding a more efficient (in time and/or memory) method of accessing the required data.
    Access is nowhere in the picture, dear. My front-end is VB and my back-end is Oracle 9i. My assessment is that the issue is very much specific to VB, because at the SQL prompt my query does get executed without any memory-related problems; it takes too long, but that's a different thing.


I think the solution lies in freeing up the memory used by each record of the recordset as soon as it has been scanned and is no longer required by the program. But I can't find a way to do it.
Jan 15 '08 #6

Expert 5K+
P: 8,434
...
I am using an Oracle 9i server which is installed on a separate and very powerful database server meant exclusively for this purpose. All the queries and database-related operations are carried out on that server only. It's not my PC where they get done.
Ok, just wanted to double-check that. This sort of database connectivity is not my area of expertise, so feel free to ignore me. Just brainstorming.

I am taking only the fields which are truly necessary.
Well, I suppose that is mixed news. Good that it's being done efficiently. But also bad in a way, as it might have been an "easy win". :)

Could you perhaps do some sort of aggregate query to return a bunch of categories, then do a separate query to pull the full data for each category?
This will take even longer.
Very probably. This idea was merely about getting it to work, not getting it to work faster.

It crashes after around 2.9 lakh records.
Ok.

I am using Microsoft Visual Basic 6.0. What are the tactics you switch to in order to avoid such problems? That's exactly what I am interested in.
Unfortunately, it may take some time to work out an answer to that one. I wouldn't say I really understood what I was doing. I was jumping around trying ideas proposed by people in the Access forum, and trying things out for myself, and eventually managed to get things working reasonably well. Since it was for a utility that I use myself, I wasn't too concerned about documenting the process or anything. Though obviously I should have done something of the sort. And as you mention later, it was on a little, local Access database. "Little" in this case being probably on the order of 60 million records, and around 7-8GB.

I'm probably just parroting my earlier post now, but one thing which comes to mind is the possibility of simply closing the recordset after, say, each 150K records. Then reopening it excluding those already processed. I realise this is an ugly workaround, but it might be worth keeping in mind if you don't turn up something better.

First of all, I am not doing any sort of processing. I am simply reading the amounts and adding/subtracting them depending on some business logic. At most, I can partition the recordset on the date of transaction, which is included in the unique key but sits in 5th place in a seven-field unique key.
Well, if that's not processing, I'd like to know what is. :) But I see your point, the processing on the PC is certainly minimal.

Here, I would like to tell you that I am not supposed/allowed to make any changes to the structure of the table or its indexes. The database is a very sensitive one and is live in nature. I feel that is where the difference lies between working with small PC-based databases, which you can change at will, and working with live, huge, industry-scale databases.
True. However, I wasn't proposing any changes to the database. (I admit, a lot of the performance tweaking for my application would have been in the database structure.)

Yes, I do keep the user entertained while balances are being calculated, by showing them the effect of every record being scanned. It's part of the requirements. Also, as I told you earlier, I am not doing any major work in my program. It simply fetches a record, does an addition or subtraction, updates the balance shown in the MSHFlexGrid and then moves on to the next record.

I don't think that is too huge a processing burden.
If it's in the requirements, then I guess that's that. I was just waffling about offloading work to the server again. Ignore this point. There might be ways to reclaim a tiny bit of lost processing time in how the user display is done. But even if true, weighed against the data retrieval it would certainly be negligible.

512 MB of RAM with a 3 GHz P4 processor. I think the configuration is more than enough.
Well, that depends on what you're doing. You might want to have a look at the amount of RAM being used before and during your process, and how quickly it's increasing. In this day and age, even 1GB of RAM is not huge. Half a GB might be considered a little light, depending on what you're doing with it. (In my experience, if you were using IBM WebSphere for development, you might be screaming for at least double what you've got. Not relevant, I know - merely for illustrative purposes).

Access is nowhere in the picture, dear. My front-end is VB and my back-end is Oracle 9i. My assessment is that the issue is very much specific to VB, because at the SQL prompt my query does get executed without any memory-related problems; it takes too long, but that's a different thing.
I realise that. The reason I singled out the Access forum is that the "senior" people there have a ton of database expertise. This relates not only to Access, but also SQL Server, Oracle, SQL processing in general, and database connectivity from Visual Basic. I still believe you'll find more help there than in the VB forum, though obviously I could be mistaken.

I think the solution lies in freeing up the memory used by each record of the recordset as soon as it has been scanned and is no longer required by the program. But I can't find a way to do it.
I agree.

One important question which I'd still like to get answered (largely for the benefit of those more knowledgeable than I, who may read the thread) is exactly how you connect to the database. By which I mean, are you using (for example) VB's built-in data control, or ADO, DAO, ADODC, or something else entirely? I gather (from what I've seen in the Access forum) that there can be big differences between the methods in terms of efficiency.
Jan 15 '08 #7

Expert 5K+
P: 8,434
I can't believe I forgot about this.

One of the tweaks I had to do for my application was to switch the way I access a big recordset (admittedly updating, not reading) to avoid running out of RAM. That tweak (done on the advice of the Access gurus) reduced the RAM my VB6 program used by around 1.2GB.

Personally, what I'd recommend is that I move this thread over to the Access forum, so that they can follow what has been discussed so far. That will save them having to ask a lot of the same questions.

What do you say? (Note, a pointer will be left in the VB forum, so people there will still see it and will be able to contribute.)
Jan 15 '08 #8

pureenhanoi
100+
P: 175
Hi all,
I am actually making a balance calculation program for my bank (of which I am an employee). The table from which I am fetching records contains at least 50-60 million records with, of course, properly indexed fields. The program sums up the amounts one by one, allocating each to credit or debit.
I don't think you can display 50-60 million records in one MSHFlexGrid at a time. The screen is limited, and people don't want to scroll too much with a scroll bar. So why don't you display only 100-200 records, and place two buttons: "Next 100" and "Previous 100"? When the user clicks one of these two buttons, you process only the next or previous 100 records.
That technique is called "provide on demand": you process only the records the user wants displayed, and don't care about the rest.
If there is another application connected to the database that needs processed records (which haven't been processed by your program yet), you can create a "trigger" in the database (I've never used Oracle, so I don't know whether it supports triggers - but MS SQL supports them well).
Hope God provides "Trigger" for Oracle, he he ^^!!
Jan 15 '08 #9

P: 11
Try this: instead of showing the result after every record, show the result after, say, every 200 or 500 records. I think this will reduce processing time and may also use less memory.
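
Something like this inside the processing loop, for instance (a minimal fragment; the counter, grid, and variable names are all hypothetical):

recCount = recCount + 1
If recCount Mod 500 = 0 Then
    ' Refresh the grid cell and yield to Windows only every 500 records.
    grdBalance.TextMatrix(rowIdx, 1) = Format(runningTotal, "#,##0.00")
    DoEvents
End If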
Jan 15 '08 #10

Expert 5K+
P: 8,434
I don't think you can display 50-60 million records in one MSHFlexGrid ...
That's something I completely overlooked. If you are loading 300,000 records into a grid control, that could easily explain the overflowing memory.

I really doubt you will want to process 100 at a time and wait on the user before continuing - you obviously want the process to continue through to the end. But regardless of how you handle the data processing, you may need to implement something along these lines for the display.

I would definitely recommend looking into this.
Jan 15 '08 #11


Expert 5K+
P: 8,434
Just wondering, has there been any progress on this problem?
Jan 17 '08 #14

pureenhanoi
100+
P: 175
That's something I completely overlooked. If you are loading 300,000 records into a grid control, that could easily explain the overflowing memory.

I really doubt you will want to process 100 at a time and wait on the user before continuing - you obviously want the process to continue through to the end. But regardless of how you handle the data processing, you may need to implement something along these lines for the display.

I would definitely recommend looking into this.
Of course, we need some variables to save information about the current view.

Let's take a look at our forums: one topic starts and gets many replies. The admin can control how many replies are displayed on one page, and which page is the current page of view.
So I think we need only two variables to process the data: recordsOnOnePage and currentPage.
First, select all the records into a Recordset. Then calculate the starting record of the current page: recordStart = recordsOnOnePage * currentPage
Then simply move the recordset's cursor to recordStart and use a loop to display the page (see the sketch below).
The big problem is that I don't know how many records can be stored in one recordset. 50-60 million is a huge number. If all the records can't be stored in one recordset, we can't select them all at once.
As I said, I've never used Oracle before, so I don't know Oracle's SQL syntax.
So if Oracle supports "SELECT TOP N ..." and has an autonumber (auto-increment) field, our problem can be solved more easily.
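
A minimal sketch of that paging idea (it assumes an already-open Recordset named rs whose cursor supports repositioning, and a zero-based currentPage; all names are hypothetical):

Dim recordsOnOnePage As Long, currentPage As Long
Dim recordStart As Long, i As Long

recordsOnOnePage = 100
recordStart = recordsOnOnePage * currentPage

rs.MoveFirst
If recordStart > 0 Then rs.Move recordStart   ' jump to the page's first record

i = 0
Do While Not rs.EOF And i < recordsOnOnePage
    ' ... copy this record's fields into row i of the grid ...
    rs.MoveNext
    i = i + 1
Loop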
Jan 17 '08 #15

daniel aristidou
100+
P: 491
I tend to find that the VB6 MSFlexGrid is rather difficult and slow to use...
Why not try using a ListBox?
Jan 17 '08 #16

daniel aristidou
100+
P: 491
I tend to find that the VB6 MSFlexGrid is rather difficult and slow to use...
Why not try using a ListBox?
Sorry, I meant to say MSHFlexGrid.
Jan 17 '08 #17

QVeen72
Expert 100+
P: 1,445
Why not try using a ListBox?
What if he wants to display 20 columns...?

Regards
Veena
Jan 17 '08 #18

daniel aristidou
100+
P: 491
What if he wants to display 20 columns...?
I see your point...

General question: why is VB6 still being used for large databases when newer versions offer so much more freedom and control?

New versions of VB have built-in code converters to update your code from older versions to newer ones.
So if it's a case of the conversion being a very long task, I wouldn't think it would take that long... and it will probably solve a lot of the problems that can be experienced in older coding formats.
I suppose it might be time-consuming to learn the new syntax, though.

Daniel(~_~)
Jan 17 '08 #19

Expert 5K+
P: 8,434
That's a large part of my problem. No time to learn the new setup and syntax. In VB6, at least I "know my way around".

Of course, I'm using it pretty much as a hobby, so it's a fairly minor issue.

Also, I think that most things can be done in VB6, just as well as (if not better than) in later versions. MS tend to add on lots of bells and whistles that don't make a real difference. I'll bet the later versions execute slower.
Jan 17 '08 #20

QVeen72
Expert 100+
P: 1,445
General Question....Why is vb6 still being used for large databases when ..newer versions offer so much more.....freedom and control?
As MS itself says, if you are starting a new project, start in VB.NET; but upgrading an older one may not be a very good option.

New versions of VB have built-in code converters to update your code from older versions to newer ones.
Yeah, I once tried to convert our live project with some 100 forms; it took 3 hours and threw up at least 1,000 errors. Printing text on a form has a different approach altogether (and there are many things like that)... all that code was not converted.
And if an older version is running quite OK, I would not want to take the pain of upgrading it - say, if you have to update it on a LAN of 100 client nodes.

It is better to start on the latest version for new projects.

Regards
Veena
Jan 18 '08 #21

daniel aristidou
100+
P: 491
I'll bet the later versions execute slower.
It obviously depends on what you want to do... but I would take that bet.
Also
MS tend to add on lots of bells and whistles that don't make a real difference.
I find that a lot of the new "bells and whistles" reduce the amount of work/code that needs to be written, so I would have thought this would reduce execution time, since less code is needed to do a task.

Daniel(~_~)
Jan 18 '08 #22

QVeen72
Expert 100+
P: 1,445
I find that a lot of the new "bells and whistles" reduce the amount of work/code that needs to be written, so I would have thought this would reduce execution time, since less code is needed to do a task.
Daniel(~_~)
The code is not always reduced.
I tried this:

Me.Cls
Me.Print "Code In Vb6"
Me.Print "Just 3 Lines of Code"
I tried this in VB.NET, and this is what I ended up with:

Dim MyCanvas As Graphics = Me.CreateGraphics()
Dim myFont As Font = New Font("Courier New", 12)
Dim LineHeight As Integer = myFont.Height
Dim LineNo As Integer
LineNo = 0
MyCanvas.DrawString("Welcome To VB.Net Graphics", myFont, Brushes.AliceBlue, 1, (LineNo * LineHeight))
LineNo += 1
MyCanvas.DrawString("This Is a REAL TEDIOUS PROCESS ", myFont, Brushes.AliceBlue, 1, (LineNo * LineHeight))
Regards
Veena
Jan 19 '08 #23

daniel aristidou
100+
P: 491
Yes, printing is a lot more code; however, you can do a lot more.
Jan 19 '08 #24

Expert 5K+
P: 8,434
I find that a lot of the new "bells and whistles" reduce the amount of work/code that needs to be written, so I would have thought this would reduce execution time, since less code is needed to do a task.
I have always found that in general, it works the other way. As code libraries and so on become more and more bloated with things to make life easier for the programmer, things tend to take longer to load and longer to execute.

In fact, finding that handy new function that saves you writing a complex piece of code everywhere can make things slightly slower - because you may execute exactly the same code as before, but you introduce an extra function call to do it. If you are performing the function enough millions of times, the difference can become significant.

Of course this general tendency won't apply in all situations. And it doesn't mean we should all be using VB1. But there are sometimes cases where an older and simpler tool may be better, because it's simpler.
Jan 21 '08 #25

P: 46
Just wondering, has there been any progress on this problem?
Hi Killer,

Sorry for the delay in reply. Actually I was on leave for seven days.

I have found the solution to my problem. The solution was to make my recordset ForwardOnly. That way, it doesn't bother to store the already-scanned records in memory, and that was exactly my problem: excessive use of RAM.

Now I am able to scan about 20-30 million records at once. Although my requirement was up to 50-60 million, I was somehow able to convince my client that this is the limit under the given circumstances.
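
For anyone who hits the same wall, this is roughly the setup that worked (a sketch only; it assumes an open ADODB.Connection named cn, and the query text is just a placeholder):

Dim rs As New ADODB.Recordset

rs.CursorLocation = adUseServer       ' leave the rows on the server
rs.CacheSize = 100                    ' fetch in small batches
rs.Open "SELECT product_code, amount FROM ...", cn, _
        adOpenForwardOnly, adLockReadOnly, adCmdText

Do While Not rs.EOF
    ' ... add/subtract rs!amount into the product's running total ...
    rs.MoveNext                       ' scanned rows are not retained
Loop
rs.Close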

Clarification
I would like to tell you and the other fellows that I never dared to put 50-60 million records into an MSHFlexGrid. All I wanted was to put a summarized position into the grid, consisting of no more than 15 rows. It was something like this:

Product         Balance            Number of entries
eRemit          41210152536.01     78459321
eATM            871475865.28       459863251
...             ...                ...
...             ...                ...
Maestro Card    4789652123.01      458763218


I think I had made this quite clear in my very first post, while asking the question. I don't know why this confusion arose.

Anyway, I am happy that my problem is solved now.
Thanks for all your support and suggestions.

Keep Programming
Jan 21 '08 #26

Expert 5K+
P: 8,434
I have found the solution to my problem. The solution was to make my recordset ForwardOnly. That way, it doesn't bother to store the already-scanned records in memory, and that was exactly my problem: excessive use of RAM.
This will certainly be worth keeping in mind. I for one have always wondered what forward-only recordsets were good for. Thanks for the info.

...I think I had made this quite clear in my very first post, while asking the question. I don't know why this confusion arose.
That's easy. You made the fatal mistake of assuming we would read it. :D (And understand it.)

Seriously though, when dealing with people via text like this, it's very easy for misunderstandings to creep in. You just can't beat face-to-face communication.

Anyway, I am happy that my problem is solved now.
Excellent! Glad to hear it.


P.S. Man, I wish that was my balance...
Jan 21 '08 #27

P: 46
Hi Killer,

Agreed that you can't beat face-to-face communication, but a careful reading of my question would easily have revealed what my intentions were. It's not as complex as it is being made out to be.

P.S. Man, I wish that was my balance...
But I certainly do not wish it were your balance, because it is a debit balance, meaning that my bank has lent that much money to people and is required to recover it. Still, if you want that to be your balance, I can't help it. My bank's job will get easier.

Never mind, jokes apart.
Jan 23 '08 #28

Expert 5K+
P: 8,434
Agreed that you can't beat face-to-face communication, but a careful reading of my question would easily have revealed what my intentions were. It's not as complex as it is being made out to be.
Oops!

But I certainly do not wish it were your balance, because it is a debit balance ...
Oops!

:)
Jan 24 '08 #29
