
Best text search for a multi-gigabyte database?

I have a very big database, about 10-100 GB of text.
I want to build a search function, but it takes too long when I use LIKE in MySQL.
Does anyone know the best solution?
Can I use Google Desktop Search or something else?
Please help!

Feb 5 '07 #1
5 Replies


ngocviet wrote:
> I have a very big database, about 10-100 GB of text.
> I want to build a search function, but it takes too long when I use LIKE in MySQL.
> Does anyone know the best solution?
> Can I use Google Desktop Search or something else?
> Please help!
If you use MySQL, consider using a FULLTEXT index with MATCH ... AGAINST().
The bottom line remains: with 100 GB of data some search time is unavoidable, but FULLTEXT can speed it up considerably.
Check the MySQL documentation for details.
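
For reference, a minimal sketch of that approach in PHP; the connection details, the `articles` table and its columns are hypothetical, and the old mysql_* extension is used because this thread dates from 2007:

<?php
// Sketch only: connection details, table and column names are made up.
mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb');

// One-time setup: add a FULLTEXT index (MyISAM tables only in MySQL 5.0).
mysql_query("ALTER TABLE articles ADD FULLTEXT ft_article (title, body)");

// Searching: MATCH ... AGAINST can use that index, LIKE '%word%' cannot.
$term = mysql_real_escape_string($_GET['q']);
$result = mysql_query(
    "SELECT id, title, MATCH (title, body) AGAINST ('$term') AS relevance
     FROM articles
     WHERE MATCH (title, body) AGAINST ('$term')
     ORDER BY relevance DESC
     LIMIT 20"
);
while ($row = mysql_fetch_assoc($result)) {
    echo $row['title'], "\n";
}
?>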

Regards,
Erwin Moller
Feb 5 '07 #2

> If you use MySQL, consider using a FULLTEXT index with MATCH ... AGAINST().
> The bottom line remains: with 100 GB of data some search time is unavoidable, but FULLTEXT can speed it up considerably.
> Check the MySQL documentation for details.
I've tried FULLTEXT, but it takes too long: over 10 seconds on a 1.2 GB table.
If needed, I can convert the database to another structure.

Feb 5 '07 #3

ngocviet wrote:
> > If you use MySQL, consider using a FULLTEXT index with MATCH ... AGAINST().
> > The bottom line remains: with 100 GB of data some search time is unavoidable, but FULLTEXT can speed it up considerably.
> > Check the MySQL documentation for details.
>
> I've tried FULLTEXT, but it takes too long: over 10 seconds on a 1.2 GB table.
> If needed, I can convert the database to another structure.
Hi,

If you use MySQL MyISAM with FULLTEXT and query with MATCH ... AGAINST rather than LIKE, you are already using one of the fastest approaches a developer can set up (as far as I know).
I expect the only way to increase search speed for a 100 GB database full of text is to throw more/better hardware at it (more memory, faster disk I/O, a faster CPU, etc.).
In general, when searching through a huge data structure, disk I/O is the bottleneck, so faster disk I/O will help the most.
Maybe dive into different RAID setups.
E.g. if you have two hard disks delivering data at the same time (using some RAID level), your query may run roughly twice as fast.
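
For completeness, a minimal sketch of the MyISAM + FULLTEXT setup described above, again with made-up table and column names (documents, title, body) and assuming an open mysql_connect() connection; the second query just shows the standard boolean-mode operators:

<?php
// Sketch only: table and column names are hypothetical.

// A MyISAM table with a FULLTEXT index, as recommended above.
mysql_query("
    CREATE TABLE documents (
        id    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        title VARCHAR(255) NOT NULL,
        body  TEXT NOT NULL,
        FULLTEXT KEY ft_doc (title, body)
    ) ENGINE=MyISAM
");

// Boolean mode adds +required / -excluded words and quoted phrases.
$result = mysql_query("
    SELECT id, title
    FROM documents
    WHERE MATCH (title, body)
          AGAINST ('+mysql +fulltext -oracle' IN BOOLEAN MODE)
    LIMIT 20
");
?>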

Regards,
Erwin Moller
Feb 5 '07 #4

"ngocviet" <ng******@gmail.comwrote in message
news:11*********************@h3g2000cwc.googlegrou ps.com...
> > If you use MySQL, consider using a FULLTEXT index with MATCH ... AGAINST().
> > The bottom line remains: with 100 GB of data some search time is unavoidable, but FULLTEXT can speed it up considerably.
> > Check the MySQL documentation for details.
>
> I've tried FULLTEXT, but it takes too long: over 10 seconds on a 1.2 GB table.
> If needed, I can convert the database to another structure.

A search like that over such an amount of unindexed textual data simply has to take some time. Live with it, or invest in hardware. As the IT proverb says: "If it doesn't work, just throw money at it".

--
"Ohjelmoija on organismi joka muuttaa kofeiinia koodiksi" - lpk
http://outolempi.net/ahdistus/ - Satunnaisesti päivittyvä nettisarjis
sp**@outolempi.net | rot13(xv***@bhgbyrzcv.arg)
Feb 5 '07 #5

In addition to MySQL FULLTEXT, you could also use Lucene
(Zend_Search_Lucene in PHP or perhaps Solr running inside Tomcat).
Performance is quite good and Lucene is very flexible.
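
For anyone curious, a minimal Zend_Search_Lucene sketch; the index path, field names and sample values below are invented for illustration:

<?php
require_once 'Zend/Search/Lucene.php';

// Build the index once (in practice $id/$title/$body would come from the database).
$index = Zend_Search_Lucene::create('/path/to/index');

$id    = 1;
$title = 'Example title';
$body  = 'Example body text to index';

$doc = new Zend_Search_Lucene_Document();
$doc->addField(Zend_Search_Lucene_Field::Keyword('docId', $id));
$doc->addField(Zend_Search_Lucene_Field::Text('title', $title));
// UnStored: indexed and searchable, but the text itself is not kept in the index.
$doc->addField(Zend_Search_Lucene_Field::UnStored('contents', $body));
$index->addDocument($doc);
$index->commit();

// Query the index later.
$index = Zend_Search_Lucene::open('/path/to/index');
$hits  = $index->find('example');
foreach ($hits as $hit) {
    echo $hit->score, ' ', $hit->title, "\n";
}
?>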

On Feb 5, 1:27 am, "ngocviet" <ngocv...@gmail.com> wrote:
> I have a very big database, about 10-100 GB of text.
> I want to build a search function, but it takes too long when I use LIKE in MySQL.
> Does anyone know the best solution?
> Can I use Google Desktop Search or something else?
> Please help!

Feb 6 '07 #6

This discussion thread is closed; replies have been disabled.