I can never find any answers when it comes to XML and databases, so I did some testing on my own. I'll save other people the trouble and post my results.
Summary
MySQL is much faster than using XML files (duh); even with 100 separate files you really notice the slowdown. Putting all the products in one XML file instead of one file per product speeds things up by about 3x, but it is still too slow to replace a database.
Disclaimer
These tests are not perfect, but the results differ so much that it's easy to come to a conclusion.
Why run the tests?
Because no one seems to want to discuss how XML can actually be used to replace a database like MySQL, yet people advocate it. XML has its uses, and I am finding new ones all the time for websites, but using it to replace a MySQL database for something as small as 100 records is not a good idea. Try using it for site navigation; it works well for that (a minimal sketch is below). I'm also looking into "latest news", "top ten", stuff like that.
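Here is a minimal sketch of what XML-driven navigation could look like. The file name nav.xml and its structure are assumptions for illustration, not something from the tests:

<?php
// A minimal sketch of XML-driven site navigation, assuming a
// hypothetical nav.xml like:
//   <nav>
//     <item href="/index.php">Home</item>
//     <item href="/products.php">Products</item>
//   </nav>
$nav = simplexml_load_file('nav.xml');

foreach ($nav->item as $item) {
    // Cast SimpleXML values to string before escaping them.
    printf(
        '<a href="%s">%s</a>' . "\n",
        htmlspecialchars((string) $item['href']),
        htmlspecialchars((string) $item)
    );
}
?>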
Test 1. (Test to see if having one XML file per product is viable)
700 separate XML files in one directory. Each file is opened and searched using SimpleXML with an XPath expression; a sketch of that kind of script follows.
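This is roughly the approach, not the exact script from the test. The directory layout, file structure, and XPath query are assumptions:

<?php
// A minimal sketch of the Test 1 approach: one product per file,
// assuming a hypothetical layout like products/123.xml containing
// <product><name>...</name><price>...</price></product>.
$matches = array();

foreach (glob('products/*.xml') as $file) {
    $xml = simplexml_load_file($file);
    if ($xml === false) {
        continue; // skip unparseable files
    }
    // The XPath expression is an assumption; the post does not show
    // the exact query used in the benchmark.
    $result = $xml->xpath('/product[price < 20]');
    if ($result) {
        $matches = array_merge($matches, $result);
    }
}

echo count($matches) . " products matched\n";
?>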
The PHP script executes in 0-1 seconds, which doesn't really tell us much, so I ran the Apache ab benchmark. Here are the results:
command line: ab -n x -c y http://localhost/xmlTest.php
where x and y are the values of n and c shown in the tables below

n     c    longest request (ms)    time (seconds)
5     1    718                     NA
5     2    1375                    NA
5     3    2031                    NA
10    1    703                     6.7
1000  1    NA                      NA
Test 2. (Test to see if having 700 products in one XML file is viable)

n     c    longest request (ms)    time (seconds)
5     1    218                     0.921875
5     2    406                     1.15625
5     3    687                     1.140625
10    1    359                     2.21875
1000  1    406                     194.45315
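For comparison, here is a sketch of the single-file approach: one document open and one XPath query per request instead of 700 file opens. The file name and element structure are assumptions:

<?php
// A minimal sketch of the Test 2 approach: all 700 products in one
// hypothetical products.xml, e.g.
// <products><product><name>...</name><price>...</price></product>...</products>
$xml = simplexml_load_file('products.xml');

// One XPath query over the whole document.
$matches = $xml->xpath('/products/product[price < 20]');

echo count($matches) . " products matched\n";
?>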
Test 3. (Two MySQL tables with referential integrity enforced - 735 records)

n     c    longest request (ms)    time (seconds)
5     1    15                      0.625
5     2    31                      0.625
5     3    31                      0.46875
10    1    15                      0.109375
1000  1    31                      10.9
1000  5    78                      11
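For completeness, here is a sketch of what the MySQL version could look like. The schema, table names, and query are assumptions (the post does not show them); the two tables are joined on the foreign key that the referential integrity constraint enforces:

<?php
// A minimal sketch of the Test 3 approach, using PDO. Connection
// details, table names, and columns are all assumptions.
$db = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Let the database do the filtering and the join.
$stmt = $db->prepare(
    'SELECT p.name, p.price, c.name AS category
       FROM products p
       JOIN categories c ON c.id = p.category_id
      WHERE p.price < ?'
);
$stmt->execute(array(20));

foreach ($stmt as $row) {
    echo $row['name'] . ' (' . $row['category'] . '): ' . $row['price'] . "\n";
}
?>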