Bytes IT Community

Language file, Best practice



I would like to add international support for my site to allow some of
my users to translate the site if they really feel the urge to do it.

I don't use any CMS; the site was developed (and refined :)) by me.
But I see that various CMSs handle languages very differently.
Joomla!, for example, has a smallish file of define() constants that
holds all the strings. WordPress, on the other hand, has a function
__(...) that is altogether a lot more complicated but also more
flexible.

But I am not sure that the Joomla! approach is the best; defining what
could become a rather big language file seems a bit silly to me (well,
in my case it does). And there does not seem to be any naming
convention for all those defines, which could spell trouble.
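For reference, the two approaches differ roughly like this. This is a minimal sketch; the constant names, array, and translation strings are made up for illustration, not Joomla!'s or WordPress's actual identifiers:

```php
<?php
// Approach 1: a define() file, one constant per string (Joomla!-style).
// A file such as lang/en.php might contain:
define( '_WELCOME', 'Welcome' );
define( '_LOGOUT', 'Log out' );
echo _WELCOME; // constants are global, so names can collide

// Approach 2: a lookup function (WordPress-style __()).
// Strings live in an array keyed by the original English text,
// so untranslated keys fall back to themselves.
$GLOBALS['translations'] = array( 'Welcome' => 'Bienvenue' );

function __( $text ) {
    return isset( $GLOBALS['translations'][$text] )
        ? $GLOBALS['translations'][$text]
        : $text;
}

echo __( 'Welcome' );  // "Bienvenue" (translated)
echo __( 'Log out' );  // "Log out" (falls back to the key itself)
?>
```

The fallback behaviour is the main reason the function approach scales better: a missing translation degrades to English instead of raising an undefined-constant notice.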

So, what is the best way, so far, of handling languages?



Apr 23 '07 #1
2 Replies

On Apr 23, 2:05 am, FFMG <FFMG.2ph...@no-mx.httppoint.com> wrote:

I would like to add international support for my site to allow some of
my users to translate the site if they really feel the urge to do it.

So, what is the best way, so far, of handling languages?
I think phpBB and similar use language files for portability and ease
of updating. I would save smaller strings (like messages, link text,
and page titles) in a database and retrieve the needed language on
each page. SQL caching should be effective here, because each language
would run the same query every time.
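The per-page retrieval described above might look like the following sketch. The table name `lang_strings` and its columns are assumptions for illustration; the indexing helper is pure PHP so it works with rows from any source:

```php
<?php
// Hypothetical sketch: fetch every string for one language in a single
// query (so the same cacheable query runs on every page), then index
// the rows by key for O(1) lookups in templates.
function index_strings( array $rows ) {
    $strings = array();
    foreach ( $rows as $row )
        $strings[ $row['key'] ] = $row['text'];
    return $strings;
}

// With mysqli the rows would come from something like:
// $res  = $db->query( "SELECT `key`, `text` FROM `lang_strings`
//                      WHERE `lang` = 'fr'" );
// $lang = index_strings( $res->fetch_all( MYSQLI_ASSOC ) );

// Pure-PHP demonstration of the indexing step:
$rows = array(
    array( 'key' => 'page_title', 'text' => 'Accueil' ),
    array( 'key' => 'logout',     'text' => 'Déconnexion' ),
);
$lang = index_strings( $rows );
echo $lang['page_title']; // Accueil
?>
```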

If you think it would be faster, you can keep the strings in both the
database and in files (a PHP file with array assignments). Each time a
translation needs updating, the edit form is filled with data from the
database; when the form is submitted, the database is updated and the
language file is regenerated from the new data. Pages then include the
appropriate language file instead of querying the database each time
(useful if the table gets big and the queries start taking too long).
I've benchmarked with 100 rows of data and 100 files, and found that
retrieving an indexed row from the database was about twice as fast as
reading the file named with that index. I've been told the database
would come out even further ahead with more files and rows.
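The regeneration step described above can be sketched as follows. The function name and file layout are my own invention, not from any particular CMS; var_export() is used so the dumped file is plain includable PHP:

```php
<?php
// Hypothetical sketch: after a translation is saved to the database,
// dump the strings back out as an includable PHP file, so ordinary
// page loads never touch the database.
function write_language_file( $path, array $strings ) {
    $code = "<?php\n\$lang = " . var_export( $strings, true ) . ";\n";
    return file_put_contents( $path, $code ) !== false;
}

// Usage: regenerate after the edit form is submitted...
$path = sys_get_temp_dir() . '/fr.php';
write_language_file( $path, array(
    'page_title' => 'Accueil',
    'logout'     => 'Déconnexion',
) );

// ...and include the generated file everywhere else.
include $path;           // defines $lang
echo $lang['logout'];    // Déconnexion
?>
```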

-Mike PII

Apr 28 '07 #2

I did some more tests (I am working on a multi-lingual site too, so
this is useful to me as well), and it looks like reading and
unserializing a serialized language array from a file is much quicker
than fetching and unserializing it from a database (on my server
environment). The first two times below are for using PHP to create a
PHP file and then include()ing it:

[create file] => 0.0070760250091553
[direct read] => 0.0096750259399414
[write serial] => 0.0053510665893555
[read serial] => 0.00060105323791504
[insert db serialized] => 0.054249048233032
[get db serialized] => 0.037890911102295

Here's the (PHP5) test that you can try on your server:
$db = new mysqli( 'dbhost', 'dbuser', 'dbpass' );
$db->query( 'USE `dbtable`' );

$i = 0;
$lang = array();

define( 'DATAFILE', 'benchmarking.php' );

//create file
$data = '';
$times = array( 'create file' => microtime( true ) );
$data = "<?php\n\$lang=array('asdfasdfasdf'";
for( $i = 20000; $i < 20500; ++$i )
$data .= ",'asdfewqrhbeqstn{$i}asdfasdfasd{$i}fasdf'";
$data .= ');?>';
file_put_contents( DATAFILE, $data );
$times['create file'] = microtime( true ) - $times['create file'];

unset( $data );

//direct read
$times['direct read'] = microtime( true );
require( DATAFILE );
$times['direct read'] = microtime( true ) - $times['direct read'];

//write serialized
$times['write serial'] = microtime( true );
file_put_contents( DATAFILE, serialize( $lang ) );
$times['write serial'] = microtime( true ) - $times['write serial'];

$lang = array();

//read serialized
$times['read serial'] = microtime( true );
$lang = unserialize( file_get_contents( DATAFILE ) );
$times['read serial'] = microtime( true ) - $times['read serial'];

unlink( DATAFILE );

$data = '';

//db serialized
$times['insert db serialized'] = microtime( true );
$data = serialize( $lang );
$sql_query = 'INSERT INTO `bm2` ( `data` ) VALUES ('
."'$data' )";
$db->query( $sql_query )
or trigger_error( $db->error, E_USER_ERROR );
$times['insert db serialized'] = microtime( true ) - $times['insert db serialized'];

$data = '';
$lang = array();

//get db serialized
$times['get db serialized'] = microtime( true );
if( !( $sql = $db->query( "SELECT `data` FROM `bm2` WHERE `id` = $db->insert_id" ) ) )
trigger_error( $db->error, E_USER_ERROR );
list( $data ) = $sql->fetch_row();
$lang = unserialize( $data );
$times['get db serialized'] = microtime( true ) - $times['get db serialized'];

print_r( $times );

The table looks like:
`bm2` (utf-8 collation)
  `id` INT NOT NULL AUTO_INCREMENT PRIMARY KEY
  `data` TEXT

Apr 28 '07 #3

This discussion thread is closed
