
getting absolute directory path?

Hello all -

I have a two part question.

First of all, I have a website under /home/user/www/. The index.php
and all the other website pages are under /home/user/www/. For
functions that are used in multiple files, I have php files under
/home/user/www/functions/. These files contain only function definitions.

So, in index.php and other files, I have
<?php
include("./functions/function.create_html_page.inc");
include("./functions/function.create_pdf.inc");
.....
?>

And then function files, such as function.create_html_page.inc
<?php
function create_html_page( $arg1, $arg2 ) {
...
}
?>

So, my php pages have relative references to the function files, i.e.
"./functions/".

This works okay, but when I want to include a function in a function
file, I have to change the relative path. So, instead of
include("./functions/function.create_pdf.inc");
I should have
include("./function.create_pdf.inc");

Ideally I would like to figure out the absolute path name, so that I
don't have to change the relative path names depending on whether the
file is in the root directory or the functions directory, or any other
directory for that matter.

In other words, I could do
<?php
$path = "/home/user/www/";
include( $path . "functions/function.create_html_page.inc");
include( $path . "functions/function.create_pdf.inc");
....
?>

And then the include statements are identical, no matter what
directory they're in.

But, I would like the solution to be more portable. In case I change
hosts, or install the website on another server, I would like the path
name not to be hard-coded, so I wouldn't have to change anything.
Something like

<?
$path = get_base_directory();
include( $path . "functions/function.create_html_page.inc");
include( $path . "functions/function.create_pdf.inc");
....
?>

My website is designed so that pages in subdirectories are never
served; the user is always navigating in the root directory. So, even
if a function in the functions subdirectory is called, the original
executing script was in the root.

I looked at functions like basename, but they expect the directory as
an argument.

I looked at $_SERVER and $_ENV, and they have 'OLDPWD'
[OLDPWD] = /home/user/www
[PWD] = /home/user/www/functions

OLDPWD looks like it would work, but I couldn't find it documented
anywhere, so I don't know if it would be usable as a portable
solution. If it's not documented, I probably can't rely on it being on
most systems, right?

Anywho, my thought now is to go through the backtrace array to figure
out the originating script, and thus the base directory. Is there an
easier way to do this?
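
For illustration, a minimal sketch of that backtrace idea, assuming the
outermost debug_backtrace() frame always belongs to the originally
requested script (the helper name get_base_directory() is just the
made-up name from above):

<?php
// Hypothetical helper: derive the base directory from the script that
// originally handled the request, by walking to the outermost
// debug_backtrace() frame.
function get_base_directory() {
    $trace  = debug_backtrace();
    $origin = end($trace);   // outermost caller
    $file   = $origin ? $origin['file'] : $_SERVER['SCRIPT_FILENAME'];
    return dirname($file) . '/';
}

// Then the same line works from index.php or from any function file:
// include( get_base_directory() . "functions/function.create_pdf.inc");
?>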

Jun 2 '08 #1
la*****@gmail.com wrote:
> [snip: original question quoted in full]
Either use $_SERVER['DOCUMENT_ROOT'], or use the include_path setting,
and don't provide the directory at all on an include.
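
For instance, with the layout from the original post (a sketch;
$_SERVER['DOCUMENT_ROOT'] is filled in by the web server, so nothing is
hard-coded):

<?php
// DOCUMENT_ROOT is the web root (/home/user/www here), so the same
// include lines work no matter which directory the current file is in:
include( $_SERVER['DOCUMENT_ROOT'] . "/functions/function.create_html_page.inc");
include( $_SERVER['DOCUMENT_ROOT'] . "/functions/function.create_pdf.inc");
?>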

--
Rik Wasmus
Jun 2 '08 #2
la*****@gmail.com wrote:
This works okay, but when I want to include a function in a function
file, I have to change the relative path. So, instead of
include("./functions/function.create_pdf.inc");
I should have
include("./function.create_pdf.inc");
In a function file:
include(dirname(__FILE__) . '/function.create_pdf.inc');

I would also recommend using a .inc.php extension so your PHP code
cannot be displayed in a browser, only executed.
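
For example, in a function file that needs a sibling file (a sketch
using the file names from this thread; the create_pdf() call is only
implied by the file name):

<?php
// /home/user/www/functions/function.create_html_page.inc
// dirname(__FILE__) is always this file's own directory
// (/home/user/www/functions), regardless of which script included it.
include(dirname(__FILE__) . '/function.create_pdf.inc');

function create_html_page( $arg1, $arg2 ) {
    // ... can now call create_pdf() here ...
}
?>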

Regards,
--
Guillaume
Jun 2 '08 #3
On 24 Apr, 16:32, lawp...@gmail.com wrote:
> [snip: description of the site layout and the relative-include problem]
Ideally I would like to figure out the absolute path name, so that I
don't have to change the relative path names depending on whether the
file is in the root directory or the functions directory, or any other
directory for that matter.
No.

Put all your include files in a directory referenced by the ini
include_path (or a specific sub-directory thereof). Then it's a no-brainer
to always reference the right file regardless of where it is
included from.
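
For example, a sketch of that setup -- the bootstrap file is an
assumption here, and the path could just as well be set once in php.ini
or .htaccess instead of at runtime:

<?php
// e.g. in a bootstrap file that lives in the web root and that every
// page already includes; dirname(__FILE__) keeps it portable:
set_include_path(get_include_path() . PATH_SEPARATOR . dirname(__FILE__) . '/functions');

// From then on, every file -- page or function file -- uses the same line:
include('function.create_pdf.inc');
include('function.create_html_page.inc');
?>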
My website is designed so that pages in subdirectories are never
served; the user is always navigating in the root directory. So, even
if a function in the functions subdirectory is called, the original
executing script was in the root.
Setting yourself such a constraint is a very bad idea - it's
ridiculously restrictive and encourages bad habits (dumping too many
files into one folder).

C.
Jun 2 '08 #4
On Apr 25, 8:18 am, "C. (http://symcbean.blogspot.com/)"
<colin.mckin...@gmail.com> wrote:
Put all your include files in a directory referenced by the ini
include_path (or a specific sub-directory thereof). Then it's a no-brainer
to always reference the right file regardless of where it is
included from.
Good idea.
My website is designed so that pages in subdirectories are never
served; the user is always navigating in the root directory. So, even
if a function in the functions subdirectory is called, the original
executing script was in the root.

Setting yourself such a constraint is a very bad idea - it's
ridiculously restrictive and encourages bad habits (dumping too many
files into one folder).
I rather like this idea. I know where everything is right away, and I
don't have to traverse directories looking for a file. It's not a
'restriction' any more than keeping your papers on your desk well-
organized is a 'restriction' -- technically it is, but you're doing it
because it gains you some benefit.

I've done it the other way before -- crafting a well-thought out tree
and placing files in their 'proper' directory -- but what if a page
'belongs' in two directories? client_sales_review.php might belong
both under /clients and /sales. Where do you put it then? Wherever you
put it, you're going to drive yourself nuts trying to find it in the
other directory, in its rightful 'place'. Okay, should I make a
symlink? What about making a new directory, or reorganizing the
clients and sales folder? Hey, instead of jimmying with the directory
tree, how about getting back to work on the pages?

I now use a naming convention that gets me all the usability of the
directory tree, without the hassles of getting the right tree
structure.

sales_view.php
sales_list.php
sales_new.php
sales_edit.php
client_sales_review.php
client_list.php
client_new.php
client_view.php
client_edit.php

If I simply type 'ls *sales*', I get all the sales person files,
including client_sales_review.php. If I type 'ls *client*', I get all
the client files, along with client_sales_review.php. Now here's the kicker:
if I want to see all editing pages, I simply type "ls *edit*" . I
would get both sales_edit.php and client_edit.php. If I had each
*_edit.php page in a different subdirectory (such as /clients and
/sales), I would have to descend into each subdirectory to do anything
with them. A major PITA, IMHO. In essence, the substrings in the file
name act as virtual subdirectories. I have to be as disciplined with
filenames as I would a subdirectory hierarchy, but I feel that I get
more benefit maintaining filenames in a single folder than maintaining
a proper directory tree.

What should be the upper limit for number of files in a folder,
anyway? The max number that can be displayed on a screen? Wouldn't you
run into that problem in any directory, regardless of whether there were
subdirectories? I think that a directory tree should be organized by
the abstract concept that relates the files residing in the same
directory. Any other way, and it's just an arbitrary number that
drives what location a file resides in. Suppose my screen is bigger
than another developer's -- I would put more files in a directory than
he/she might. Is that the only "bad habit" that this practice
encourages?
Jun 2 '08 #5
On Mon, 28 Apr 2008 09:15:05 -0700 (PDT), la*****@gmail.com wrote:
[snip]

What should be the upper limit for number of files in a folder,
anyway? The max number that can be displayed on a screen? Wouldn't you
run into that problem in any directory, regardless of if there were
subdirectories? I think that a directory tree should be organized by
the abstract concept that relates the files residing in the same
directory. Any other way, and it's just an arbitrary number that
drives what location a file resides in. Suppose my screen is bigger
than another developer's -- I would put more files in a directory than
he/she might. Is that the only "bad habit" that this practice
encourages?
Generally, you'll run out of patience trying to maintain large numbers
of files before you'll run out of room in a single directory. Some
utilities may have trouble dealing with large numbers of files at once,
so operations on every file in a directory may fail. Even so, you're
talking about thousands of files as a maximum for even the crummiest
utility, not dozens.

--
97. My dungeon cells will not be furnished with objects that contain reflective
surfaces or anything that can be unravelled.
--Peter Anspach's list of things to do as an Evil Overlord
Jun 2 '08 #6
On Apr 28, 1:25 pm, "Peter H. Coffin" <hell...@ninehells.com> wrote:
Generally, you'll run out of patience trying to maintain large numbers
of files before you'll run out of room in a single directory.
What would be an example scenario that would try my patience? If it's
editing anything more than 3 files, I'm using command line tools such
as grep, find, bash scripts, or command line perl, rather than doing
it by hand. What would I be able to do to 10 files that would give me
a problem at 10,000? I'll wait 10 seconds instead of less than 1 for
my command to finish when I have 10,000 files.

I run out of patience trying to delve through a crazy directory tree
that's more than three directories wide or deep. There's no easy way
to tell a command line script to ignore certain subdirectories and
apply a function on others. At least, none that I know of.

If it ever gets to the point where some site has more than 10,000
individual php files, well, I'd like to see someone develop that.
Chances are, there is tremendous redundancy in that site, and the
number of files could be greatly reduced. However, if you really do
need 10,000 php pages for this application, LAMP or WAMP is probably
not the proper tool for the job.
Some utilities may have trouble dealing with large numbers of files at once,
so operations of every file in a directory may fail.
On a unix system? I doubt it.

No utility works on a large number of files "at once". It goes through
each file individually, one at a time.
Jun 2 '08 #7
On Mon, 28 Apr 2008 18:42:52 -0700 (PDT), la*****@gmail.com wrote:
On Apr 28, 1:25 pm, "Peter H. Coffin" <hell...@ninehells.com> wrote:
>Generally, you'll run out of patience trying to maintain large numbers
of files before you'll run out of room in a single directory.

What would be an example scenario that would try my patience? If it's
editing anything more than 3 files, I'm using command line tools such
as grep, find, bash scripts, or command line perl, rather than doing
it by hand. What would I be able to do to 10 files that would give me
a problem at 10,000? I'll wait 10 seconds instead of less than 1 for
my command to finish when I have 10,000 files.
Shell filename globbing frequently expands into a list of filenames that
are stored in the command buffer, which generally has SOME kind of a
cap. Mostly I've seen 32k, but that'll vary. If your list of filenames
expands out past the cap, you'll get an "Argument list too long" error
and nothing will happen. find(1) works around that, but it allows
operations on files individually, not in aggregate.
[snip]
>Some utilities may have trouble dealing with large numbers of files at once,
so operations of every file in a directory may fail.

On a unix system? I doubt it.
See above about "Argument list too long".
No utility works on a large number of files "at once". It goes through
each file individually, one at a time.
What's the term for a statement that's technically true but irrelevant?

--
We're the technical experts. We were hired so that management could
ignore our recommendations and tell us how to do our jobs.
-- Mike Andrews
Jun 2 '08 #8
On Apr 29, 9:32 am, "Peter H. Coffin" <hell...@ninehells.com> wrote:
Shell filename globbing frequently expands into a list of filenames that
are stored in the command buffer, which generally has SOME kind of a
cap. Mostly I've seen 32k, but that'll vary. If your list of filenames
expands out past the cap, you'll get an "Argument list too long" error
and nothing will happen. find(1) works around that, but it allows
operations on files individually, not as aggregate.
OK, this is starting to make sense as a difficulty in management.

What, then, would be an appropriate directory structure? Start putting
files that begin with 'a' under the a/ directory? How do you handle
linking in the website then?

If too many files for the command buffer is what's preventing me from
using a wildcard, then I could get the same results by doing something
like 'command a*; command b*;' etc.

Or, do I create directories and organize them thematically, according
to the functions of the website, such as 'new_customer_signup'? What
happens when that directory gets too many files in it?

It seems to me that if the website has grown so much that there
are too many files in the root directory, so much so that you can't
properly run commands on it, then a website is not the right solution
for the problem. There's a flaw in the design somewhere, and the
number of files in the root directory is a symptom of it.

I'm willing to see the other side of this, but so far I can't think of
an instance where a file management task I'm doing on the command line
is going to be more cumbersome within a single directory than across
and into multiple subdirectories. Is wildcarding the only problem?
Jun 2 '08 #9
On Apr 29, 10:14 am, Guillaume <ggra...@NOSPAM.gmail.com.INVALID>
wrote:
My website would work through it... If I have to parse the files, or
back them all up, I could be in trouble.

It usually is a good idea to create subdirectories based on the first
letters, the date or any other relevant data.
I'm far from an expert, but I've seen this in systems like
mailservers, where users' mailboxes are stored under something like
/u/s/username, seemingly expanding out each letter when more space
becomes necessary.

But in the case of a website, how would you properly handle links
within pages?
Jun 2 '08 #10
On Tue, 29 Apr 2008 19:02:19 -0700 (PDT), la*****@gmail.com wrote:
On Apr 29, 9:32 am, "Peter H. Coffin" <hell...@ninehells.com> wrote:
>Shell filename globbing frequently expands into a list of filenames that
are stored in the command buffer, which generally has SOME kind of a
cap. Mostly I've seen 32k, but that'll vary. If your list of filenames
expands out past the cap, you'll get an "Argument list too long" error
and nothing will happen. find(1) works around that, but it allows
operations on files individually, not as aggregate.

OK, this is starting to make sense as a difficulty in management.

What, then, would be an appropriate directory structure? Start putting
files that begin with 'a' under the a/ directory? How do you handle
linking in the website then?
That's a common way to arrange Very Large Collections of files. And it
extends nicely horizontally (aa/ ab/ ac/ etc) as well as vertically
(a/a/ a/b/ a/c/). As far as site linking goes, you're generally dealing
with automated (by php or whatever) management of files by the point you
get to more than a few hundred anyway, so you can just automate the
references as well...

$catalog_image_thumb_path = substr($catalog_item_id, 0, 1) .
                            "/" .
                            $catalog_item_id;
print "<td><a href='http://example.com/show_item?id=$catalog_item_id'><img src='$catalog_image_thumb_path'></a></td>";
If too many files for the command buffer is what's preventing me from
using a wildcard , then I could do the same results by doing something
like '$command a*; command b*;' etc.
Sometimes. It depends on what you're trying to do. If you're trying to
do an operation to each file, you're okay. If you're trying to (for
example) get a count of the total number of lines that refer to a
particular included module in all the .c files in a massive directory
but not in the .php files, that may be a bit more of a challenge.
Or, do I create directories and organize them thematically, according
to the functions of the website, such as 'new_customer_signup'? What
happens when that directory gets too many files in it?

It seems to me, that if the website has grown so much that there
are too many files in the root directory, so much so that you can't
properly run commands on it, that a website is not the right solution
for the problem. There's a flaw in the design somewhere, and the
number of files in the root directory is a symptom of it.
Often it's a problem. Sometimes it's a matter of growth outstripping the
time available to rework a functioning site.
I'm willing to see the other side of this, but so far I can't think of
an instance where a file management task I'm doing on the command line
is going to be more cumbersome within a single directory than across
and into multiple subdirectories. Is wildcarding the only problem?
*grin* It's probably not, but it's the one I've run into more than once,
and it's the one that was primitive enough to be a recurring problem
rather than a "solve it once by applying more technology" situation.

--
It's not hard, it's just asking for a visit by the fuckup fairy.
-- Peter da Silva
Jun 2 '08 #11
On 30 Apr, 14:34, "Peter H. Coffin" <hell...@ninehells.com> wrote:
> [snip: quoted exchange about directory structures and shell limits]
I've often seen shared web servers where the individual sites are
placed in structures based on their first 2 letters:
a/
    a/
        aardvark_site
    b/
        abstract_site
b/
    a/
        barker_site
and so on.
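
As an illustration of that scheme (not code from the thread; the helper
name is made up):

<?php
// Hypothetical helper: map "barker_site" to "b/a/barker_site".
function shard_path($name) {
    return substr($name, 0, 1) . '/' . substr($name, 1, 1) . '/' . $name;
}

echo shard_path('aardvark_site');  // a/a/aardvark_site
echo shard_path('barker_site');    // b/a/barker_site
?>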
Jun 2 '08 #12
On Apr 30, 9:34 am, "Peter H. Coffin" <hell...@ninehells.com> wrote:

>
If too many files for the command buffer is what's preventing me from
using a wildcard , then I could do the same results by doing something
like '$command a*; command b*;' etc.

Sometimes. It depends on what you're trying to do. If you're trying to
do an operation to each file, you're okay. If you're trying to (for
example) get a count of the total number of lines that refer to a
particular included module in all the .c files in a massive directory
but not in the .php files, that may be a bit more of a challenge.
grep -c "search_string" *.c

What is more of a challenge to me, however, is applying a command
recursively to certain subdirectories while ignoring others.
>
Or, do I create directories and organize them thematically, according
to the functions of the website, such as 'new_customer_signup'? What
happens when that directory gets too many files in it?
It seems to me, that if the website has grown so much that there
are too many files in the root directory, so much so that you can't
properly run commands on it, that a website is not the right solution
for the problem. There's a flaw in the design somewhere, and the
number of files in the root directory is a symptom of it.

Often it's a problem. Sometimes it's a matter of growth outstripping the
time available to rework a functioning site.
It makes sense in the case of images, when you might have thousands or
tens of thousands. But for php files?

Can a website really grow so much that you outstrip the number of
files that a directory can hold? Presumably all of these php files are
written by hand; if not, there's some incredible redundancy -- you
just don't need that many php files. (I can see that, however, for
something like images or thumbnails, etc.) If you really do need
that amount of code, I would bet that a website/PHP application is the
wrong solution for the task.

I mean, if the problem is more than 10,000 files in a directory, who's
going to write all those files?
>
I'm willing to see the other side of this, but so far I can't think of
an instance where a file management task I'm doing on the command line
is going to be more cumbersome within a single directory than across
and into multiple subdirectories. Is wildcarding the only problem?

*grin* It's probably not, but it's the one I've run into more than once,
and it's the one that was primitive enough to be a recurring problem
rather than a "solve it once by applying more technology" situation.
What was the wildcarding problem you had? The size of the command
buffer to expand all of the filenames in *? Is it something that
cannot be solved by in turn doing command a*, command b*?

I think I have a good solution; others are cautioning me against it,
but so far, the problems they say I might encounter don't seem all
that problematic.
Jun 2 '08 #13
On Wed, 30 Apr 2008 13:43:35 -0700 (PDT), la*****@gmail.com wrote:
On Apr 30, 9:34 am, "Peter H. Coffin" <hell...@ninehells.com> wrote:

>>
If too many files for the command buffer is what's preventing me from
using a wildcard , then I could do the same results by doing something
like '$command a*; command b*;' etc.

Sometimes. It depends on what you're trying to do. If you're trying to
do an operation to each file, you're okay. If you're trying to (for
example) get a count of the total number of lines that refer to a
particular included module in all the .c files in a massive directory
but not in the .php files, that may be a bit more of a challenge.

grep -c "search_string" *.c

What is more of a challenge to me, however, is applying a command
recursively to certain subdirectories while ignoring others.
"Arguement list too long" Ptui. Remember, the challenge is that we've
got too many files to fit within the shell's filename expansion. That's
why it's a *problem*.
Or, do I create directories and organize them thematically, according
to the functions of the website, such as 'new_customer_signup'? What
happens when that directory gets too many files in it?
It seems to me, that if the website has grown so much that there
are too many files in the root directory, so much so that you can't
properly run commands on it, that a website is not the right solution
for the problem. There's a flaw in the design somewhere, and the
number of files in the root directory is a symptom of it.

Often it's a problem. Sometimes it's a matter of growth outstripping the
time available to rework a functioning site.

It makes sense in the case of images, when you might have thousands or
tens of thousands. But for php files?

Can a website really grow so much that you outstrip the amount of
files that a directory can hold? Presumably all of these php files are
written by hand; if not, there's some incredible redundancy -- you
just don't need that many php files. ( I can see that, however, for
something like images or thumbnails, etc. ) . If you really do need
that amount of code, I would bet that a website/PHP application is the
wrong solution for the task.

I mean, if the problem is more than 10,000 files in a directory, who's
going to write all those files?
I've seen a lot of dumb websites.... And don't forget that php can build
php programs that it can then call...
I'm willing to see the other side of this, but so far I can't think of
an instance where a file management task I'm doing on the command line
is going to be more cumbersome within a single directory than across
and into multiple subdirectories. Is wildcarding the only problem?

*grin* It's probably not, but it's the one I've run into more than once,
and it's the one that was primitive enough to be a recurring problem
rather than a "solve it once by applying more technology" situation.

What was the wildcarding problem you had? The size of the command
buffer to expand all of the filenames in * ? Is it something that
cannot be solved by in turn doing command a*, command b* ?
Eventually, yes, you can subdivide the namespace far enough down to
something that will fit in a command buffer. But you're then effectively
mirroring exactly the same problem you're trying to avoid by avoiding
building a tree of directories. It becomes something that can't be done
with a *simple* command, and you end up needing at least on-the-fly
shell programming with some for-loops and ranges, dogs and cats living
together, anarchy!
I think I have a good solution; others are cautioning me against it,
but so far, the problems they say I might encounter don't seem all
that problematic.
--
"Doesn't everybody?" is a question that never expects an answer of "No."
Jun 2 '08 #14
