
Web page access

I have a home page with Verizon. Within this home page I have several
sub-directories with their own associated pages. One of these subdirectories
has personal contact information and I wish to restrict access to these
pages to everyone except a few individuals. Is there a way to add a password
to a subdirectory so that anyone trying to access these pages will have to
input the correct password in order to view them?

Thanks.
Oct 30 '05 #1
In article <uAb9f.3665$zT6.1312@trnddc06>,
"Victor & Toni Jo Friedmann" <ge***@verizon.net> wrote:
I have a home page with Verizon. Within this home page I have several
sub-directories with their own associated pages. One of these subdirectories
has personal contact information and I wish to restrict access to these
pages to everyone except a few individuals. Is there a way to add a password
to a subdirectory so that anyone trying to access these pages will have to
input the correct password in order to view them?

Thanks.


That depends on the web server Verizon is using and the configuration
they're running. I know with Comcast, when I asked such questions, I was
told to go to a web hosting company. Comcast doesn't support such
things, nor can I run CGI scripts, PHP, or MySQL.

Ask Verizon support whether they have .htaccess/.htpasswd authentication
enabled, or you could just create the files and see if they have any
effect, but that requires knowledge of the Apache web server. Don't be
surprised if it doesn't work or you're told you're out of luck.
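
If Verizon's servers do run Apache with per-directory overrides allowed
(an assumption — only their support can confirm it), the classic setup
looks roughly like this. A minimal sketch; the paths and username below
are placeholders:

    # .htaccess, placed in the subdirectory you want to protect
    # (assumes Apache with AllowOverride AuthConfig permitted)
    AuthType Basic
    AuthName "Private pages"
    AuthUserFile /full/server/path/to/.htpasswd
    Require valid-user

The password file itself is created with Apache's htpasswd utility,
e.g. "htpasswd -c /full/server/path/to/.htpasswd friend" — though on
FTP-only hosting you may have no way to run it, in which case you would
need to generate the one-line entry elsewhere and upload it.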

--
DeeDee, don't press that button! DeeDee! NO! Dee...

Oct 31 '05 #2
__/ [Michael Vilain] on Monday 31 October 2005 00:34 \__
In article <uAb9f.3665$zT6.1312@trnddc06>,
"Victor & Toni Jo Friedmann" <ge***@verizon.net> wrote:
I have a home page with Verizon. Within this home page I have several
sub-directories with their own associated pages. One of these
subdirectories has personal contact information and I wish to restrict
access to these pages to everyone except a few individuals. Is there a way
to add a password to a subdirectory so that anyone trying to access these
pages will have to input the correct password in order to view them?

Thanks.


That depends on the web server Verizon is using and the configuration
they're running. I know with Comcast, when I asked such questions, I was
told to go to a web hosting company. Comcast doesn't support such
things, nor can I run CGI scripts, PHP, or MySQL.

Ask Verizon support whether they have .htaccess/.htpasswd authentication
enabled, or you could just create the files and see if they have any
effect, but that requires knowledge of the Apache web server. Don't be
surprised if it doesn't work or you're told you're out of luck.


By wishing to password-restrict content, you are really asking for a
fully featured hosting service, which brings support overhead (i.e.
cost) and the risk of abuse.

Some time ago, before I had access that went beyond FTP (i.e. file
management), I used the following trick.

http://schestowitz.com/res.htm

Press "research workspace" (now crossed out). A window will pop up, re-
quiring you to enter a password, which is in fact the missing segment of
the Web address. This will not avoid spyware like Alexa/A9/Amazon toolbars
(among more) from crawling your password-protected pages, but it will at
least turn away human users who ought to remain outside. To understand how
this works (essentially JavaScript), look at the source, change it and
save it. It's known as JavaScript Gatekeeper if I recall correctly.
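
The core of the trick fits in a few lines. This is a hypothetical
reconstruction, not the actual script from that page; the function name
and file-name pattern are illustrative placeholders:

    <script type="text/javascript">
    // Ask for the password and splice it into the URL itself.
    // Nothing is stored or checked here: a wrong answer simply
    // produces a URL that does not exist, so this is a deterrent,
    // not real security.
    function enterWorkspace() {
        var segment = prompt("Password:", "");
        if (segment) {
            window.location.href = "workspace-" + segment + ".htm";
        }
    }
    </script>

    <a href="#" onclick="enterWorkspace(); return false;">research workspace</a>

The only secret is the URL itself, so anything that leaks URLs
(toolbars, bookmarks, logs) leaks the content.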

Hope it helps,

Roy
Oct 31 '05 #3
Roy Schestowitz wrote:
This will not prevent spyware like the Alexa/A9/Amazon toolbars (among
others) from crawling your password-protected pages, but it will at
least turn away human users who ought to remain outside. To
understand how this works (essentially JavaScript), look at the
source, change it and save it.


Anyone with a clue can turn off JavaScript support in their browser.
Security should not depend on clueless attackers.
Nov 1 '05 #4
__/ [Leif K-Brooks] on Tuesday 01 November 2005 01:16 \__
Roy Schestowitz wrote:
This will not prevent spyware like the Alexa/A9/Amazon toolbars (among
others) from crawling your password-protected pages, but it will at
least turn away human users who ought to remain outside. To
understand how this works (essentially JavaScript), look at the
source, change it and save it.


Anyone with a clue can turn off JavaScript support in their browser.
Security should not depend on clueless attackers.


No JavaScript, no entry. *smile*

...still better than ActiveX

ActiveX enabled, anybody in (including hijackers)

Roy
Nov 1 '05 #5
In our last episode, Leif K-Brooks <eu*****@ecritters.biz> pronounced to
comp.infosystems.www.authoring.html:
Roy Schestowitz wrote:
To
understand how this works (essentially JavaScript), look at the
source, change it and save it.


Anyone with a clue can turn off JavaScript support in their browser.


Or look at the source and deduce the address themselves.

--
Mark Parnell
http://clarkecomputers.com.au
Nov 1 '05 #6
Roy Schestowitz wrote:
Press "research workspace" (now crossed out). A window will pop up, re-
quiring you to enter a password, which is in fact the missing segment of
the Web address.


Wow. What a complicated and nasty (it ran afoul of my popup blocker) way of
implementing "The content is at URL X, I haven't got any links to it, so
add it to your bookmarks."

--
David Dorward <http://blog.dorward.me.uk/> <http://dorward.me.uk/>
Home is where the ~/.bashrc is
Nov 1 '05 #7
In comp.infosystems.www.authoring.html on Tuesday 01 November 2005 05:56,
Roy Schestowitz wrote:
__/ [Leif K-Brooks] on Tuesday 01 November 2005 01:16 \__
Roy Schestowitz wrote:
This will not prevent spyware like the Alexa/A9/Amazon toolbars (among
others) from crawling your password-protected pages, but it will at
least turn away human users who ought to remain outside. To
understand how this works (essentially JavaScript), look at the
source, change it and save it.


Anyone with a clue can turn off JavaScript support in their browser.
Security should not depend on clueless attackers.


No JavaScript, no entry. *smile*

...still better than ActiveX

ActiveX enabled, anybody in (including hijackers)

Roy


There is a flaw with this method: if someone visits your "private" page
and then follows a link to a completely different website, the private
URL will be passed to that site as the referrer. Many websites' referrer
logs are publicly available (with or without the webmaster's knowledge
or intention), so the links could potentially be picked up by search
engines and your private content could appear in a search engine's
results.

A partial solution, which I recommend you use, is to put the following in
the head section of each private page.

<meta name="ROBOTS" content="NOINDEX,NOFOLLOW,NOARCHIVE">

This only works with some search engines (but the major ones should all act
on it).

The preferred method of controlling search engine spiders is to use a
robots.txt file but this will have two drawbacks:

1. You might not have access to the root directory of the domain or
subdomain, which is where the robots.txt needs to go.
2. In any event, some people read a site's robots.txt precisely to
"discover" directories the owner would rather weren't known about, hence
it is definitely *not* recommended for your situation (see the sketch
below).
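
To illustrate that second drawback: a robots.txt that blocks a directory
necessarily names it, so it doubles as a signpost. A minimal sketch,
with the directory name as a placeholder:

    # robots.txt -- must live at the root of the domain or subdomain
    # Well-behaved crawlers will skip /private/, but any person who
    # fetches this file now knows the directory exists.
    User-agent: *
    Disallow: /private/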
Nov 4 '05 #8
__/ [Andrew] on Friday 04 November 2005 16:49 \__
In comp.infosystems.www.authoring.html on Tuesday 01 November 2005 05:56,
Roy Schestowitz wrote:
__/ [Leif K-Brooks] on Tuesday 01 November 2005 01:16 \__
Roy Schestowitz wrote:
This will not prevent spyware like the Alexa/A9/Amazon toolbars (among
others) from crawling your password-protected pages, but it will at
least turn away human users who ought to remain outside. To
understand how this works (essentially JavaScript), look at the
source, change it and save it.

Anyone with a clue can turn off JavaScript support in their browser.
Security should not depend on clueless attackers.
No JavaScript, no entry. *smile*

...still better than ActiveX

ActiveX enabled, anybody in (including hijackers)

Roy


There is a flaw with this method: if someone visits your "private" page
and then follows a link to a completely different website, the private
URL will be passed to that site as the referrer. Many websites' referrer
logs are publicly available (with or without the webmaster's knowledge
or intention), so the links could potentially be picked up by search
engines and your private content could appear in a search engine's
results.

I never thought about this route. Thanks for pointing that out.

A partial solution, which I recommend you use, is to put the following in
the head section of each private page.

<meta name="ROBOTS" content="NOINDEX,NOFOLLOW,NOARCHIVE">

If the page contains sensitive content, I suppose 'shielding' it would
indeed be worthwhile. I would only like to stress that the information I
'hide' is not confidential, yet it should never be easily accessible.
Truly private material, like my Palm data, has always been
password-protected.

This only works with some search engines (but the major ones should all act
on it).

The preferred method of controlling search engine spiders is to use a
robots.txt file but this will have two drawbacks:

1. You might not have access to the root directory of the domain or
subdomain, which is where the robots.txt needs to go.
2. In any event, some people read a site's robots.txt precisely to
"discover" directories the owner would rather weren't known about, hence
it is definitely *not* recommended for your situation.

Yes, I once thought about it. Pages and sections where I deny crawlers
access at the robots.txt level are either:

- Sections that contain names which I would rather people did not
'Google' (or 'Yahoo', etc.)

- Sections that are too extensive to be crawled, as they would add
'noise' to the search engines' indices.

Roy

--
Roy S. Schestowitz | Useless fact: A dragonfly only lives for one day
http://Schestowitz.com | SuSE Linux | PGP-Key: 0x74572E8E
5:15am up 2 days 1:13, 4 users, load average: 0.25, 0.46, 0.42
http://iuron.com - next generation of search paradigms
Nov 5 '05 #9
__/ [David Dorward] on Tuesday 01 November 2005 08:28 \__
Roy Schestowitz wrote:
Press "research workspace" (now crossed out). A window will pop up, re-
quiring you to enter a password, which is in fact the missing segment of
the Web address.


Wow. What a complicated and nasty (it ran afoul of my popup blocker) way of
implementing "The content is at URL X, I haven't got any links to it, so
add it to your bookmarks."


It's quite old and I was not very good at Web development at the time.
By the way, my pop-up blocker blocks it as well. That page is a mess
altogether, and I am fully aware of it. *smile*

Roy
Nov 5 '05 #10
