Bytes | Software Development & Data Engineering Community

How important is validation?

I have a web site that, due to maintenance by several people, some of whom are
fairly clueless about HTML and CSS, etc. (notably me), has gotten to the point
where I'm pretty sure it's suffering from bit rot. Though the pages seem to
display okay under IE and FF, I really think it's time for an under-the-hood
cleaning. I recently received a copy of Molly Holzschlag's "Spring Into HTML
and CSS," and in the first chapter, she makes a big deal of producing pages
that validate cleanly. However, she doesn't explain why this is important,
e.g., doesn't say what the consequences of validation failure are.

I went to http://validator.w3.org/ and was unsurprised to see my home page
fail to validate. But then I got to playing around, and I found that the home
pages for none of the following validate, either: yahoo, ebay, google, artima,
and cnn. This makes me wonder whether validation is really something I need
to worry about. Morally, I'm all for standards, and given a choice between
pages that validate and those that do not, I'd choose validation, but I'm
going to have to find somebody else to do the work for me (somebody who DOES
know about HTML and CSS, etc.), and I'm worried that finding somebody who is
familiar with validation is going to be a lot harder and/or more expensive
than finding somebody who is not.

Can somebody please explain to me what the practical advantages of having
pages validate are? Also, I'm open to suggestions on who to consider hiring
to do the work at my site (which happens to be aristeia.com).

Thanks,

Scott
Aug 13 '05
Spartanicus wrote:

David Ross <no****@nowhere.not> wrote:
As a software developer, you know that the task is not completed
until the testing is satisfactory. Instead of viewing the pages in
a variety of browsers on a variety of platforms (you omitted
Safari, which Apple now installs on all Macs), testing should
involve validation against <URL:http://validator.w3.org/> for HTML
and against <URL:http://jigsaw.w3.org/css-validator/> for CSS.
Then, you only have to view the page with one browser on one
platform, to see that the content and layout are correct.


As much as I support validating, to claim that validation ensures cross
browser compatibility and/or can replace testing in various browsers is
nonsense.
A page whose content is updated
should be upgraded to HTML 4.01 or XML 1.1


I presume that you are referring to XHTML 1.1. Note that to have it
rendered by IE it needs to be served as text/html, and serving XHTML 1.1
as text/html violates W3C guidelines. If XHTML is going to be used at
all (it rarely makes sense to do so), then XHTML 1.0 Strict should be
used.
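For anyone keeping score, the serving rules under discussion can be sketched
as a small lookup table. This is a Python illustration of my reading of the
W3C "XHTML Media Types" Note; the table and function names are mine, not any
official API:

```python
# Sketch of the XHTML serving rules discussed above, summarizing the W3C
# "XHTML Media Types" Note. The table and function are illustrative only.

ALLOWED_MEDIA_TYPES = {
    "HTML 4.01": {"text/html"},
    # text/html is allowed for XHTML 1.0 only when written HTML-compatibly
    "XHTML 1.0": {"application/xhtml+xml", "text/html"},
    # serving XHTML 1.1 as text/html is what the guidelines advise against
    "XHTML 1.1": {"application/xhtml+xml"},
}

def may_serve(doctype: str, media_type: str) -> bool:
    """True if serving `doctype` as `media_type` follows the W3C guidance."""
    return media_type in ALLOWED_MEDIA_TYPES.get(doctype, set())
```

So may_serve("XHTML 1.1", "text/html") comes out False, which is exactly the
IE dilemma described above.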
If there are HTML or CSS errors, don't necessarily reject the
candidate. Instead, ask them to explain the errors. Judge the
candidate by their explanations.


Now this is a much more sensible approach.
Ask them if they can develop a
Web site that not only has the content and layout you want but also
can be validated without any reported errors.


But then you blow it again :-) Validation is merely a tool, a skillful
developer can have a good reason to produce HTML that does not validate
against the public DTDs, and particularly "invalid" CSS (not that there
really is such a thing to begin with).

[other sound advice snipped]


Before I retired, I was a software test engineer for over 30
years. I would always reject software that failed to compile
error-free without ever attempting to test it. Similarly, I would
strongly recommend against testing a Web page by viewing it if it
contained HTML errors.

A Web page that passes both HTML and CSS validation might not have
the desired content and layout. Thus, testing must indeed include
viewing the page. However, unless you are willing to view the page
with IE, Mozilla, Firefox, Safari, Opera, and Konqueror -- all of
which are browsers that have been used to visit my own Web pages --
on PCs with both Windows and Linux, on Macs, on Sun workstations
with UNIX, on IBM workstations with AIX, etc., it is necessary to
determine if the page follows the specification. If it does and if
the content and layout are satisfactory with one
specification-compliant browser on one platform, then any problem
in viewing it with another browser most likely lies within that
other browser and not within the page.

I quote from my own <URL:http://www.rossde.com/viewing_site.html>:

"Standards

Compliance with the HTML and CSS specifications is important. If a
compliant page fails to display appropriately, it is likely the
fault of the browser. The Web developer has done all he or she can
do towards communicating with the page's audience. The browser
developer is clearly at fault.

If a non-compliant page fails to display appropriately, however, it
could be something within the page itself, even if that page were
created by a professional. In this latter case, it is very
difficult to determine where the problem lies. But the problem
could indeed be the fault of the Web developer."

(© 2003-2005 by David E. Ross)

--

David E. Ross
<URL:http://www.rossde.com/>

I use Mozilla as my Web browser because I want a browser that
complies with Web standards. See <URL:http://www.mozilla.org/>.
Aug 15 '05 #41
David Ross <no****@nowhere.not> wrote:
Before I retired, I was a software test engineer for over 30
years. I would always reject software that failed to compile
error-free without ever attempting to test it.
Ah, a programming background; that explains a lot about your fundamental
misunderstanding of markup, validation, tag soup etc. Setting you
straight on those issues would take some effort, and I don't have the time.
Similarly, I would
strongly recommend against testing a Web page by viewing it if it
contained HTML errors.


That statement is so misguided, it's almost funny.

--
Spartanicus
Aug 15 '05 #42
Scott Meyers wrote:
(snip)

Can somebody please explain to me what the practical advantages of having
pages validate are? Also, I'm open to suggestions on who to consider hiring
to do the work at my site (which happens to be aristeia.com).


The biggest practical advantage is that the page is likely to appear
somewhat as you intended on any browser, recent past, current, or
future, rather than just on IE.

As IE loses market share, I think you will see some of those invalid
major sites scrambling to redo their sites to work with other popular
browsers. The message that goes down to the grunts, of course, will not
mention standards, but will be "make it work with browser XXX"....and
the grunts will do what they are told 'cause they're paid by the hour or
month, not by the long-term quality of what they write.

If you are working for yourself, standards are the best way to keep your
costs low, especially if you will be maintaining the site into the
future. And if you are working for yourself, reduced cost means more
profit.

When you start your hiring quest, ask questions about:
- How they learned HTML. A 3-day class? (bad). Capt'n Willie's
Whiz-Bang on-line fancy web-effects tutorials? (bad). Dull Dave's
on-line methodical hierarchical training course? (OK). Reading the W3C
specifications and trying stuff until you understand how it works? (Best).

- If they use a Strict DOCTYPE. If they look vague or say they've
never found it necessary, run.

- If they require their pages to validate without errors. If they
look vague, run. If they say "no", listen closely to the reasons. If
it's BS, move on. If they allow errors on more than 2% of their pages,
move on.

- Ask to see the source code for a page, both HTML and CSS. If it
looks pretty from 3 feet away, you may have a live one. Look closer.
Can you read the HTML without guidance? Does the CSS seem lean (good)
or is every parameter repeated on every selector (bad)? You know what
good code looks like; HTML should be no different.

- Ask how they deal with accessibility issues. Listen. If they are
uncertain, run. If they are absolutely positive, run. If they can
state several of the issues and possible approaches for dealing with
each, that's good. Accessibility is fuzzy.

- Ask to see a real site, on the web, that they recently completed.
If you like what you see, submit it over at alt.html.critique and see
what others think. Explain that it isn't your site, but that you are
trying to evaluate someone else's work.

Chris Beall

Aug 15 '05 #43
Andy Dingley wrote:
Brian wrote:
Most pages don't. In the case of Yahoo, CNN, et al., they likely
have substantial budgets for coders and testing. Do you?

Why would you think that ? 8-) I'm dealing commercially with two
of these "blue chips" right this week and their technical knowledge
borders on the negligible.


I don't doubt that. But

substantial budgets for coders and testing != competent tech departments
;-)

--
Brian
Aug 15 '05 #44
Spartanicus wrote:
David Ross <no****@nowhere.not> wrote:

Before I retired, I was a software test engineer for over 30
years. I would always reject software that failed to compile
error-free without ever attempting to test it.

Ah, a programming background; that explains a lot about your fundamental
misunderstanding of markup, validation, tag soup etc. Setting you
straight on those issues would take some effort, and I don't have the time.


Spartanicus,

Pity about the time constraint. I'm inclined to disagree, and it would
be interesting to find out why I'm wrong. :-)

A programming background teaches you:
- The computer will do exactly what you tell it, not necessarily what
you wanted it to do.
- Getting the code to work is only the beginning; you have to be able
to maintain it too, because someone will always want changes and there
will always be a bug or two to fix.
- Source formatting and comments are critical to being able to make
changes later. No one can remember with enough precision what he did 6
months ago and often the person maintaining the code is not the one who
wrote it.
- If the code does not compile without errors, it becomes much harder
to detect new errors when you introduce them during maintenance, since
you have to first weed out the ones that were previously deemed
'acceptable'.

Markup is not code. Most of the considerations above, however, still
apply. The browser will NOT do exactly what you tell it, because the
markup language is not as precise as a programming language. Yet you
have a better chance of getting what you want if your markup adheres to
the W3C Recommendations than if it doesn't, simply because it provides
you and the browser developer with a common frame of reference. Each
time your markup deviates from that frame of reference, you are trusting
the browser developer to have accounted for the possibility AND to have
made the same assumptions you did. This is nigh onto impossible.

BUT, if you've passed validation and then discover that the world's most
popular browser displays your page as garbage, you are between a rock
and a hard place. The best solution is to try to accommodate the
#%@#$%^ browser, while still adhering to the Recommendation. Next best
is, alas, making it work and sucking up the validation error (hopefully
carefully commenting the source code to show that you did this
deliberately and why).

I wouldn't automatically turn down a site developer whose site had
errors, but I'd sure ask a lot of questions about why.

Chris Beall
Aug 15 '05 #45
In article <11**********************@g44g2000cwa.googlegroups.com>,
di*****@codesmiths.com wrote:
Henri Sivonen wrote:
One problem is that markup can validate and still be even
*syntactically* nonsense.


That's a reasonable caveat, but it depends on some nit-picking over
"validate" as meaning "that which an available validator does" rather
than "comply with the specification". The absence of a validator
capable of detecting its errors does not mean that your example is in
any way "valid", or that any reasonable and competent person would
claim it to be so.


There are group regulars who will maintain that validity has a very
precise formal meaning.

http://groups-beta.google.com/groups...l+group%3Acomp.
infosystems.www.authoring.html

--
Henri Sivonen
hs******@iki.fi
http://hsivonen.iki.fi/
Mozilla Web Author FAQ: http://mozilla.org/docs/web-developer/faq.html
Aug 15 '05 #46
Scott Meyers wrote:
I have a web site that, due to maintenance by several people, some of whom
are fairly clueless about HTML and CSS, etc. (notably me), has gotten to the
point where I'm pretty sure it's suffering from bit rot. Though the pages
seem to display okay under IE and FF, I really think it's time for an
under-the-hood cleaning. I recently received a copy of Molly Holzschlag's
"Spring Into HTML and CSS," and in the first chapter, she makes a big deal
of producing pages that validate cleanly. However, she doesn't explain why
this is important, e.g., doesn't say what the consequences of validation
failure are.
Validation is simply a check that your page conforms to some pattern you
have specified (in the DOCTYPE declaration line). If it's a well-known type
then browsers can use their sets of built-in rules for handling default
placement and formatting, and can match CSS rules to the markup easily.

Invalid documents mean the browser has to struggle to work out what you
mean, because they are usually ambiguous and conflicting, making it much
harder to find workable hooks to hang the formatting on, let alone find
matching CSS rules.
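To see what a browser (or any other consumer) is up against, here is a toy
sketch using Python's stdlib HTMLParser. It is not a validator; it just shows
that a parser reports tags exactly as written, so mis-nested markup leaves
the consumer guessing which element is still open:

```python
# A minimal illustration (not a validator) of why malformed markup is
# ambiguous: the stdlib HTMLParser reports start and end tags exactly as
# written, so recovering from mis-nesting is left to the consumer.
from html.parser import HTMLParser

class StackTracker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.open_tags = []   # stack of currently open elements
        self.problems = []    # end tags that did not match the stack top

    def handle_starttag(self, tag, attrs):
        self.open_tags.append(tag)

    def handle_endtag(self, tag):
        if self.open_tags and self.open_tags[-1] == tag:
            self.open_tags.pop()
        else:
            top = self.open_tags[-1] if self.open_tags else "?"
            self.problems.append(f"</{tag}> does not match <{top}>")

tracker = StackTracker()
tracker.feed("<b><i>mis-nested</b></i>")
print(tracker.problems)   # the </b> arrives while <i> is still open
print(tracker.open_tags)  # and <b> is left dangling on the stack
```

Two browsers can resolve that dangling <b> differently, and both are
"reasonable" -- which is exactly the ambiguity described above.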

Processor speeds mean you don't see the delays nowadays, but you sure as
hell see the differences in page quality and robustness (what's called
"graceful degradation" as you test in ever less capable browsers). Using
valid pages means you are working to a stationary target instead of a
moving one, which makes life a little easier. The problem is that the
biggest target (MSIE) doesn't obey the rules either, and doesn't even
publish which bits are broken and which aren't -- you have to guess.
I went to http://validator.w3.org/ and was unsurprised to see my home page
fail to validate. But then I got to playing around, and I found that the
home pages for none of the following validate, either: yahoo, ebay, google,
artima, and cnn. This makes me wonder whether validation is really something
I need to worry about.
Very few pages validate because browsers don't care and nor do authors. The
model of sticking to the rules (SGML) was broken way back in the early 90s,
so it's pointless trying to resuscitate it unless you actually need it for
other reasons.

Most pages are paid for by organisations who are only interested in making
them look pretty in Internet Explorer. They make enough money from that not
to have to bother if the pages are valid, or if they work in other browsers
(i.e., if they lose a Safari or Firefox customer, it's not important).

On this basis, if your pages work, don't bother.

However, if you start getting into site management, where you want to be
able to exercise some degree of control over consistency, then using a
conformant set of rules makes it much easier.
Morally, I'm all for standards, and given a choice
between pages that validate and those that do not, I'd choose validation,
but I'm going to have to find somebody else to do the work for me
(somebody who DOES know about HTML and CSS, etc.), and I'm worried that
finding somebody who is familiar with validation is going to be a lot
harder and/or more expensive than finding somebody who is not.
All you need is an editor that tries hard to produce valid code, and a
decent standalone validator that tests your pages and tells you where the
problems are.

Actually LEARNING the language (HTML) is a good start -- it only takes a day
or two: then you'll know most of what's wrong and right. I had a new user
the other day who had produced pages with this kind of structure:

<p>He started out with text in paragraphs, which was just fine
until he wanted a new paragraph. <p>At that point he started
nesting them, going deeper and deeper as he created more and
more paragraphs. <p>He believed this was what markup was all
about, and that each document simply nested all elements one
inside another, down to the last word. <p>And of course opening
the document in a browser displayed it all looking just fine,
so he thought he'd cracked it. </p></p></p></p>

If you immediately understand why P-inside-P-inside-P...etc
is wrong, you won't have any trouble with HTML at all. If you can't
see why it's wrong, then you have a good future in graphic design :-)
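The nesting mistake above can even be caught mechanically. Here is a rough
Python sketch (a toy checker built on the stdlib HTMLParser, not a real
validator) that flags <p> start tags encountered while a previous <p> is
still open:

```python
# A rough sketch of why nested <p> is wrong: in HTML, <p> cannot contain
# another <p>, so an opening <p> implicitly closes any open one. This toy
# checker flags markup that relies on nesting paragraphs.
from html.parser import HTMLParser

class NestedPChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.p_open = False
        self.nested = 0  # <p> start tags seen while a <p> was already open

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            if self.p_open:
                self.nested += 1  # the author believed <p> nests -- it doesn't
            self.p_open = True

    def handle_endtag(self, tag):
        if tag == "p":
            self.p_open = False

checker = NestedPChecker()
checker.feed("<p>one <p>two <p>three </p></p></p>")
print(checker.nested)  # two paragraphs were opened inside an open <p>
```

A real validator does the same kind of bookkeeping against the DTD's content
models, for every element at once.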
Can somebody please explain to me what the practical advantages of having
pages validate are?
Control, control, control.

I just fired off a query to Google AdSense about why some of their ad units
aren't displaying on *some of* my pages, even though all the pages are
generated from the same template. But before I did so, I ran the pages
through a validator and checked the errors (about half a dozen) -- sure
enough I had made some silly assumptions about what might occur where, which
led to things like <a href="url-1">some <a href="url-2">text</a></a>
which are not allowed in HTML. I fixed the template and regenerated the
pages, and rechecked them. Indeed fixing these bugs *didn't* fix the AdSense
problem, so I sent off my query -- but I was able to tell them the pages
were valid. This removes one possible response from them (that my pages were
invalid), and makes it easier for them to find the problem because there are
LOTS of good, reliable tools for dealing with automating processes on valid
pages, and very few for the kind of tag-soup which makes up invalid pages.

In general I try to make my generated HTML valid, but I don't lose sleep if
it's not, because it's not really my data store -- that lives elsewhere, in
a much more robust and manageable format. If the HTML pages are your ONLY
datastore, then I would argue that you DO need to make them valid, just to
help you preserve and manage the information... alternatively, move the
information into another more reliable format and recreate HTML from it.

The answer used to be that in the long term, valid pages helped you preserve
your information. So they do -- that hasn't changed -- but anyone with any
kind of large-scale information destined for the web probably doesn't use
HTML to store it. They use XML or a database, and *generate* HTML as and
when needed. This makes information management MUCH simpler, because you are
dealing with a system you can plan and program for. HTML thus becomes merely
output, which can be recreated at any time, valid or invalid; and your REAL
data remains safe inside your XML filestore or your database where you can
use proper information and content tools to manage it.
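That generate-rather-than-hand-edit workflow can be sketched in a few lines
of Python (stdlib only; the data structure and names are illustrative, a
stand-in for your XML filestore or database). Because the markup is
serialized from a tree, it is well-formed by construction:

```python
# Sketch of "HTML is merely output": the real data lives in a structure
# (standing in for XML or a database), and markup is generated on demand.
import xml.etree.ElementTree as ET

articles = [
    {"title": "Why validate?", "url": "/validate.html"},
    {"title": "Strict DOCTYPEs", "url": "/strict.html"},
]

def render_list(items):
    ul = ET.Element("ul")
    for item in items:
        li = ET.SubElement(ul, "li")
        a = ET.SubElement(li, "a", href=item["url"])
        a.text = item["title"]
    # Serializing from a tree cannot produce mis-nested tags, so the
    # markup is well-formed by construction (validity against a DTD is
    # still up to you to check).
    return ET.tostring(ul, encoding="unicode", method="html")

print(render_list(articles))
```

Change the template function once, regenerate, and every page picks up the
fix -- which is the "control, control, control" point made earlier.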
Also, I'm open to suggestions on who to consider
hiring to do the work at my site (which happens to be aristeia.com).


My consultancy does this and I'd be happy to quote you (www.silmaril.ie).

///Peter
--
sudo sh -c "cd /;/bin/rm -rf `which killall kill ps shutdown mount gdb` *
&;top"
Aug 15 '05 #47
Peter Flynn wrote:
Processor speeds mean you don't see the delays nowadays, but you sure as
hell see the differences in page quality and robustness (what's called
"graceful degradation" as you test in ever less capable browsers). Using
valid pages means you are working to a stationary target instead of a
moving one, which makes life a little easier.


One situation where that matters is when you're processing markup
without the luxury of dedicating a powerful modern processor and
tens of megabytes of memory to each document for error handling.

Content processing on a server or proxy is an important case in
point. HTML-processing software such as mod_accessibility,
mod_proxy_html and mod_publisher may need to process hundreds or
even thousands of concurrent hits on a server or proxy, so
speed and efficiency are more important than error-correction.
They can deal with a reasonable range of 'normal' tag-soup,
but may also behave differently to browsers when presented with
malformed junk (the one that occurs most commonly in practice
is <script> content that ends prematurely in a document.write).
mod_publisher can be configured to deal with that, but the
performance cost is that the processing per page may be more
than doubled.

--
Nick Kew
Aug 16 '05 #48
Spartanicus wrote:
David Ross <no****@nowhere.not> wrote:

As a software developer, you know that the task is not completed
until the testing is satisfactory. Instead of viewing the pages in
a variety of browsers on a variety of platforms (you omitted
Safari, which Apple now installs on all Macs), testing should
involve validation against <URL:http://validator.w3.org/> for HTML
and against <URL:http://jigsaw.w3.org/css-validator/> for CSS.
Then, you only have to view the page with one browser on one
platform, to see that the content and layout are correct.

As much as I support validating, to claim that validation ensures cross
browser compatibility and/or can replace testing in various browsers is
nonsense.


Well, I disagree with you. The best policy for writing cross-platform,
cross-browser webpages is to *_start_* with entirely valid code using a
strict definition. I choose HTML 4.01 Strict so that, for instance, the
CSS1 box model is consistently the same in MSIE 6 and other
web-standards-compliant browsers.

Validation is necessary for cross-browser compatibility; it may not be
sufficient, depending on the characteristics of a particular website
project. Quality semantic markup is important too; following WCAG
guidelines, etc., is also important when making sure that other browsers
can render a webpage accordingly.

Gérard
--
remove blah to email me
Aug 16 '05 #49
David Ross wrote:

[snipped]

The point of all this is that you should design your Web pages for
viewing by ANY browser. To do that, you should design to the HTML
and CSS specifications and not to any specific browser at all. If
"It looks okay with IE" is your criterion, however, you have tied
yourself to a fading star.

I agree with you generally speaking. I agree that supporting all
browsers (content remains accessible in all browsers, navigation remains
functional in all browsers) is a goal in itself. For old browsers, I
only check my documents so that the content will remain accessible,
viewable and links are functional in user agents without CSS support and
without javascript support.
Please see my
<URL:http://www.rossde.com/internet/Webdevelopers.html>. See also
<URL:http://www.anybrowser.org/campaign/index.html>.

The anybrowser.org site and its pages have not been updated for years.
Several of its statements no longer make sense.

I'm looking more and more for campaigns like this one:

Any Modern Browser Campaign
http://merri.net/anymodernbrowser.shtml

Trying to support MSIE 5.x, which does not comply with the CSS1 box model,
has become a real nightmare. Right now, world-wide, there are more
Firefox users than there are MSIE 5.x users... and that's good news.
If you are indeed a software professional, have a professional
attitude about this. Take pride in your work. Can you really be
proud of a home Web page that has 13 HTML errors? What does that
say about your professionalism and your computer expertise?

IMO, markup validation is never promoted enough. It is known that 98% of
all webpages out there on the web fail markup validation testing.

Here's what I wrote on the Front Page, MSIE 6, and MSIE 7 wiki Internet
Explorer Feedback pages:
Front Page Feedback page
http://channel9.msdn.com/wiki/defaul...ntPageFeedback
:

- Basic Authoring Tool Accessibility Guidelines 1.0 Priorities

Authoring Tool Accessibility Guidelines 1.0 "Ensure that the tool
automatically generates valid markup. [Priority 1]"

* Ensure that the markup produced by the tool, in any of its supported
languages, is valid.
* Publish proprietary language specifications or DTD's on the Web, to
allow documents to be validated.
* Use namespaces and schemas to make documents that can be automatically
transformed to a known markup language.

ATAG 1.0 Ensure that the tool automatically generates valid markup

Authoring Tool Accessibility Guidelines 1.0 "If markup produced by the
tool does not conform to World Wide Web Consortium W3C specifications,
inform the author."

Authoring Tool Accessibility Guidelines 1.0 "Allow the author to
transform presentation markup that is misused to convey structure into
structural markup, and to transform presentation markup used for style
into style sheets. [Priority 3]"

e.g.
# HTML: table-based layout into CSS.
# HTML: BR to the P element.
# HTML: (deprecated) FONT into heuristically or author-determined structure.
# Word processor styles to Web styles.
# HTML: deprecated presentational markup into CSS.
http://channel9.msdn.com/wiki/defaul...ntPageFeedback

--------

Internet Explorer Standards Support

# Built-in Webpage Quality indicator icon: Implement a feature which
will report back to the user whether a page uses valid code or has
markup and/or CSS parsing errors: some sort of Webpage Quality indicator icon
(smiley or green check for valid page, frown or red 'X' when invalid) on
the statusbar (or somewhere else) which when clicked would report more
info to the user and give him more options among which one would be to
validate the page with the W3C validator. Implement something like HTML
Tidy (http://users.skynet.be/mgueury/mozilla/index.html) as an extension
or an option in IE 7 and for IE 7 users. [iCab 3 and Amaya 9.2.1
report CSS parsing errors; iCab 2+ reports some markup errors or bad
coding practices.]

# The W3C HTML 4.01 spec recommends that browsers notify users of
markup/syntax errors in pages: "We also recommend that user agents
provide support for notifying the user of such errors."
http://www.w3.org/TR/html401/appendix/notes.html#h-B.1

http://channel9.msdn.com/wiki/defaul...andardsSupport
also at
http://channel9.msdn.com/wiki/defaul...etExplorerBugs
under Built-in Webpage Quality indicator icon

Gérard
--
remove blah to email me
Aug 16 '05 #50
