Hello,
I am working on a multilingual web application, and I have to be very sure
about how international characters are encoded and decoded in
client-server form requests.
There's a great article about the issue: http://ppewww.ph.gla.ac.uk/~flavell/...form-i18n.html
Generally, it states that this area is filled with landmines. From my tests
I see that the form content of a POST request is encoded using the character
encoding of the HTML page that hosted the form. However, there is no
information about the code page used in the POST request itself, so the server
side somehow has to guess it in order to decode the data properly and populate
the Request.Form collection. My tests show that if the requesting page is
plain HTML with a utf-8 Content-Type meta tag, the server side
sometimes succeeds, but most of the time fails to decode the characters properly.
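To make the ambiguity concrete, here is a small client-side JavaScript sketch (not from the original posts) of what the two encodings look like on the wire; encodeURIComponent always produces UTF-8 percent-escapes, so it models the UTF-8 page:

```javascript
// In a UTF-8 page the browser percent-encodes each non-ASCII character
// of a POSTed field as its UTF-8 bytes:
const field = encodeURIComponent("café");
// field === "caf%C3%A9" — é (U+00E9) becomes the two UTF-8 bytes C3 A9.
// A windows-1252 page would instead send the single legacy byte %E9,
// and nothing in the POST request tells the server which one it got.
```

The POST body carries no charset declaration, which is exactly why the server has to guess.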
So, my question is: what code page is used when the POST request data is
interpreted and decoded and the Request.Form collection is populated? I could
write my own interpreter that takes the data out of Request.BinaryRead(), but I
would prefer to use the default Request.Form collection, though.
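For reference, a rough sketch in modern JavaScript (not classic ASP/JScript, and the function name is made up) of what such a hand-rolled interpreter over the raw url-encoded bytes from Request.BinaryRead() has to do: collect the literal and percent-encoded bytes first, then decode them with an explicitly chosen charset.

```javascript
// Decode one url-encoded form value, treating the input as raw bytes
// and only applying a character encoding at the very end.
function decodeUrlEncoded(raw, charset) {
  const bytes = [];
  for (let i = 0; i < raw.length; i++) {
    const c = raw[i];
    if (c === "%") {
      bytes.push(parseInt(raw.slice(i + 1, i + 3), 16)); // one encoded byte
      i += 2;
    } else if (c === "+") {
      bytes.push(0x20); // '+' means space in form data
    } else {
      bytes.push(raw.charCodeAt(i)); // plain ASCII byte
    }
  }
  return new TextDecoder(charset).decode(new Uint8Array(bytes));
}
```

The key point the sketch illustrates: the byte-collection step is encoding-agnostic; the charset decision is deferred to the last line, which is precisely the decision Request.Form makes for you, invisibly.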
Thanks,
-- Pavils
My sympathies. You may have noticed my posts on this question, and also the
lack of any response. Yes, that link has a super discussion of the issue.
The route I took was to end-run the problem by converting the input at POST
time to 7-bit-safe stuff, filled into a hidden field. In addition to a
database record of the input, I was trying to generate data for an RTF file
as a possible output, and while the database contents were handled correctly
in both directions, I could find nothing on its format for purposes of
converting to the "hex Unicode" Code-page format that RTF requires.
That is, a two-byte UTF-8 Cyrillic character was converted to a 4-byte
value, and I couldn't discern the conversion algorithm. I expect it's
related to a double conversion. A couple of attempts at reverse-engineering
failed. If you succeed, please share the solution.
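The 2-byte-to-4-byte growth is consistent with a double conversion: if the two UTF-8 bytes are misread as two single-byte characters and then UTF-8-encoded again, each byte above 0x7F expands into two bytes. A sketch in modern JavaScript (an assumption about the cause, not a reconstruction of the actual code):

```javascript
// A 2-byte UTF-8 Cyrillic character...
const once = new TextEncoder().encode("Ж");       // U+0416 → 2 bytes: D0 96
// ...misread as two single-byte (Latin-1) characters...
const asLatin1 = String.fromCharCode(...once);    // "Ð" + U+0096
// ...and UTF-8-encoded a second time:
const twice = new TextEncoder().encode(asLatin1); // 4 bytes: C3 90 C2 96
```

If the observed 4-byte values match this pattern, decoding them as Latin-1 and re-decoding the result as UTF-8 should recover the original character.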
FYI, I used the JavaScript charCodeAt() function for the client-side
conversion. HTH, a bit more than just sympathy.
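The hidden-field trick described above might look roughly like this (a sketch, not the original code; note that charCodeAt() returns UTF-16 code units, which is fine for characters in the Basic Multilingual Plane):

```javascript
// Replace every character above 127 with a decimal numeric reference
// built from charCodeAt(), so the POST body is pure 7-bit ASCII.
function toSevenBitSafe(s) {
  let out = "";
  for (let i = 0; i < s.length; i++) {
    const code = s.charCodeAt(i);
    out += code < 128 ? s.charAt(i) : "&#" + code + ";";
  }
  return out;
}

// The server (or a later processing step) reverses it:
function fromSevenBitSafe(s) {
  return s.replace(/&#(\d+);/g, (_, n) => String.fromCharCode(Number(n)));
}
```

Because the transported text is plain ASCII, the server's code-page guess can no longer corrupt it.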
AS
Hi, Pavils
ASP uses the system ANSI code page to decode incoming data by default. You
have to explicitly specify the UTF-8 code page for form data to be decoded
correctly:
<%@ Codepage=65001 %>
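For context, a minimal page sketch with the directive in place (Response.CodePage and Response.Charset are standard classic ASP properties; the overall setup shown is an assumption about a typical UTF-8 page, not code from this thread):

```asp
<%@ Language=VBScript Codepage=65001 %>
<%
  ' Keep the response consistent with the code page used for parsing
  Response.CodePage = 65001
  Response.Charset = "utf-8"

  ' With Codepage=65001, form data is decoded as UTF-8
  name = Request.Form("name")
%>
```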
BTW, I spent a lot of time creating a component that works with form data in
any code page, accepting up to 2 GB of multipart and url-encoded form data
(Request.Form has a 100 kB limit). You can find it at http://www.pstruh.cz
(Huge-ASP upload)
Antonin
Thanks, Antonin,
This bit of info was the last one to complete my puzzle. Now I'm happy (tm).
Yes, I know of your site and the great components you have made. In this
case, I am looking for more technical insight into the POST format problems. I
have my own pure-ASP upload in JScript working very fine, and I prefer to stay
with a pure-code class, because then I have full control over what happens
inside.
Regards,
-- Pavils
Arnold, I think I may be able to help you with your issues. Just make yourself
contactable; mail me or ICQ: 4047612
-- Pavils