
# show Hex Numbers

I want to display hex numbers in an <input type="text">; how do I do it?
Jan 20 '06 #1
News wrote on 20 jan 2006 in comp.lang.javascript:
I want to display hex numbers in an <input type="text">; how do I do it?

You can just key them in from the keyboard.

3f ff a2 .....

Perhaps you'd better explain what you really want.

--
Evertjan.
The Netherlands.
Jan 20 '06 #2

News wrote:
I want to display hex numbers in an <input type="text">; how do I do it?

JavaScript supports hexadecimal numeric literals in the form 0xFFFF.
var foo = 0xFFFF; // foo is now 65535 decimal

But internally it treats all numbers in decimal form. So you can easily
convert a hex string into a decimal value:
<input type="text" name="v1" value="FFFF">

- but you don't have a native JavaScript function for the reverse
action (converting a decimal integer into a hex string).

There is an ocean of custom functions written for the latter. Here is
one:

var hD = '0123456789ABCDEF';
function dec2hex(d) {
  var h = hD.substr(d & 15, 1);
  while (d > 15) {
    d >>= 4;
    h = hD.substr(d & 15, 1) + h;
  }
  return h;
}
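A usage sketch of the function above; note that the reverse direction (hex string to number) is already covered by the built-in parseInt when given a radix of 16:

```javascript
// Same dec2hex as above, repeated so the sketch is self-contained.
var hD = '0123456789ABCDEF';
function dec2hex(d) {
  var h = hD.substr(d & 15, 1);    // lowest nibble first
  while (d > 15) {
    d >>= 4;                        // drop the nibble just emitted
    h = hD.substr(d & 15, 1) + h;  // prepend the next one
  }
  return h;
}

var hex = dec2hex(65535);     // "FFFF"
var num = parseInt(hex, 16);  // back to 65535
```

This handles non-negative integers only; negative numbers would need separate treatment because of how >> handles the sign bit.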

Jan 20 '06 #3

News wrote:
I want to display hex numbers in an <input type="text">; how do I do it?

Such an input displays strings; to set it, use
inputElement.value = string
If you have a number e.g.
var n = 255;
and want to get the hexadecimal representation then you can use
var string = n.toString(16);
or
var string = n.toString(16).toUpperCase();
so perhaps
inputElement.value = n.toString(16).toUpperCase();
is what you want.
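Putting the two directions together (parseInt with a radix of 16 converts the string back to a number):

```javascript
var n = 255;
var hex = n.toString(16).toUpperCase(); // "FF"
var back = parseInt(hex, 16);           // 255
```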

--

Martin Honnen
http://JavaScript.FAQTs.com/
Jan 20 '06 #4
VK wrote on 20 jan 2006 in comp.lang.javascript:
JavaScript supports hexadecimal numeric literals in the form 0xFFFF.
var foo = 0xFFFF; // foo is now 65535 decimal

But internally it treats all numbers in decimal form.

No, binary.

Only the displaying of numbers converts to decimal.

--
Evertjan.
The Netherlands.
Jan 20 '06 #5

Martin Honnen wrote:
var string = n.toString(16).toUpperCase();

That is a great hint!
Thank you.

Jan 20 '06 #6
VK said the following on 1/20/2006 11:43 AM:

<snip>
But internally it treats all numbers in decimal form.

Do you ever stop and wonder why most of the people here have derogatory

Computers have no sense of "decimal form" internally; they only know
binary, and any other base is computed when needed.

--
Randy
comp.lang.javascript FAQ - http://jibbering.com/faq & newsgroup weekly
Javascript Best Practices - http://www.JavascriptToolbox.com/bestpractices/
Jan 20 '06 #7
On Fri, 20 Jan 2006 12:04:55 -0500, Randy Webb
<Hi************@aol.com> wrote:
VK said the following on 1/20/2006 11:43 AM:

<snip>
But internally it treats all numbers in decimal form.

Do you ever stop and wonder why most of the people here have derogatory

Computers have no sense of "decimal form" internally; they only know
binary, and any other base is computed when needed.

Well, Binary Coded Decimal was used once upon a time to encode decimal
numbers (use 4 bits per decimal digit, but only use the numeric values
0-9). A bit of a kludge, and still using binary coding for the digits,
but the arithmetic etc was all done in base 10. I gather it was
regarded as essential in the banking industry at one time - it avoided
rounding errors implicit in binary computation. Most of the old IBM
Fortran IV (G, H and H(ext)) and 66 compilers supported it.

I haven't seen it for a very long time, but it certainly existed!
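The 4-bits-per-digit packing described above can be sketched in JavaScript (a hypothetical toBCD helper, limited to non-negative integers of at most 8 digits so the result fits in a 32-bit bitwise integer):

```javascript
// Pack a decimal integer into BCD: one decimal digit per 4-bit nibble.
function toBCD(n) {
  var bcd = 0, shift = 0;
  while (n > 0) {
    bcd |= (n % 10) << shift; // low decimal digit into the next nibble
    n = Math.floor(n / 10);
    shift += 4;
  }
  return bcd;
}

// toBCD(1234) is 0x1234: each hex digit of the result is one decimal digit.
```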

Paul
Jan 20 '06 #8

"Evertjan." <ex**************@interxnl.net> wrote in message
news:Xn********************@194.109.133.242...
News wrote on 20 jan 2006 in comp.lang.javascript:
I want to display hex numbers in an <input type="text">; how do I do it?

You can just key them in from the keyboard.

3f ff a2 .....

Perhaps you'd better explain what you really want.

I don't want your help, thanks.
Jan 20 '06 #9

"Martin Honnen" <ma*******@yahoo.de> wrote in message

News wrote:
I want to display hex numbers in an <input type="text">; how do I do it?

Such an input displays strings; to set it, use
inputElement.value = string
If you have a number e.g.
var n = 255;
and want to get the hexadecimal representation then you can use
var string = n.toString(16);
or
var string = n.toString(16).toUpperCase();
so perhaps
inputElement.value = n.toString(16).toUpperCase();
is what you want.

--

Martin Honnen
http://JavaScript.FAQTs.com/

Wow, thanks for this. I will use it and also try to learn what is happening
;-)
Jan 20 '06 #10

"VK" <sc**********@yahoo.com> wrote in message

News wrote:
I want to display hex numbers in an <input type="text">; how do I do it?

JavaScript supports hexadecimal numeric literals in the form 0xFFFF.
var foo = 0xFFFF; // foo is now 65535 decimal

But internally it treats all numbers in decimal form. So you can easily
convert a hex string into a decimal value:
<input type="text" name="v1" value="FFFF">

- but you don't have a native JavaScript function for the reverse
action (converting a decimal integer into a hex string).

There is an ocean of custom functions written for the latter. Here is
one:

var hD = '0123456789ABCDEF';
function dec2hex(d) {
  var h = hD.substr(d & 15, 1);
  while (d > 15) {
    d >>= 4;
    h = hD.substr(d & 15, 1) + h;
  }
  return h;
}

Thanks for this help. I have never tried to dynamically show hex numbers
before; I have heaps to learn.
Jan 20 '06 #11
Paul Cooper wrote on 20 jan 2006 in comp.lang.javascript:
Well, Binary Coded Decimal was used once upon a time to encode decimal
numbers (use 4 bits per decimal digit, but only use the numeric values
0-9). A bit of a kludge, and still using binary coding for the digits,
but the arithmetic etc was all done in base 10. I gather it was
regarded as essential in the banking industry at one time - it avoided
rounding errors implicit in binary computation. Most of the old IBM
Fortran IV (G, H and H(ext)) and 66 compilers supported it.

I haven't seen it for a very long time, but it certainly existed!

"Central Data Basic" for the Signetics 2650 microprocessor,
rumored to be coded by a William Gates in the early 1980s.

It safely ran my company's financials.

--
Evertjan.
The Netherlands.
Jan 20 '06 #12

Randy Webb wrote:
VK said the following on 1/20/2006 11:43 AM:

<snip>
But internally it treats all numbers in decimal form.

Do you ever stop and wonder why most of the people here have derogatory

Computers have no sense of "decimal form" internally; they only know
binary, and any other base is computed when needed.

Do you ever stop and wonder why <alt.comp.lang.javascript> appeared and
why so many people are posting there? And why a lot of them were first
seen at comp.lang.javascript but never came back?

Maybe because too many people here have derogatory adjectives to
describe other people?

Or because everyone eventually gets sick of a public beatdown for
each missed dot in a post?

Or because some self-appointed experts here cannot go to bed w/o
proving to a newcomer how stupid and ignorant (s)he is?

Or because some experts got crazy over ECMA and want everyone to use
Netscape 4.x tools forever, until ECMAScript 4 comes? (It
will never come, at least not within any acceptable period of time.
You did not get it yet? Or are you just pretending that you do not understand?)
...computers are using binary format for internal data. Great thanks to
you for shedding light on it for me. But I'm still not as stupid as I
may seem: I know (and I got it completely by myself!) that computers
run on electricity! ;-)

Jan 20 '06 #13
VK wrote on 20 jan 2006 in comp.lang.javascript:
...computers are using binary format for internal data. Great thanks to
you for shedding light on it for me. But I'm still not as stupid as I
may seem: I know
Here you are wrong, VK, [and I am not talking about any stupidity]:
"internal data" is not the same as "internal number format".

As shown elsewhere in this thread, BCD is an internal decimal
representation and storage of numbers, that is quite capable of having
internal Math functions.

As long as we are talking integers, the difference is minimal,
but with fraction or floating point math, it is quite important.
(and I got it completely by myself!) that computers
run on electricity! ;-)

Some computers run on light beams, and I have even seen a working
model running on compressed air with real flip-flop gates.

Mechanical computers (Babbage!) were quite sophisticated.

--
Evertjan.
The Netherlands.
Jan 20 '06 #14
VK wrote:
Do you ever stop and wonder why <alt.comp.lang.javascript> appeared and
why so many people are posting there? And a lot of them first seen at
comp.lang.javascript but never come back?
[...]

If you like it better there, then post there.
PointedEars
Jan 20 '06 #15

Evertjan. wrote:
VK wrote on 20 jan 2006 in comp.lang.javascript:
...computers are using binary format for internal data. Great thanks to
you for shedding light on it for me. But I'm still not as stupid as I
may seem: I know

Here you are wrong, VK, [and I am not talking about any stupidity]:
"internal data" is not the same as "internal number format".

As shown elsewhere in this thread, BCD is an internal decimal
representation and storage of numbers, that is quite capable of having
internal Math functions.

As long as we are talking integers, the difference is minimal,
but with fraction or floating point math, it is quite important.

OK
(and I got it completely by myself!) that computers
run on electricity! ;-)

Some computers run on light beams, and I have even seen a working
model running on compressed air with real flip-flop gates.

Mechanical computers (Babbage!) were quite sophisticated.

OK

Jan 20 '06 #16
JRS: In article <bl********************************@4ax.com>, dated
Fri, 20 Jan 2006 17:38:47 remote, seen in news:comp.lang.javascript,
Paul Cooper <pa*********@invalid.bas.ac.uk> posted :

Well, Binary Coded Decimal was used once upon a time to encode decimal
numbers (use 4 bits per decimal digit, but only use the numeric values
0-9). A bit of a kludge, and still using binary coding for the digits,
but the arithmetic etc was all done in base 10. I gather it was
regarded as essential in the banking industry at one time - it avoided
rounding errors implicit in binary computation. Most of the old IBM
Fortran IV (G, H and H(ext)) and 66 compilers supported it.

I haven't seen it for a very long time, but it certainly existed!

If you have a PC, you can find BCD in the RTC - the bit that keeps the
time when the power is off.

If you look for example in Delphi newsgroups, you can see how, with the
/savoir faire/ of the average company-applications programmer, much
trouble is caused by the lack of true decimal arithmetic in languages.

Via sig line 3, LONGCALC will do decimal integer arithmetic up to about
+-10^65520, and Hex to +-16^65520. That's the 16-bit version; the
32-bit EXE, via sig line 2 IIRC, goes to exponent 99999999. Slowly.

--
Web <URL:http://www.merlyn.demon.co.uk/> - FAQqish topics, acronyms & links.
PAS EXE TXT ZIP via <URL:http://www.merlyn.demon.co.uk/programs/00index.htm>.
Do not Mail News to me. Before a reply, quote with ">" or "> " (SoRFC1036)
Jan 20 '06 #17
Paul Cooper wrote:
On Fri, 20 Jan 2006 12:04:55 -0500, Randy Webb
<Hi************@aol.com> wrote:
VK said the following on 1/20/2006 11:43 AM:

<snip>
But internally it treats all numbers in decimal form.

Do you ever stop and wonder why most of the people here have derogatory

Computers have no sense of "decimal form" internally; they only know
binary, and any other base is computed when needed.

Well, Binary Coded Decimal was used once upon a time to encode decimal
numbers (use 4 bits per decimal digit, but only use the numeric values
0-9). A bit of a kludge, and still using binary coding for the digits,
but the arithmetic etc was all done in base 10. I gather it was
regarded as essential in the banking industry at one time - it avoided
rounding errors implicit in binary computation. Most of the old IBM
Fortran IV (G, H and H(ext)) and 66 compilers supported it.

No, it is /not/ supported by any IBM 360-zSeries Fortran compiler.

It is supported by the IBM 360-zSeries compilers for PL/I, COBOL, RPG,
and (mainly in order to deal with existing data files) by an ad-hoc
language extension to C. It /should/ be supported in Ada 95, but I don't
know for sure that it is.

Older computers used other means to achieve decimal:

Excess-3. Like BCD, but 0-9 are encoded as BCD 3-12 (it slightly
simplifies subtraction).

Three-of-five: five bits, three 1s, two 0s. (Only ten of the 32 possible
five-bit patterns are used.)

Biquinary: seven bits. The first two are either 01 or 10; the remaining
five are 10000, 01000, 00100, 00010, or 00001.

Decimal calculations /are/ necessary in banking, etc. However, direct
decimal implementation is not the only way to do decimal calculations.
You can, for example, calculate everything in cents, or mills, using
binary integers. However, this can involve extra multiplications and
divisions by powers of ten to keep things in order. The most common Ada
compiler for Windows uses this method to simulate decimal.
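The everything-in-cents approach can be sketched with hypothetical helpers (integer arithmetic sidesteps binary floating-point rounding; the parser assumes a two-digit cents part):

```javascript
// Hold money as whole cents; only convert to dollars for display.
function parseDollars(s) {       // "12.34" -> 1234 cents
  var parts = s.split('.');
  return parseInt(parts[0], 10) * 100 + parseInt(parts[1] || '0', 10);
}
function formatDollars(cents) {  // 1234 cents -> "12.34"
  var c = cents % 100;
  return Math.floor(cents / 100) + '.' + (c < 10 ? '0' + c : c);
}

// 0.1 + 0.2 !== 0.3 in binary floating point,
// but 10 + 20 cents is exactly 30 cents.
```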

Actually, the i86 architecture supports BCD in a fossilized form dating
back to Intel's original intention of putting the 4004 chip into desk
calculators. The instructions are only helper instructions that need to
be put into subroutines, and I don't know of any compiler that uses them
to implement decimal arithmetic, but they're there.

--
John W. Kennedy
"But now is a new thing which is very old--
that the rich make themselves richer and not poorer,
which is the true Gospel, for the poor's sake."
-- Charles Williams. "Judgement at Chelmsford"
Jan 21 '06 #18
On 2006-01-20, Paul Cooper <pa*********@invalid.bas.ac.uk> wrote:
Well, Binary Coded Decimal was used once upon a time to encode decimal
numbers (use 4 bits per decimal digit, but only use the numeric values
0-9). A bit of a kludge, and still using binary coding for the digits,
but the arithmetic etc was all done in base 10. I gather it was
regarded as essential in the banking industry at one time - it avoided
rounding errors implicit in binary computation. Most of the old IBM
Fortran IV (G, H and H(ext)) and 66 compilers supported it.

I haven't seen it for a very long time, but it certainly existed!

I believe it's still used in some pocket calculators.

Bye.
Jasen
Jan 21 '06 #19
On 2006-01-21, John W. Kennedy <jw*****@attglobal.net> wrote:
Older computers used other means to achieve decimal:

Excess-3. Like BCD, but 0-9 are encoded as BCD 3-12 (it slightly
simplifies subtraction).
Used mainly for serial links, IIRC.
Three-of-five: five bits, three 1, two 0. (Only ten of the 32 possible
five-bit figures are available.)
ISTR it was used for bar codes.
Actually, the i86 architecture supports BCD in a fossilized form dating
back to Intel's original intention of putting the 4004 chip into desk
calculators. The instructions are only helper instructions that need to
be put into subroutines, and I don't know of any compiler that uses them
to implement decimal arithmetic, but they're there.

Most processors I've encountered have something like that; if BCD is part of
Ada, I'd be surprised if an Ada compiler didn't emit those instructions.
Bye.
Jasen
Jan 21 '06 #20
Jasen Betts wrote:
On 2006-01-21, John W. Kennedy <jw*****@attglobal.net> wrote:
Actually, the i86 architecture supports BCD in a fossilized form dating
back to Intel's original intention of putting the 4004 chip into desk
calculators. The instructions are only helper instructions that need to
be put into subroutines, and I don't know of any compiler that uses them
to implement decimal arithmetic, but they're there.
Most processors I've encountered have something like that; if BCD is part of
Ada, I'd be surprised if an Ada compiler didn't emit those instructions.

Decimal arithmetic is part of Ada 95, but GNAT for Windows (the Ada
compiler built on GCC) implements decimal arithmetic as decimal-scaled
binary, rather than use the Intel decimal-help instructions.

--
John W. Kennedy
"But now is a new thing which is very old--
that the rich make themselves richer and not poorer,
which is the true Gospel, for the poor's sake."
-- Charles Williams. "Judgement at Chelmsford"
Jan 21 '06 #21
Martin Honnen wrote:

News wrote:
I want to display hex numbers in an <input type="text">; how do I do it?

Such an input displays strings; to set it, use
inputElement.value = string
If you have a number e.g.
var n = 255;
and want to get the hexadecimal representation then you can use
var string = n.toString(16);
or
var string = n.toString(16).toUpperCase();
so perhaps
inputElement.value = n.toString(16).toUpperCase();
is what you want.

Hey, thanks Martin. I wanted a script to generate hex keys based on a
phrase entered by a user; here's (part of) what I came up with:

<script type="text/javascript">

function toHex(el)
{
  var t = el.value;
  var r = '';
  for (var i = 0, len = t.length; i < len; ++i) {
    r += t.charCodeAt(i).toString(16);
  }
  document.getElementById('hexVal').innerHTML = r;
}

</script>

<input type="text" onkeyup="toHex(this);">
<div id="hexVal"></div>
I prefer the letters in the hex values to be lower case; it makes them
stand out more.
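One caveat with the sketch above: charCodeAt values below 0x10 produce only one hex digit, so adjacent codes can run together ambiguously. A padded variant (hypothetical toHexString helper, assuming character codes up to 0xFF):

```javascript
function toHexString(t) {
  var r = '';
  for (var i = 0; i < t.length; ++i) {
    var h = t.charCodeAt(i).toString(16); // lower case by default
    if (h.length < 2) h = '0' + h;        // pad to a fixed two digits
    r += h;
  }
  return r;
}

// toHexString("AB") gives "4142"; a newline becomes "0a", not "a".
```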
--
Rob
Jan 23 '06 #22
