Bytes IT Community

How to convert an integer to an ASCII character?

What is the easiest way to convert an integer value to ASCII
character format?
I tried sprintf(); it works.
Is there any other way to do that?

Objective:
I'd like to convert the integer value 3 and write it into a string buffer.

What I did:
.....
.....
char myStr[12];
int myInt = 3;
sprintf(myStr, "%d", myInt);
.....
.....

Please comment.

Feb 19 '06 #1
16 Replies


ak**************@gmail.com wrote:
What is the easiest way to convert an integer value to ASCII
character format?
I tried sprintf(); it works.
Is there any other way to do that?

Objective:
I'd like to convert the integer value 3 and write it into a string buffer.

What I did:
....
....
char myStr[12];
int myInt = 3;
sprintf(myStr, "%d", myInt);
....
....

Please comment.


#define DIGILEN log10 (INT_MAX) +2

char buf[DIGILEN];
sprintf(buf, "%d", int_var);

Xavier
Feb 19 '06 #2

serrand wrote:


#define DIGILEN log10 (INT_MAX) +2

char buf[DIGILEN];
sprintf(buf, "%d", int_var);

Xavier


oops... sorry

#define DIGILEN (int)(log10 (INT_MAX) +3)

Your way seems to be the simplest...

sprintf is doing the same job as printf: whereas printf outputs in stdin,
sprintf outputs in its first argument, which has to be an allocated string

Xavier
Feb 19 '06 #3

OJ
Maybe itoa could be used. But it's not a standard function.

Feb 19 '06 #4

int i = 3;
char c = i + '0';

Feb 19 '06 #5


shichongdong wrote:
int i = 3;
char c = i + '0';


???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */

--
Lew Pitcher

Master Codewright & JOAT-in-training | GPG public key available on request
Registered Linux User #112576 (http://counter.li.org/)
Slackware - Because I know what I'm doing.
Feb 19 '06 #6

ak**************@gmail.com wrote:

What is the easiest way to convert an integer value to ASCII
character format?
I tried sprintf(); it works.
Is there any other way to do that?

Objective:
I'd like to convert the integer value 3 and write it into a string buffer.


#include <stdio.h>

/* ---------------------- */

static void putdecimal(unsigned int v, char **s) {

if (v / 10) putdecimal(v/10, s);
*(*s)++ = (v % 10) + '0';
**s = '\0';
} /* putdecimal */

/* ---------------------- */

int main(void) {

char a[80];

char *t, *s = a;

t = s; putdecimal( 0, &t); puts(s);
t = s; putdecimal( 1, &t); puts(s);
t = s; putdecimal(-1, &t); puts(s);
t = s; putdecimal( 2, &t); puts(s);
t = s; putdecimal(23, &t); puts(s);
t = s; putdecimal(27, &t); puts(s);
return 0;
} /* main */


Feb 19 '06 #7

ak**************@gmail.com wrote:
What is the easiest way to convert an integer value to ASCII
character format?
Lew Pitcher wrote:
shichongdong wrote:
int i = 3;
char c = i + '0';


???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */


Interesting. Neither is the earlier code converting 3 guaranteed to
produce ASCII.

--
Thad
Feb 19 '06 #8

In article <j5*******************@news20.bellglobal.com>,
Lew Pitcher <lp******@sympatico.ca> wrote:

shichongdong wrote:
int i = 3;
char c = i + '0';


???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */


The OP was asking how to do it with 3, not 300. You need to keep up.

Feb 19 '06 #9

Lew Pitcher wrote:

shichongdong wrote:
int i = 3;
char c = i + '0';

???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */


It works for single digits, right?
Best regards / Med venlig hilsen
Martin Jørgensen

--
---------------------------------------------------------------------------
Home of Martin Jørgensen - http://www.martinjoergensen.dk
Feb 19 '06 #10

"Martin Jørgensen" <un*********@spam.jay.net> wrote in message
news:pk************@news.tdc.dk...
Lew Pitcher wrote:

shichongdong wrote:
int i = 3;
char c = i + '0';

???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */


It works for single digits, right?


Assuming ASCII it does.
Feb 20 '06 #11


stathis gotsis wrote:
"Martin Jørgensen" <un*********@spam.jay.net> wrote in message
news:pk************@news.tdc.dk...
Lew Pitcher wrote:

shichongdong wrote:

int i = 3;
char c = i + '0';

???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */

It works for single digits, right?


Assuming ASCII it does.


Assuming any conforming C implementation, it does. The C standard guarantees it.

--
Lew Pitcher
Feb 20 '06 #12

Lew Pitcher <lp******@sympatico.ca> writes:
stathis gotsis wrote:
"Martin Jørgensen" <un*********@spam.jay.net> wrote in message
news:pk************@news.tdc.dk...
Lew Pitcher wrote:

shichongdong wrote:

> int i = 3;
> char c = i + '0';

???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */
It works for single digits, right?


Assuming ASCII it does.


Assuming any conforming C implementation, it does. The C
standard guarantees it.


The C standard guarantees that decimal digits are sequential and
in the proper order. The C standard doesn't guarantee that the
execution character set is ASCII. The OP asked to convert an
integer value to *ASCII* character format specifically.

Here's a portable way to get a single ASCII digit: 48 + num.
Feb 20 '06 #13

stathis gotsis wrote:
"Martin Jørgensen" <un*********@spam.jay.net> wrote in message
news:pk************@news.tdc.dk...
Lew Pitcher wrote:

shichongdong wrote:

int i = 3;
char c = i + '0';

???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */

It works for single digits, right?


Assuming ASCII it does.


Assuming an implementation that conforms to the C standard it does,
whether it is ASCII or not. It's one of the few things the C standard
guarantees about the execution character set.
--
Flash Gordon
Living in interesting times.
Web site - http://home.flash-gordon.me.uk/
comp.lang.c posting guidelines and intro -
http://clc-wiki.net/wiki/Intro_to_clc
Feb 20 '06 #14

"Flash Gordon" <sp**@flash-gordon.me.uk> wrote in message
news:2n************@news.flash-gordon.me.uk...
stathis gotsis wrote:
"Martin Jørgensen" <un*********@spam.jay.net> wrote in message
news:pk************@news.tdc.dk...
Lew Pitcher wrote:

shichongdong wrote:

> int i = 3;
> char c = i + '0';

???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */
It works for single digits, right?


Assuming ASCII it does.


Assuming an implementation that conforms to the C standard it does,
whether it is ASCII or not. It's one of the few things the C standard
guarantees about the execution character set.


I was not aware of that, thanks for the correction.
Feb 20 '06 #15

Flash Gordon wrote:
stathis gotsis wrote:
"Martin Jørgensen" <un*********@spam.jay.net> wrote in message
news:pk************@news.tdc.dk...
Lew Pitcher wrote:
shichongdong wrote:

> int i = 3;
> char c = i + '0';

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */

It works for single digits, right?


Assuming ASCII it does.


Assuming an implementation that conforms to the C standard it does,
whether it is ASCII or not.


Check Ben's point elsewhere in the thread. The OP defined "works" as
producing ASCII. While the code in question produces the corresponding
digit character in the execution set, it only produces the correct ASCII
character if the execution set is ASCII.

--
Thad
Feb 20 '06 #16

On Sun, 19 Feb 2006 01:33:41 +0100, serrand <xa************@free.fr>
wrote:
serrand wrote:
#define DIGILEN log10 (INT_MAX) +2

char buf[DIGILEN];
sprintf(buf, "%d", int_var);

oops... sorry

#define DIGILEN (int)(log10 (INT_MAX) +3)
In C89 an array bound must be a constant expression, and no function
call, even to the standard library, qualifies. In C99 this is still
true for an object with static duration, but if your code snippet is
entirely within a function (not shown) and thus is automatic, this is
legal, though rather inefficient. A C89-legal and (probably) much more
efficient method is to approximate the digits needed for the maximum
value that could be represented in the object size:
sizeof(int)*CHAR_BIT * 10/33 + slop_as_needed
Your way seems to be the simpliest...

sprintf is doing the same job as printf: whereas printf outputs in stdin

stdout, actually. Frequently stdin, stdout and stderr are all the/an
interactive terminal or console or window or whatever, but they need not be.
sprintf outputs in its first argument, which have to be an allocated string


- David.Thompson1 at worldnet.att.net
Mar 3 '06 #17
