Kenjis Kaan <ti**********@canada.com> wrote:
> > (or google) the folks on comp.unix.programmer why crypt(3) isn't exactly
> > meant to be a generic (or even useful anymore) "encryption" algorithm.
> >                       ^^^^^^^^^^^^^^^^^^^^^^^
> > - Bill
> Why is it not useful anymore?? why?
Alright, I'll bite. I wrote this last night, but decided not to post. I'm by
no means an expert, perhaps not even a novice, in cryptography, but it pains
me to see people ignorant of this stuff... ;) Granted, I'm assuming here
that when you say "crypt function", you mean crypt(3), the Unix interface.
crypt(3) is an historic Unix interface. For the most part, it's obsolete.
crypt(3) doesn't "encrypt". It computes a one-way hash. The historic
implementation used DES (a block cipher algorithm), but backwards: the
password was used as the key to encrypt a known constant (a block of all
zeroes). The purpose was to store the user's password in transformed form,
so that a plaintext password could be checked against the stored,
transformed one, w/o ever keeping the plaintext password in storage (and so
exposed). Indeed, in Linux's glibc, the latest version can use MD5, a
purpose-built one-way hash function, instead of DES.
I recommend you read the sci.crypt FAQ first. Playing w/ cryptographic
primitives is like playing w/ fire, except it can take a really long time
before you realize you've burned yourself, and by then it's way too late to
remedy. Learn what a cipher is, a [one-way, cryptographic] hash, cipher
modes, key transformations, et al. Then consult comp.unix.programmer and
your preferred local Win32 newsgroup for implementations to use.
If you then find yourself having problems using one of those
implementations, and you think your issue might be pertinent to standard C,
folks here can help. Depending on what you're doing, you often have to be
careful w/ the data types you use (unsigned char vs char, int vs long,
etc.). These can make huge differences w/ many crypto implementations, and
it's those standard C issues which you'll be able to find quality answers
for in this group. Just the other day I found a bug in Apache's SHA1
implementation which assumed a long was 32 bits; the effects on my 64-bit
Alpha were interesting, to say the least.