Keith Thompson wrote:
August Karlstrom <fu********@comhem.se> writes:
Diomidis Spinellis wrote:
August Karlstrom wrote:
Does anyone know why some of the functions in time.h use pointers
to constant objects of type time_t when time_t is an arithmetic
type? Why is e.g. ctime declared as
char* ctime(const time_t* tp);
and not as
char* ctime(time_t t);
My guess is that this is an implementation decision related to the
environment where C has its roots. In the Seventh Edition Unix (and
probably also in earlier versions), time_t is implemented as a long
<http://minnie.tuhs.org/UnixTree/V7/usr/include/sys/types.h.html>.
On a PDP-11, where that 1979 version of Unix ran, passing a 16-bit
pointer to a 32-bit long as an argument was probably more
efficient than passing the actual 32-bit value.
[...] I *think* that on some early implementations, 32-bit ints weren't
directly supported, and what's now a time_t was probably defined as
an array of two ints. The declaration of time() may have been
something like:
time(int t[2]);
You are absolutely right. In the Third Edition Unix (February 1973) the
time(2) interface is specified in assembly language: the 32-bit result
is returned in the register pair r0/r1
<http://minnie.tuhs.org/UnixTree/V3/usr/man/man2/time.2.html>. In the
Fourth Edition Unix (November 1973), which was (re)written in C, time
takes as an argument an int tvec[2]
<http://minnie.tuhs.org/UnixTree/V4/usr/man/man2/time.2.html>. Ritchie
mentions that the long type was added to C during the period 1973-1980
<http://cm.bell-labs.com/cm/cs/who/dmr/chist.html>. The long type
certainly wasn't supported by the C compiler that came with the Fifth
Edition Unix, which was released in June 1974
<http://minnie.tuhs.org/UnixTree/V5/usr/c/c00.c.html>. Therefore, when
time(2) was first specified in C, an int[2] argument was a reasonable
interface specification.
--
Diomidis Spinellis
Code Quality: The Open Source Perspective (Addison-Wesley 2006)
http://www.spinellis.gr/codequality?clc