Newsgroups: comp.lang.lisp
From: cyber_surfer@wildcard.demon.co.uk (Cyber Surfer)
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!europa.eng.gtefsd.com!howland.reston.ans.net!pipex!demon!wildcard.demon.co.uk!cyber_surfer
Subject: Re: Why do people like C? (Was: Comparison: Beta - Lisp)
References: <CwJ82u.7nL@csfb1.fir.fbc.com> <Cwsvrs.Dtv@cogsci.ed.ac.uk> <780822729snz@wildcard.demon.co.uk> <Cx3zo5.5sE@cogsci.ed.ac.uk> <781301566snz@wildcard.demon.co.uk> <CxBFF1.28C@festival.ed.ac.uk>
Organization: The Wildcard Killer Butterfly Breeding Ground
Reply-To: cyber_surfer@wildcard.demon.co.uk
X-Newsreader: Demon Internet Simple News v1.27
Lines: 172
Date: Mon, 10 Oct 1994 08:40:00 +0000
Message-ID: <781778400snz@wildcard.demon.co.uk>
Sender: usenet@demon.co.uk

In article <CxBFF1.28C@festival.ed.ac.uk>
           jeff@festival.ed.ac.uk "J W Dalton" writes:

> I know all that.  But writing something that's exactly right
> in general, rather than describing a simple example, would take
> too long.  I assumed people could vary it to suit.

We're both being pedantic. ;-)
 
> Now, in fact, 32 bits in a longword does not imply 16-bit ints.
> Think of a VAX.  "Longword" is VAX terminology.  "Word" continued
> to mean 16-bits, as on a PDP-11.

That's my point. I try to make as few assumptions as possible.
When it's necessary, I prefer to document the assumptions.

> Well, what are the hardware models people keep talking about.
> I still await an explanation.

I try to avoid hardware models, as I prefer more abstract models
of machine behaviour to exact specifications of "word" lengths etc.

> The idea that hardware fits C seems rather odd to me.  For
> instance, Basic's "if <cond> then <line number>" is closer to
> what the hardware's like.

Yes, and now fewer and fewer Basics use line numbers. The model used
for the "Hope Machine" by Roger Bailey in his Hope tutorial avoids
many aspects of a CPU "model", and goes for something much more
abstract.

> Note too that the hardware's data is (usally) typeless.  Whether
> some bit are a float or a pointer depends on what instructions
> are used to operate on them.  This is similar to what happens
> in an "everything is a list" Lisp.  This analogy was frequently
> made in the past, but went out of favor as the various freedoms
> of assembly code (self-modifying, typeless) were increasingly
> seen as "bad".

This also depends on how close your model is to a model of a CPU.
A register might be typeless, but you could also think of it as
a bit-value, as in BCPL or Forth. Some languages allow a more
abstract model of objects. In a suitably strongly typed language,
you could still use types at the lowest level in the model.

> C code can't distinguish either.  All the type info is compiled
> away, as you no doubt know.  I don't really understand what you're
> getting at here.

I'm saying that the nature of the pointers, segmented or "flat",
is not a language feature. It's usually a CPU feature. That makes
it possible to write code in C that will work with either kind
of pointer, but it also allows you to write code that depends on
a single kind of pointer. Can the same thing be done in CLISP,
for example? XLISP? MIT Scheme? I don't know. I don't even know
if you'd need to, but I certainly don't want to.

> >Yes, but 3 MB is too big for me. What file format is it in, anyway?
> 
> LaTex and .dvi.

Thanks. I'm not able to read either of those formats yet.

> Just so.  What I find interesting is that arrays were treated so
> differently from strings.

In the early Basics that I've used, there could be a _lot_ of
variation from one implementation to another. They were often
different dialects, as well as different compiler/interpreters.

> Lisp languages/implementations.  But for both Basic and Lisp,
> implementors seldom felt constrained to follow such descriptions.
> I find this interesting as well and would be interested in comments.

So would I. What pressure is there today to conform to a
standard (for whatever language you choose, but let's choose
Lisp for the moment)?

It seems that CL vendors are still adding features that aren't
in the ANSI CL standard, and some of those features, like the
LOOP extensions, have been added to the standard. At what point
does this stop, if ever? What happens to CL if vendors move on,
and the standard doesn't? I could also ask who is still using
Standard Lisp today, but Cambridge Lisp is the only implementation
I've seen so far.

> I suspect that the ease of implementing Basic and Lisp was an
> important factor.  That is, it was fairly easy to implement a
> Basic or a Lisp.  Why try to match another exactly when you
> could make your task easier -- or add your own neat features --
> by inventing a variant?

Exactly. It was similar for Forth. I used Hyper Forth+ for a
while, and it was _very_ non-standard, even a few years after
Forth-79 appeared. I showed it to a salesman who sold PolyForth,
and he looked a little sick. It must have been the multitasking
demo, which was rather slow on an 8 MHz 68K! I don't think he
was impressed.

Still, it was a fine development system, as long as you were
happy with all the differences from the "standard" dialects.
(Forth 79 was just the official one, but Fig-Forth was still
popular at that time.)

> But some people have, evidently, encountered hardware-level
> stuff rather early-on.  I haven't seen this Hope tutorial
> (is it available on the net?) but it sounds like the kind
> of thing that appeals to the mathematically inclined but
> perhaps not the the raised-on-hardware types?

Bailey has a tutorial on the WWW, but I haven't yet compared
it with his book. Here're URLs for the tutorial and for Hope:

ftp://santos.doc.ic.ac.uk/pub/papers/R.Paterson/hope/hope_tut/hope_tut.html
http://santos.doc.ic.ac.uk/pub/papers/R.Paterson/hope/index.html

> Why did you care what the machine was doing?  I'm not saying
> you shouldn't, just asking the reasons.

It was a 16K TRS-80. You had to learn that stuff to do anything
interesting, even in Basic. Even when I expanded the machine to
48K and added some floppy drives, it was still necessary to know
about the CPU and play with hex codes, just to do some tasks
that had little to do with programming.

> I'm glad I learned assembler for a couple of machines, but I
> don't have much desire to do the same again, and I've never
> cared very much about the hardware details (logic circuits
> and the like).

I know how you feel! Life is too short for that. I don't mind
reading about it, or even getting my hands dirty when there's
no other way of doing something, but it's so easy to avoid all
that these days. I suspect that the only reason I _still_ need
to get down to that level is because not everyone thinks that
programmer time can be used better.

For example, I might still find myself using a list of hex
addresses for C functions in order to discover when a program
crashed. I can't just use the symbolic name without also adding
the symbolic info to a binary, and then running it with a
debugger. I even used to have to read mangled C++ names,
until a certain linker was updated to demangle them itself.
It now gives me error messages with unmangled names, which
makes life so much easier.

> Can you remember what the FFI looked like?  Could you manipulate
> C structs in Lisp or what?

I recall that it could do all that, perhaps using something
like PEEK and POKE. I know that that's how it was done in
Basic on my first machine, and the code was called with a
function called something like USR, with the entry address
as the argument. It looked a lot cleaner in Cambridge Lisp,
of course.

> BTW, I find it amusing that "foreign" seems to mean "C"
> these days (though perhaps not in _your_ usage).

As I'm currently using Windows, this is easy. It has a standard
linkage for PASCAL and C. Most of Windows uses the PASCAL linkage.
Actually linking addresses at runtime is done by Windows itself,
which makes it as simple as you could hope for it to be. The only
problems that I know of seem to be that Borland and Microsoft
can't agree on how to return objects like doubles from functions,
but there's a simple "fix" for that - just look at what the
machine does at the stack level, and alter the prototype for
your function appropriately. It's not pretty, but it's simple
and it works, apparently.
-- 
"Internet? What's that?" -- Simon "CompuServe" Bates
http://cyber.sfgate.com/examiner/people/surfer.html
