Newsgroups: gnu.misc.discuss,comp.lang.tcl,comp.lang.scheme
Path: cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!news2.near.net!MathWorks.Com!europa.eng.gtefsd.com!howland.reston.ans.net!news.sprintlink.net!uunet!sytex!smcl
From: smcl@sytex.com (Scott McLoughlin)
Subject: Re: What should an alternative look like? was Re: Why you should not
Message-ID: <kD30sc2w165w@sytex.com>
Sender: bbs@sytex.com
Organization: Sytex Access Ltd.
References: <366u7n$9b7@agate.berkeley.edu>
Date: Mon, 26 Sep 1994 20:49:44 GMT
Lines: 71
Xref: glinda.oz.cs.cmu.edu gnu.misc.discuss:18299 comp.lang.tcl:19312 comp.lang.scheme:9917

nweaver@madrone.CS.Berkeley.EDU (Nicholas C. Weaver) writes:

> 
> 	So that leads to the following question:  how Scheme-like should
> such a mythical extension language be?
> 
> 	Syntax flames aside, I wonder about the following issues as well:
> 
> 	Would it be appropriate in such a context to restrict call/cc to non
> local exits only?  (Yes, it would no longer be Scheme, just Scheme-like)

Yup. Full call/cc is a killer for a scripting language. It's not going
to work _all_ of the time, and it's too much for users to worry about.
Plain truth. Let's make tools that are easy to use and work
today. Save the rocket science for the lab.
> 
> 	Would a byte code interpreter be fast enough, or would a more
> platform specific compiler be necessary, giving machine-code output?
Yup. For an "extension language", skip native code. BC is plenty fast,
and direct threading can make it faster.  Lots of folks use byte-compiled
languages for _real applications_: dBase, Visual Basic, Smalltalk...
Native code is overrated for about 85% of the apps out there.
> 
> 	How small could such an interpreter/compiler be made and still run
> fast enough?

We've got a whole mess of CL (there's still a whole mess missing) in
our LinkLisp(tm) product, coming soon - a ~250K kernel implemented as a
shared library (Windows .DLL). This doesn't count the Lisp "standard
library", but the source is there and can be pruned.  Non-prunables are
documented as such.
> 
> 	Could it run within a factor of 100 of C?  Factor of 10?

Of course it _could_. We'll soon start honoring declarations - don't
know if that'll make it into v1.0.  But our motto for embedded Lisp
is "Render unto C what C is due," where "due" is defined by _what
typical non-academic users want or perceive_. Quixotes we're not.
Better to spend one's time implementing a _very_ friendly interface
to C (finalization on C pointers, integrated error handling, etc.).
> 
> 	What sort of module system could be added?  Macro system?  Are there
> some good examples already available?  Is there a move to formalize R4RS
> macros somewhere?

We support Lisp packages.  For now, that's standardized and good enough.
We support Lisp defmacros with normal lambda-list syntax. We don't have
time right now for destructuring-bind - next version, maybe. We're
starting to have to make footprint decisions.
> -- 
> Nicholas C. Weaver			nweaver@orodruin.cs.berkeley.edu
> It is a tale, told by an idiot, full of sound and fury, signifying nothing
> Fun with anagrams:  computer science -> coerce inept scum

Don't think too hard. Write something and then try to use it. If
you have problems, it's not good enough. Then give it to
colleagues. If they don't like it, it's not good enough. The proof
is in the pudding.

P.S. I _love_ the work produced by the Lisp research community. I
just think we've got to start segregating the research from the
commercial world.  Novell now hawks Unix in the marketplace and
Bell Labs hacks Plan 9. That's as it should be. If we believe
the "20 year rule", the commercial world should be hacking out
stable Lisp systems based on ~20-year-old techniques. Just bang
them out and make them available and affordable.

=============================================
Scott McLoughlin
Conscious Computing
=============================================
