Newsgroups: comp.lang.lisp
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!europa.eng.gtefsd.com!gatech!news-feed-1.peachnet.edu!darwin.sura.net!lhc!lhc!hunter
From: hunter@work.nlm.nih.gov (Larry Hunter)
Subject: Re: Which machines are best for common lisp?
In-Reply-To: gadbois@cs.utexas.edu's message of 31 Oct 1994 20:28:37 -0600
Message-ID: <HUNTER.94Nov1150133@work.nlm.nih.gov>
Sender: news@nlm.nih.gov
Reply-To: Hunter@nlm.nih.gov
Organization: National Library of Medicine
References: <387pp2$7nf@sulawesi.lerc.nasa.gov> <HUNTER.94Oct27114421@work.nlm.nih.gov>
	<fkilpatr.4.19EB2EB9@afit.af.mil> <39494l$oq@peaches.cs.utexas.edu>
Date: 01 Nov 1994 20:01:33 GMT
Lines: 54


Alex Kilpatrick (from Wright-Patterson Air Force Base) responds to my
statement (about lisp platforms): 

  I'm running Franz Allegro CL 4.2 on an SGI Indigo^2, with the 150MHz R4400
  CPU with 96M ram. [...]

with:

  96 Meg of RAM!!!???  Oh my.  That's outrageous.  I'm running Franz Allegro
  CL\PC under Windows NT, with 32 Meg of RAM, and I don't seem to ever swap.

Well, what are you doing with your LISP?  Some applications require more
memory than others.  I'm sure the folks using lisp-based planning systems
for transportation and logistics planning at WPAFB are running them on quite
large machines. 

Personally, I'm doing experiments where I create populations of dozens (soon
to be hundreds) of individuals, each embodying one of many different
algorithms, parameter settings, representations and so on, all of which
compete and communicate with each other.  [I call this coevolution learning,
and it is based on a model of interacting genetic and cultural evolution.]
In addition, each of these learning systems is trying to learn over quite
complex (and often large) biomedical datasets, which are represented in
multiple different ways.  I've done a lot of work to make this rather large
system both modular enough to maintain and extend, and efficient enough (of
both time and space) to run acceptably on the fairly generous hardware I'm
using.  As I said, moving from 64M to 96M of RAM made a huge difference.  As
the system continues to grow (encompassing more ml methods, larger
populations and harder problems), more RAM is the primary item on my wish
list -- a faster CPU will make much less of a difference.
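For the curious, the shape of such a system might look something like the
sketch below.  This is my own illustrative reconstruction, not code from the
actual system -- all the names and the generation loop are hypothetical:

```lisp
;; Hypothetical sketch of a coevolving population of learners.
;; Each learner carries its own ML method and parameter settings;
;; each generation they all train on shared data, then the fittest
;; survive and spread their parameters ("cultural" transmission).
(defclass learner ()
  ((method  :initarg :method :reader learner-method)   ; which ml algorithm
   (params  :initarg :params :accessor learner-params) ; its settings
   (fitness :initform 0      :accessor fitness)))

(defgeneric train (learner dataset)
  (:documentation "Fit this learner's method to DATASET, updating FITNESS."))

(defun run-generation (population dataset)
  ;; Train everyone, then rank by fitness; selection and parameter
  ;; copying between learners would hang off this ordering.
  (dolist (l population)
    (train l dataset))
  (sort (copy-list population) #'> :key #'fitness))
```

With hundreds of learners each holding trained models over large biomedical
datasets, it is easy to see why RAM, not CPU, is the binding constraint.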

ObC/C++Slam: Since I had to recently learn C++ in order to add one of the
learners to my system, I feel the need to complain about what a miserable
excuse for an object oriented programming language I think C++ is.  Even
compared to CL/CLOS, it lacks clean abstractions, multiple namespaces, the
ability to change class internals easily without affecting the rest of the
universe, and good, built-in memory management/gc, all of which are
absolutely necessary for building something like this with the resources I
have (lots of cycles and not a lot of human help). This is especially true
in the exploratory phase of design and implementation of a complex system,
which is, by definition, rapidly changing.
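To make the class-redefinition point concrete, here is a minimal sketch of
what CLOS gives you for free (the class and variable names are mine, chosen
for illustration): redefining a class updates instances that already exist,
so exploratory changes don't force you to rebuild the world.

```lisp
;; Define a class and make an instance of it.
(defclass learner ()
  ((score :initform 0 :accessor score)))

(defvar *pop* (list (make-instance 'learner)))

;; Later, mid-exploration, redefine the class with an extra slot.
;; CLOS updates existing instances to the new definition.
(defclass learner ()
  ((score   :initform 0   :accessor score)
   (history :initform nil :accessor history)))

;; The instance created under the OLD definition now has the new
;; slot, initialized from its :initform:
(history (first *pop*))  ; => NIL
```

In C++, the equivalent change means editing the header, recompiling every
translation unit that includes it, and discarding all live objects.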

Larry

--
Lawrence Hunter, PhD.
National Library of Medicine
Bldg. 38A, 9th floor
Bethesda, MD 20894 USA
tel: +1 (301) 496-9300
fax: +1 (301) 496-0673 
internet: hunter@nlm.nih.gov
encryption: RIPEM via server; PGP via "finger hunter@work.nlm.nih.gov"
