Subject: Re: [MUD-Dev] Re: MUD Development Digest
From: "Chris Gray" <cg@ami-cg.GraySage.Edmonton.AB.CA>
Newsgroups: nu.kanga.list.mud-dev
Date: Sat, 4 Apr 98 12:16:27 MST
Organization: Kanga.Nu

:I presume the comments about disk-based muds running faster than 
:memory-based ones are including the tacit assumption that one is talking 
:about muds that don't have the option of running on a machine with a 
:large surplus of RAM?

It depends on what you want to do. If your servers are only at large
central sites, then requiring a Gig of RAM for the server to hold the
whole of a very large world in memory can make sense. For hobby projects,
however, the servers run on many different machines, so that may not be
an option. I think many MUD writers dream of having a very large world,
but don't actually have one. So, they go disk-based to keep the dream
alive. That's me, anyway!

:                                                 I also don't do a bunch 
:of dynamic stuff - I prefer to do all mallocs and loading of maps and 
:objects at startup, and keep it there.  I always believe malloc and free 
:are the devil's tools for fragmenting heaps and bloating memory usage, 
:if used during runtime rather than only at startup and shutdown.

Wow! You're almost more reactionary than me. My hobby programming started
with CP/M systems, where trying to use malloc/free was kind of pointless.
So, my compiler (a variant of which is still used to compile my MUD) on
CP/M did no dynamic allocation, and neither did an Ultima-style system
that I wrote. My current MUD does lots of it, but buffers itself from
the system routines with a layer of its own. This allows it to be given
a "maximum memory goal", which it will try to keep itself within, by
removing things from various caches (e.g. the MUD-code cache) if needed.
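In rough C terms, the idea is something like the sketch below. This is
only an illustration - the names (mudAlloc, mudFree, trimCaches,
MEMORY_GOAL) are invented here, it glosses over alignment details, and
it isn't the actual server code.

    /* Simplified sketch of a memory layer with a "maximum memory goal". */

    #include <stdlib.h>

    #define MEMORY_GOAL (4UL * 1024UL * 1024UL)   /* e.g. 4 Meg */

    static unsigned long totalInUse = 0;

    /* Called when an allocation would push us over the goal. In a real
       server this would walk the various caches (the MUD-code cache,
       etc.), discarding entries until 'needed' bytes have been given
       back. Here it is just a placeholder. */
    static void trimCaches(unsigned long needed)
    {
        (void)needed;
    }

    /* Allocate through the layer, tracking total usage. The block is
       prefixed with its size so mudFree can account for it later.
       (Alignment of the returned pointer is glossed over here.) */
    void *mudAlloc(unsigned long size)
    {
        void *p;

        if (totalInUse + size > MEMORY_GOAL)
            trimCaches(totalInUse + size - MEMORY_GOAL);
        p = malloc(size + sizeof(unsigned long));
        if (p == NULL)
            return NULL;
        *(unsigned long *)p = size;          /* remember the size */
        totalInUse += size;
        return (unsigned long *)p + 1;
    }

    /* Free through the layer, giving the bytes back to the running total. */
    void mudFree(void *p)
    {
        unsigned long *base = (unsigned long *)p - 1;

        totalInUse -= *base;
        free(base);
    }

The point of routing everything through such a layer is that the server,
not the C library, decides when memory pressure means throwing cached
things away.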

:Anyway, my server can currently handle over 150 people quite well with a 
:memory footprint of under 32 megabytes.

Well, I should soon be getting a hot new machine, and it should be able to
run that number of users. I doubt it will use more than that amount of
memory - likely a lot less. On Amigas, I've seen my usage go up to about
3 Meg for a half-dozen users, but the increment per user is very small
(really just a client structure, plus a shared copy of any agent program
and whatever space it uses - roughly as sketched below). I still have a
disk-based system because I dream of having a huge world, and because I
really, really like the whole notion of a persistent programming language
driving a 100% persistent world. (Absolutely no information visible to a
player is lost on a proper reboot.)
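For what it's worth, the sketch below shows roughly what I mean by the
per-user increment; the names (Client, AgentProgram, newClient) are
invented for illustration, and these aren't the real structures.

    /* Sketch only: a small per-client structure pointing at a shared,
       reference-counted copy of an agent program. */

    #include <stdlib.h>
    #include <string.h>

    typedef struct AgentProgram {
        char *code;              /* compiled agent code */
        unsigned long codeLen;
        int refCount;            /* how many clients share this copy */
    } AgentProgram;

    typedef struct Client {
        int socket;              /* connection to the player */
        char name[32];
        AgentProgram *agent;     /* shared, not duplicated per client */
        /* per-client agent workspace would go here */
    } Client;

    /* Attach a new client to an already-loaded agent program. Only the
       small Client structure is new memory; the program is shared. */
    Client *newClient(int socket, const char *name, AgentProgram *agent)
    {
        Client *cl = malloc(sizeof(Client));

        if (cl == NULL)
            return NULL;
        cl->socket = socket;
        strncpy(cl->name, name, sizeof(cl->name) - 1);
        cl->name[sizeof(cl->name) - 1] = '\0';
        cl->agent = agent;
        if (agent != NULL)
            agent->refCount += 1;
        return cl;
    }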

--
Chris Gray   cg@ami-cg.GraySage.Edmonton.AB.CA