Received: from FRIGGA.CLAREMONT.EDU by karazm.math.UH.EDU with SMTP id AA23795
          (5.65c/IDA-1.4.4 for ); Fri, 11 Oct 1991 16:31:25 -0500
Received: from HMCVAX.CLAREMONT.EDU by HMCVAX.CLAREMONT.EDU (PMDF #11000)
          id <01GBMFNDMGIO9S3RO4@HMCVAX.CLAREMONT.EDU>; Fri, 11 Oct 1991 14:24 PDT
Date: Fri, 11 Oct 1991 14:24 PDT
From: MATT CARPENTER
Subject: Re: LPC!
To: glove-list@karazm.math.uh.edu
Message-Id: <01GBMFNDMGIO9S3RO4@HMCVAX.CLAREMONT.EDU>
X-Vms-To: IN%"glove-list@karazm.math.uh.edu"

In message <9110111902.AA15568@am.ucsc.edu> beeman@cats.UCSC.EDU writes:
>Sleep deprivation produces miracles! Imagine being able to program objects
>in LPC on an LP-Mud with datagloves in mind! Of course, you would need to
>add graphics routines to the language, and you would need to change all the
>text output to graphic or audio output, and the resulting project would no
>doubt either swell up into several megs of code before there could even be
>two functional rooms...

Actually, I've been thinking about something like this for a while. If the
user connects to the MUD from a personal computer, the MUD wouldn't have to
do much more than keep track of where the objects are, which is pretty much
what MUDs already do anyway, although it would also need to track
coordinates. All the graphics computation could be done by the user's
computer. For instance, the MUD would just send something like "Object #1
has moved to x,y,z", and the user's computer, which already holds the
description data for object #1, would generate the scene. Likewise, the
user's computer would translate the actions of the user (like moving a
Power Glove around) into messages in a similar form and send them back to
the MUD. (A rough sketch of what that traffic might look like follows my
signature.) Of course this would all be rather crude, but it would be
interesting to experiment with.

>Let me know if you are crazy enough to try this.
>
>Adam

Sure, I'm crazy enough. Anybody else interested, or have suggestions?

Matt
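
P.S. Here's a minimal sketch in C of the message traffic I have in mind.
The "OBJ <id> MOVE <x> <y> <z>" wire format and the function names are just
made up for illustration; all I actually described above is the informal
example "Object #1 has moved to x,y,z".

    #include <stdio.h>

    /* Server side: announce that an object moved.  The MUD tracks only
       object ids and coordinates, and sends this line to every connected
       client; it never does any graphics work itself. */
    static void send_move(FILE *out, int obj_id, double x, double y, double z)
    {
        fprintf(out, "OBJ %d MOVE %.2f %.2f %.2f\n", obj_id, x, y, z);
    }

    /* Client side: parse an update line and hand the coordinates to the
       local renderer, which already holds the description data for each
       object.  Returns nonzero if the line was a well-formed move. */
    static int parse_move(const char *line, int *obj_id,
                          double *x, double *y, double *z)
    {
        return sscanf(line, "OBJ %d MOVE %lf %lf %lf", obj_id, x, y, z) == 4;
    }

    int main(void)
    {
        const char *sample = "OBJ 1 MOVE 2.00 3.50 -1.00";
        int id;
        double x, y, z;

        /* What the MUD would emit (here written to stdout for demo;
           a real server would write to each client's connection). */
        send_move(stdout, 1, 2.0, 3.5, -1.0);

        /* What the user's computer would do on receiving such a line. */
        if (parse_move(sample, &id, &x, &y, &z))
            printf("redraw object #%d at (%.2f, %.2f, %.2f)\n", id, x, y, z);
        return 0;
    }

Glove input would go the other way over the same connection: the client
turns glove motion into a line like "OBJ 1 MOVE ..." (or "GRAB 1", etc.)
and the MUD just updates its coordinate tables and rebroadcasts.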