Crossfire Mailing List Archive

Re: ASCII vs binary - holy war...



  From:  "Carl Edman" <>
  Date:  Thu, 14 Apr 94 00:11:11 -0400

  > So, tell me, what's the point in using ASCII?
  
  I've given the long list a number of times.  Please re-read the  
  articles.

the CPU time the client spends parsing probably will be dwarfed by its
graphics display work -- but the server will be doing a lot of parsing too.
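
to make the comparison concrete, here's a rough sketch in C.  the command
names and layout are invented for illustration, not crossfire's actual
protocol.  the ascii form has to be scanned and converted field by field;
the binary form decodes with a couple of shifts.

    #include <stdio.h>

    /* hypothetical ascii form: "move 12345 3\n" */
    int parse_ascii_move(const char *buf, int *tag, int *dir)
    {
        /* scan and convert text for every field */
        return sscanf(buf, "move %d %d", tag, dir) == 2;
    }

    /* hypothetical binary form: 4 bytes, opcode + direction + 16-bit tag */
    int parse_binary_move(const unsigned char *buf, int *tag, int *dir)
    {
        if (buf[0] != 0x01)              /* 0x01 = made-up MOVE opcode */
            return 0;
        *dir = buf[1];
        *tag = (buf[2] << 8) | buf[3];   /* 16-bit tag, network byte order */
        return 1;
    }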

fixed-length blocks will be the most common, since most of them will be 
simple MOVE commands.
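
for what it's worth, reading fixed-length blocks looks roughly like this
(the 8-byte block size is an assumption, not a proposal): no delimiter
hunting, just pull the next block off the socket.

    #include <unistd.h>

    /* 8 bytes per command is an assumption for illustration */
    #define BLOCK_LEN 8

    /* read exactly one fixed-length command block from the socket;
     * no delimiter scanning, no resynchronization logic */
    int read_command(int fd, unsigned char block[BLOCK_LEN])
    {
        int got = 0;
        while (got < BLOCK_LEN) {
            int n = read(fd, block + got, BLOCK_LEN - got);
            if (n <= 0)
                return -1;   /* error or connection closed */
            got += n;
        }
        return 0;
    }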

the compressors won't work any better on ascii than on binary -- they'll see
the same patterns in both.  but with binary they have less data to paw over.

  All that I can add is that I actually went and wrote a  
  binary protocol for a game like crossfire and a client and a server for  
  it.  My aversion to binary protocols stems from that real world  
  experience.
  
and my appreciation of them also comes from real-world experience.

and i actually implemented the binary protocol for the X font server and
numerous X extensions, and i'm currently at work on the protocol for Low
Bandwidth X.  i've spent much of the last year worrying about how to
make network protocols work well over low speed/high latency links.

i think you're over-estimating the difficulty of writing a binary protocol
that doesn't run into padding or byte order problems.  the bugs you're
worried about will happen no matter what -- the choice of a network format
will have a minimal impact on the support problems of the overall product.
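
a sketch of what i mean, with an invented field layout: if you marshal the
fields a byte at a time (or through htons/htonl), the wire format never
depends on the host's struct padding or byte order.

    /* write a 16-bit value in network byte order */
    void put16(unsigned char *p, unsigned int v)
    {
        p[0] = (v >> 8) & 0xff;
        p[1] = v & 0xff;
    }

    /* write a 32-bit value in network byte order */
    void put32(unsigned char *p, unsigned long v)
    {
        p[0] = (v >> 24) & 0xff;
        p[1] = (v >> 16) & 0xff;
        p[2] = (v >> 8) & 0xff;
        p[3] = v & 0xff;
    }

    /* pack a hypothetical 8-byte request: opcode, pad, length, object id */
    void pack_request(unsigned char buf[8], int opcode,
                      unsigned int len, unsigned long id)
    {
        buf[0] = (unsigned char) opcode;
        buf[1] = 0;              /* explicit pad byte, same on every machine */
        put16(buf + 2, len);
        put32(buf + 4, id);
    }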


as i see it, the best argument you have for using ascii is that it's
easier to debug, and you are willing to accept slowdowns everywhere else,
assuming they'll be in the noise, for the small help in debugging that
ascii provides.

in the overall picture, the protocol format probably won't even matter.
but i hate to see a choice made on what i feel are bogus grounds.