Upload and Download speed adjustment

Giuliano Chianelli giuliano at rogers.com
Sun Nov 21 11:28:00 EST 2004


Running a UT2004 server on my Linux box (PCLinuxOS, kernel 2.4.22).

Does anyone know what to change to increase the upload/download speed for 
maps and textures when clients log on? I'm running a VCTF-style server with 
custom maps and vehicles.
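
For reference, the knobs I've been staring at so far are the per-client rate 
cap and the HTTP redirect in UT2004.ini -- I may well have the section or key 
names wrong, so please correct me, and the URL below is only a placeholder:

  [IpDrv.TcpNetDriver]
  AllowDownloads=True
  ; bytes/sec cap per connected client -- is raising this the right fix?
  MaxClientRate=10000

  [IpDrv.HTTPDownload]
  ; placeholder URL -- would point at a web server hosting the compressed packages
  RedirectToURL=http://example.com/ut2004/
  UseCompression=True

As I understand it, raising MaxClientRate only speeds up native in-game 
downloads, while the redirect hands the transfer off to a plain web server.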

Thanks in advance for all your help.


----- Original Message ----- 
From: "Jeff Woods" <klaatu at fnordco.com>
To: <ut2004 at icculus.org>
Sent: Sunday, November 21, 2004 1:16 AM
Subject: Re: [ut2004] 64 bit client problems with downloads


> Look! Up in the sky!
>
> It's a struct! It's an array!
>
> It's BACK-SEAT CODERMAN!
>
> More powerful than an overloaded function! Faster than an improved bubble 
> sort! Able to malloc tall heaps in a single bound!
>
> BACK-SEAT CODERMAN! Debugging your programs from afar... FOR GREAT 
> JUSTICE!
>
>
> On Sat, 20 Nov 2004 15:10:46 -0800
> Tom Emerson <osnut at pacbell.net> wrote:
>
>> On Friday 19 November 2004 10:09 pm, Ryan C. Gordon wrote:
>> > There's a lot more involved than just reading bytes from a socket [...]
>>
>> > [...] it's both inaccurate and impolite to show up and tell someone
>> > that their bugs aren't "rocket science". [...]
>>
>> Well, it wasn't my intention to be rude, but from what you're saying it
>> sounds like it is a problem of perception -- namely, mine :)
>>
>> I perceive a "percentage complete" calculation to be relatively simple,
>> and that debugging it would take no more effort than to set a breakpoint
>> on the final result > 100.0 and "take a look" at the underlying variables
>> at that point, then backtrack to find out why they are incorrect.  Either
>> the calculation is incorrect or the values passed to the routine are
>> bogus.  I find it hard to imagine how the calculation would be "wrong",
>> so that leaves incorrect inputs [the "GIGO" principle].
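
(Purely as an illustration of the trap-it-where-it-fails check described
above -- hypothetical names, not the actual UT2004 code:)

#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Sketch of the "breakpoint on final result > 100.0" idea: if either input
 * arrives bogus, an assert fires and a debugger can inspect bytes_done and
 * bytes_total before anything else happens to them. */
static double percent_complete(uint64_t bytes_done, uint64_t bytes_total)
{
    assert(bytes_total > 0);                       /* GIGO check on the inputs */
    double pct = 100.0 * (double)bytes_done / (double)bytes_total;
    assert(pct <= 100.0 + 1e-9);                   /* break here and backtrack */
    return pct;
}

int main(void)
{
    printf("%.1f%%\n", percent_complete(512, 1024));   /* prints 50.0% */
    return 0;
}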
>>
>> What I gather you're saying is that figuring out why the numbers passed
>> to the routine are "wrong" is a bit harder than merely inspecting memory
>> -- this is just a guess, but perhaps you are passing a 32-bit pointer to
>> a routine that internally uses a 64-bit pointer and "whatever" resides in
>> memory just prior to the pointer doesn't cause the treated-as-64-bit
>> pointer to point outside of allocated memory.  [if it did, well, you'd
>> have caught that a long time ago I'd imagine ;) ]  Of course, the fact
>> that the routine is using a 64-bit pointer could be totally unexpected
>> because when compiled for 32 bits, it would be using a 32-bit pointer "as
>> expected", and the program wouldn't exhibit the incorrect behavior.
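
(Again just a guess-along sketch, not engine code: one classic way the inputs
go bogus only on the 64-bit build is a field or argument whose width depends
on the platform, e.g. 'long' being 4 bytes on the 32-bit build and 8 bytes on
LP64, so only one build misreads the data.)

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical progress record as a 32-bit build would lay it out: two
 * 4-byte values.  A 64-bit build declaring the same fields as 'long' would
 * expect 16 bytes here and pull garbage into total_bytes.  Fixed-width
 * types keep the layout identical on every platform. */
struct progress_fixed {
    uint32_t done_bytes;
    uint32_t total_bytes;
};

int main(void)
{
    /* Eight bytes as written by the (hypothetical) 32-bit sender:
     * done = 512, total = 1024, little-endian. */
    unsigned char wire[8] = { 0, 2, 0, 0,  0, 4, 0, 0 };

    struct progress_fixed p;
    memcpy(&p, wire, sizeof p);
    printf("%.1f%%\n", 100.0 * p.done_bytes / p.total_bytes);  /* 50.0% */
    return 0;
}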
>>
>> The other point to this thread is "why does this bug exist in the first
>> place?"  An accountant once told me "if you're off by a penny, you might
>> as well be off by a million dollars" -- the fact that the program reaches
>> 100% AND THEN STARTS OVER should have been a "red flag" for the whole
>> download process -- something is causing the program to retry the
>> download EVERY TIME, and it may be as simple as forgetting to "close" the
>> file initially downloaded, so it "never" validates properly from the
>> redirected host/server (yeah, I'm guessing again :) but so long as I'm
>> guessing, if it is indeed related to whether or not the file is "closed"
>> properly, perhaps there is a race condition where you [or the "virtual
>> machine"] believe the file to be closed, but the "real" machine hasn't
>> done that yet...)
>> 
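
(One last sketch along the lines of the "file never really closed" guess --
again hypothetical, not the actual downloader: validate the file only after
it has been flushed and closed, otherwise a perfectly good download can look
truncated and trigger a retry every time.)

#include <stdio.h>

/* Finish the download completely, then measure what actually hit disk.
 * Checking size or checksum while the stream is still buffered, or before
 * it is really closed, is where that kind of race would live. */
static long finished_size(FILE *f, const char *path)
{
    if (fflush(f) != 0 || fclose(f) != 0)
        return -1;                          /* the write never fully landed */

    FILE *check = fopen(path, "rb");        /* reopen to see the real file */
    if (!check)
        return -1;
    fseek(check, 0, SEEK_END);
    long size = ftell(check);
    fclose(check);
    return size;
}

int main(void)
{
    const char *path = "download.tmp";      /* hypothetical temp file */
    FILE *f = fopen(path, "wb");
    if (!f)
        return 1;
    fputs("payload", f);
    printf("size on disk: %ld bytes\n", finished_size(f, path));
    remove(path);
    return 0;
}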




