[Gambas-user] fastest way to "zero-out" a huge binary file

Doriano Blengino doriano.blengino at ...1909...
Sun Jan 24 10:04:49 CET 2010


kevinfishburne wrote:
> Kadaitcha Man wrote:
>   
>> Have you tried using a Long instead of an Integer?
>>
>> <runs and hides>
>>
>> :)
>>
>>     
>
> Haha, that gave me a good laugh. When I discovered the "bug" was just me
> being a jackass I practically did the Snoopy dance I was so overjoyed.
> Simple problems are always the best once you finally figure out they were
> simple. Now if I can just get rid of those annoying extra "string length"
> bytes it keeps writing...
>   
Read the documentation for the WRITE instruction carefully; the
explanation is in its first few lines. When you write something out,
you can specify the length of the data you are writing. If you don't
specify it, Gambas writes the length out for you as a prefix to the
data, and that is exactly what you don't want.
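
Something like this (just a rough sketch, in Gambas 2 syntax, with an
invented file name) shows the difference:

  DIM hFile AS File
  DIM sData AS String

  sData = String$(4096, Chr$(0))     ' a 4 KiB block of zero bytes

  OPEN "/tmp/huge.bin" FOR WRITE CREATE AS #hFile
  WRITE #hFile, sData                ' serialized: Gambas prefixes the string length
  WRITE #hFile, sData, Len(sData)    ' raw: exactly Len(sData) bytes, nothing extra
  CLOSE #hFile

The second WRITE is the one you want: it puts exactly Len(sData) bytes
into the file and nothing else.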

I don't know which algorithm you want to run over so much data, but if
you look at it closely you may find a shortcut that minimizes reads and
writes to the file. You could read the data in chunks, keep intermediate
results in RAM, and so on. If you try to randomly read 65537*65537 small
pieces of data, your program will take forever to run, because the I/O
overhead is paid on every two bytes. By reading 4 bytes at a time instead
of two you roughly double the speed, and so on. If the computation is
heavy, there will be a point of best compromise between the cost of I/O
and the cost of computation. If the computation is very, very heavy, then
the I/O time becomes negligible.
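
As a rough sketch of the chunked approach (buffer size and file name
are only examples):

  DIM hFile AS File
  DIM sBuffer AS String
  DIM iChunk AS Integer
  DIM lRemain AS Long

  iChunk = 65536                     ' 64 KiB per read instead of 2 bytes

  OPEN "/tmp/huge.bin" FOR READ AS #hFile
  lRemain = Lof(hFile)
  WHILE lRemain > 0
    READ #hFile, sBuffer, Min(iChunk, lRemain)   ' one I/O call per block
    ' ... process the bytes held in sBuffer, entirely in RAM ...
    lRemain = lRemain - Len(sBuffer)
  WEND
  CLOSE #hFile

Each READ costs one trip to the file, and the per-call overhead is
spread over 65536 bytes instead of two.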

Another nice option would be to compress the data on its way to and from
disk, if the data can be processed in blocks. You might reduce it to
1/10 of its size, which would be a big improvement.

Regards,

-- 
Doriano Blengino

"Listen twice before you speak.
This is why we have two ears, but only one mouth."
