[Gambas-user] C Code character manipulation - alternatives

Doriano Blengino doriano.blengino at ...1909...
Sat May 23 09:53:13 CEST 2009


nando wrote:
> Technically, C doesn't 'put' garbage values into variables.
> Those values just happen to be there if you don't set them to something.
>
> Gambas wonderfully sets numeric variables to zero.
> You can think of it as NULL, I suppose, but ZERO is a numeric
> description dealing with numbers whereas NULL is usually referred
> to as something to do with characters,
> but NULL is the value ZERO... so it all means the same
> except how humans talk about it.
>   
Since we are busy analyzing all these wonderful things about languages 
and compilers, I want to add my contribution.

In several earlier messages on this list I found discussions about 
zero, null, initialization and so on. NUL (with only a single "L") is 
the first ASCII character, and has code 0 (in 8 bits). NULL (with two 
"L"s) is a C word which means a pointer that points to nothing. Very 
often the NULL pointer is represented with zeroes, and very often (but 
not always) pointers are more than 8 bits long. The fact that C lets 
you use the numeral "0" to indicate NULL is a C weakness, from my point 
of view. Pascal instead uses NIL and, on some (old) architectures, both 
the compiler and the runtime were able to detect uninitialized pointers 
this way. There have been CPUs whose registers carried an additional 
bit to indicate whether the register had been correctly loaded or not. 
So NIL was not "0", but a special value meaning "invalid", much like 
the special floating-point value "NaN" (not a number), which is not 
zero, not infinite, nor any other number, but "an invalid number". 
Dividing 0 by 0 gives a NaN. And who says that a pointer of value 0 is 
an invalid pointer? It depends on the CPU; for example, the glorious 
8051 had memory mapped at address 0, so NULL was a legal value for a 
pointer.
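
To make the distinction concrete, here is a small C sketch of my own 
(not taken from any previous message) that puts NUL, NULL and NaN side 
by side:

    #include <stdio.h>
    #include <stddef.h>   /* NULL */
    #include <math.h>     /* isnan() */

    int main(void)
    {
        char nul = '\0';        /* NUL: the ASCII character with code 0   */
        char *p  = NULL;        /* NULL: a pointer that points to nothing */
        double zero = 0.0;
        double x = zero / zero; /* 0 divided by 0 gives NaN               */

        printf("NUL as a number : %d\n", nul);        /* prints 0 */
        printf("p is NULL       : %d\n", p == NULL);  /* prints 1 */
        printf("x is NaN        : %d\n", isnan(x));   /* prints 1 */
        return 0;
    }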


>
>> Either I am wrong or you are:
>>
>>    DIM k, countVar AS Integer
>>    PRINT k, countVar
>>
>> The output of these lines is:
>> 0       0
>>
>> How come you say that both integers are initialized to NULL?
>>
>> This is a super feature of Gambas versus C, which puts garbage values into
>> variables of any type.
>>     

This is not a super feature of Gambas; it is simply a different 
behaviour that makes Gambas more comfortable than C. The reason C does 
not initialize local variables to 0 is speed. If you have 4096 
variables on the stack (an extreme example), Gambas will be about 4096 
times slower than C, because it has to set each of them to zero, while 
C creates as many variables as needed with a single CPU instruction. 
Pascal goes one step further: even its global variables are left 
uninitialized, while C sets them to 0. This is another speed gain (a 
very small one, though).
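
A minimal C sketch of what I mean (the names are only illustrative): 
automatic (stack) variables get whatever happens to be in memory, while 
variables with static storage are guaranteed to start at zero:

    #include <stdio.h>

    int global_counter;     /* static storage: the C standard zero-initializes it */

    int main(void)
    {
        int local_counter;  /* automatic (stack) storage: left uninitialized;
                               reading it before assigning is undefined behaviour */

        printf("global_counter = %d\n", global_counter);  /* always 0 */

        local_counter = 0;  /* you pay for initialization only when you ask for it */
        printf("local_counter  = %d\n", local_counter);   /* 0, because we set it */
        return 0;
    }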

Neither Gambas nor C/Pascal is the best language; each one has its own 
features and behaviour. But (I am referring now to another message) 
Borland compilers are very mature, and they give you messages, hints, 
warnings and errors. Gambas is much younger and differently targeted; 
in 90% of cases, who cares if a variable is declared and never used? 
Are we low on memory? Much more important, for example, is the 
automatic memory management of Gambas, which is lacking in both 
traditional C and Pascal. But if I were writing a real-time operating 
system for embedded devices, then I would choose Pascal, because with 
it I can control every single byte of my scarce memory and I know how 
long a statement takes to execute. But I would also have to initialize 
everything and adhere to its strict syntax. For a Linux desktop and an 
average graphical application, Gambas is perfect.
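
To give an idea of what "automatic memory management" spares you, this 
is the kind of bookkeeping C demands by hand (a generic sketch of my 
own, nothing Gambas-specific):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *name = malloc(32);      /* in C the programmer allocates...       */
        if (name == NULL)
            return 1;                 /* ...must check that the allocation worked... */

        strcpy(name, "Gambas");
        printf("hello, %s\n", name);

        free(name);                   /* ...and must remember to release it.
                                         Gambas does all of this for you.       */
        return 0;
    }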

In a message some days ago, KhurramM proposed a single Gambas package 
for Linux, and someone else replied that a source distribution is the 
more practical one. It is true - sadly. I think this situation is bad - 
remember, Unix means "unique, one for all". As long as the architecture 
does not change, it would be very practical to have binary packages for 
all the distributions - instead, Linux on the PC is a mess. I used to 
compile my kernel every time, on every new machine. Then I discovered 
that there was nothing to gain for my average desktop machine - it made 
no difference whether I compiled my customized kernel or used the full, 
bloated one that came with the distribution. If it works for the 
kernel, it could work for any other application. But every distribution 
creator thinks he is doing better than the others, and the more 
different he makes things, the better. Simply wrong. The author of that 
message then spoke about Windows 98, XP and 2000. Well, they are 
different operating systems. But 99% of the applications I developed 
with Delphi run smoothly on every Windows machine I tried; the 
remaining 1% had secondary quirks that were easily solved. I think this 
should be the target of the Unix/Linux world. This is freedom, without 
having to mess around with makefiles and configure scripts that get 
bigger than the original source itself. I suspect that Benoit spends a 
lot of time adapting the sources to all the different distributions, 
which all share the same kernel and the same libraries, and all have a 
packaging system that keeps track of dependencies...

Best regards,

-- 
Doriano Blengino

"Listen twice before you speak.
This is why we have two ears, but only one mouth."




