95% of the time, I couldn't care less what a variable's type is, which is a large part of the reason I despise Hungarian notation (*twitch*, *twitch*, *bleh*, I feel dirty just saying it).
I don't know, maybe it's just something that comes with programming for 30 years (20 of which in C or C++), in a variety of fields (2d and 3d graphics, 2d and 3d physics, compilers, networking). After a while, beyond telling the compiler how the data is to be handled (e.g., int vs float), type becomes meaningless. I don't think in terms of pointers and structs; I think in terms of "I've got this thing here, with these named features". Might be part of why I always get my . and -> mixed up (another part is python and qfcc, where I didn't bother with ->, though I might add it as an option).
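The sort of thing that trips me up, just as a quick sketch (not from any real project of mine):

```c
struct point { int x, y; };

struct point  p  = {1, 2};
struct point *pp = &p;

int a = p.x;    /* direct member access uses .                      */
int b = pp->y;  /* access through a pointer uses -> (same as (*pp).y) */
```

Semantically it's the same "thing with named features" either way; the compiler just insists I spell out whether there's a pointer in between.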
Having to think about type ('\0' for char, 0 for int, 0.0 for float, NULL for pointers) would only get in the way (ouch, my head hurt just filling in the examples), just like having to get . and -> right does (and I read somewhere that either Kernighan or Ritchie, I don't remember which, regrets ->). I am very grateful that C always treats 0 of any type as false and non-zero of any type as true, and that a bare 0 works for any non-aggregate type. The compiler gets out of the way and lets the programmer concentrate on the problem the code is to solve rather than futzing with trivial details.
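To show what I mean by a bare 0 working everywhere, a little throwaway sketch (just illustration, nothing authoritative):

```c
int    i = 0;   /* bare 0 initializes any scalar ("non-aggregate") type */
float  f = 0;
char   c = 0;
int   *p = 0;   /* 0 is a null pointer constant; same thing NULL expands to */

/* in a condition, 0 of any scalar type is false, non-zero is true */
if (!i && !f && !c && !p)
    ;  /* all four tests succeed here */
```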
Actually, I have a good argument against using '\0': it can be visually confused with '0', or worse, '10' (sure, gcc will warn about that one, but...). I think I read that somewhere in some style guide. Oh, and a second one: what is '\013'?
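For anyone playing along, a quick check of what those character constants actually evaluate to (sketch only; the 48 assumes ASCII):

```c
#include <stdio.h>

int main(void)
{
    printf("%d\n", '\0');   /* 0  - the null character            */
    printf("%d\n", '0');    /* 48 - the digit zero (ASCII)        */
    printf("%d\n", '\013'); /* 11 - octal 13, i.e. vertical tab   */
    /* '10' is a multi-character constant; gcc warns about it      */
    return 0;
}
```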
