To be honest, I’ve never had much of a real opinion on this issue in the past, outside of thinking of it from a technical perspective, so this post provided a good opportunity to actually discuss it. If I come off as opinionated about the issue, I don’t mean to; this is the most in-depth I’ve ever thought about it, and my thoughts are coming on the fly.
Anyway, on to the show…
If you think about it, we are really just users ourselves, and ColdFusion is the “program” we use to accomplish our daily work.
Agreed, the distinction I was really trying to make was between what to expect from someone with knowledge of the inner workings of the computer and the mechanisms it executes vs someone who doesn’t. A doctor vs. patient kind of thing…
I think that’s where different philosophies polarize.
I don’t think it’s a question of “At the end of the day, have you truly been more productive if all the capitalization in your code matches?” I think it’s more of a question of “At the end of the day, does your code run correctly and reliably on all intended platforms?” The cosmetic benefit of proper case matching isn’t important to me. What is important to me is the understanding and practice of habits that are critical to application compatibility and can cause huge problems when implemented “incorrectly”.
Also, I think we may have a bit of a divergence in the direction of our respective debates. On the one hand, we can discuss things like the variables in our code matching case, the names of application files, functions and methods being consistent, etc, and on the other hand we have to worry about things like calls FROM our chosen programming language to the operating system, which may not adhere to the case-sensitivity profile of the language making the call. Even if the programming language doesn’t get confused, the system it runs on still can if the language provides data that references a file name that doesn’t exist because of case sensitivity. (Such as a user uploads MyPhoto.jpg and your application tells Linux to get myphoto.jpg when retrieving the photo for display via a case-sensitive web server, or when retrieving information about the photo such as meta information or size, or creating a thumbnail, etc.)
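To make the MyPhoto.jpg scenario concrete, here’s a small Python sketch of one defensive approach: instead of handing the OS a name the user typed, look up the real on-disk name by comparing directory entries ignoring case. (The helper name `resolve_case_insensitive` is my own invention for illustration, not a standard API.)

```python
import os
import tempfile

def resolve_case_insensitive(directory, name):
    """Return the actual on-disk entry matching `name` ignoring case, or None."""
    wanted = name.lower()
    for entry in os.listdir(directory):
        if entry.lower() == wanted:
            return entry
    return None

# Demo: the upload saved "MyPhoto.jpg", but the application asks for "myphoto.jpg".
upload_dir = tempfile.mkdtemp()
open(os.path.join(upload_dir, "MyPhoto.jpg"), "w").close()

print(resolve_case_insensitive(upload_dir, "myphoto.jpg"))  # prints: MyPhoto.jpg
```

On Linux a plain `open("myphoto.jpg")` would fail outright; resolving the name first lets the application behave the same on case-sensitive and case-insensitive filesystems.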
The other side of the coin is user-provided data. If a user uploads a file through a form with the name Photo.jpg and then uploads another with the name photo.jpg, and our application doesn’t make that distinction, then we could run into issues. If we allow the user to provide input that determines something like a file name or unique key value that could be typed differently when it is time to retrieve it, then we as the doctors need to make sure that our patients don’t make mistakes that endanger their experience.
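One way to handle the Photo.jpg / photo.jpg collision is to pick stored names that are unique ignoring case. A minimal Python sketch, assuming we keep a list of names already stored (the `unique_name` helper is hypothetical, not from any framework):

```python
def unique_name(existing, name):
    """Return `name`, or a numbered variant, so it never collides
    case-insensitively with any name already in `existing`."""
    taken = {n.lower() for n in existing}
    if name.lower() not in taken:
        return name
    # Split off the extension so "photo.jpg" becomes "photo_1.jpg", not "photo.jpg_1".
    stem, dot, ext = name.rpartition(".")
    if not dot:
        stem, ext = name, ""
    i = 1
    while f"{stem}_{i}{'.' + ext if ext else ''}".lower() in taken:
        i += 1
    return f"{stem}_{i}{'.' + ext if ext else ''}"

print(unique_name(["Photo.jpg"], "photo.jpg"))  # prints: photo_1.jpg
```

Whether you rename, reject, or overwrite is a policy choice; the point is that the application, not the filesystem, should decide what “the same name” means.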
So, I think this is a two-headed discussion. One about the internal code and conventions of an application and one about the conventions surrounding files, objects, variables, etc that are generated by or through the application. Application code vs. application data. Some data is handled or executed entirely within the scope of your application, some is not.
Yes, it is-- and shouldn’t it be? I mean, your server doesn’t need CFML to render a webpage. All it needs is its CPU and an instruction set. You need CFML. It exists to make your job easier, not to simplify your computer’s job.
I 100% agree that the language should make my job easier. I think that expressive programming is important, and a move toward more natural language in programming is the future, but I meant this in a more literal way. If I say var i = x + 1; using the valid + operator, I don’t expect my language to also know that I mean x + 1 when I say i = x plus 1; or i = x add one;
English doesn’t honor every Chinese character, phoneme, grammar and context; I don’t see why a computer language should do so either. There are a few ways to say “Hello” in English (Hello, Hi, Howdy, Greetings, etc) but we don’t adjust our entire mental process to recognize “Hello” in all forms in every language. Our language has rules, certain characters mean certain things, and that’s how we recognize them and can properly parse out information and meaning. Computer languages do the same thing and are just as specific with syntax and semantics; no matter how robust they are, they don’t encompass all scenarios. There is some restriction to it. They must abide by the processor’s rules, and it works on a purely numerical level, where an “a” is not an “A”.
plus isn’t +, one isn’t 1, and the computer doesn’t see those values the same when it comes to the instruction set. Writing something like a binary search will illustrate this problem pretty well. Of course, that’s a specific case, and I think I’m just being technical more than anything here, but I think that ultra-high-level languages have the potential to breed programmers who don’t really have to understand what the computer is actually doing underneath the high-level language. This is much like training a bunch of doctors who don’t know what the medicine they are prescribing really does, but instead are just trained to prescribe it when they see certain symptoms. Or like having an auto mechanic who replaces an entire malfunctioning part or system when all they need to do is replace a missing screw or close an open valve.
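The binary search point is easy to demonstrate in a few lines of Python: to the machine, “A” and “a” are just different numbers, so a binary search over raw strings silently misses entries in a list that a human considers sorted. A quick sketch:

```python
import bisect

# To the hardware, upper- and lowercase are simply different numbers.
print(ord("a"), ord("A"))  # prints: 97 65

# Sorted the way a person would read it (case-insensitively)...
names = sorted(["delta", "Echo", "alpha", "Bravo"], key=str.lower)
# names == ['alpha', 'Bravo', 'delta', 'Echo']

def contains(sorted_list, item):
    """Standard binary search; assumes the list is sorted by raw comparison."""
    i = bisect.bisect_left(sorted_list, item)
    return i < len(sorted_list) and sorted_list[i] == item

# ...but binary search compares raw character codes, where 'B' (66) < 'a' (97),
# so it looks in the wrong half of the list and reports "Bravo" as missing.
print(contains(names, "Bravo"))  # prints: False
```

Re-sorting by raw comparison (uppercase entries first) makes the search work again, which is exactly the kind of underlying detail a high-level language can hide until it bites.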
The holy grail of programming may be to create a language that can figure out what we mean, even when we don’t say it quite correctly, but until then, we have to make sure that we do say it correctly enough for both the compiler/interpreter and the system(s) the application will end up running on to make sense of it. This is the primary reason I have for supporting my view. I think people have the same kind of difference in opinion over things like unit testing as they have over case sensitivity, so I know this is all just my personal opinion; I don’t think this is objective fact.
Thanks for the discussion too! 
Personally, I don’t think that the internal variable names inside a compiled or interpreted program should have to be case sensitive, especially variables local to the application itself, and I love that I can get away with that for the most part in ColdFusion. I do think that anything accessible by the end user needs to be carefully controlled by the application per the specifics of the systems the application intends to run on.
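For languages that aren’t as forgiving as ColdFusion (whose struct keys and variable names are case-insensitive), the same behavior can be approximated explicitly. A minimal sketch in Python, folding keys to lowercase on every access (a toy `CaseInsensitiveDict`, not a production implementation):

```python
class CaseInsensitiveDict(dict):
    """Toy mapping that treats 'UserName', 'username', and 'USERNAME'
    as the same key, roughly like a CFML struct."""

    def __setitem__(self, key, value):
        super().__setitem__(key.lower(), value)

    def __getitem__(self, key):
        return super().__getitem__(key.lower())

    def __contains__(self, key):
        return super().__contains__(key.lower())

settings = CaseInsensitiveDict()
settings["UserName"] = "bob"
print(settings["username"])  # prints: bob
```

That keeps the leniency inside the application, while anything that crosses the boundary to the OS or to user data can still be normalized deliberately.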