It's hard to teach an introduction to programming without teaching a particular language. If you have to pick a particular language, it makes sense to choose something the student is likely to use in the future. This is not hard to understand.
That's silly. If you can reasonably claim to know how to program, the choice of first language largely doesn't matter. Someone who knows how to program in [first language] should be able to pick up any language they're likely to encounter in an industry position.
So the first language should be one that gives the student a firm foundation from which to become that competent.
I've personally seen excellent introductions to programming using C (CS50x), Python (Think Python, MIT 6.0001), and Scheme (SICP).
My point is that you likely learned programming by learning the basics of file I/O in C or Python or something similar. You didn't sit down and learn all about automata theory etc. first. You need the context of what CS is used for before those things make sense to you, so you start by learning a bit of programming. Since you must by definition have a first language, why not pick the one that is most likely to be used?
I don't really approve of this view. CS students are going to be programming in current-gen languages like Java, Python, C/C++, and JS in industry anyway (C is perhaps last-gen...). I think academia should be a couple of steps ahead and introduce students to things that will help them advance the industry: next-gen and maybe experimental/academic languages and tools. Teaching students current industry languages and tools is good for the individual student but bad for the industry as a whole because it causes stagnation. Universities are big enough that they ought to be able to rise above this tragedy of the commons and do something for the greater good rather than think narrowly of the individual student.