The recent buzz seems to be yet another flame war about young devs versus old devs. To add a little more noise to the Web, here are my two cents.
It's quite funny to read about young prodigies without remembering that they represent only a tiny portion of the developer population, while conveniently avoiding to mention counter-examples of senior geniuses such as: Ken Thompson developing UTF-8 when he was almost 50 years old and designing Go when he was around 60; Dennis Ritchie involved in the development of the Inferno operating system in his 50s; Brian Kernighan designing Awk when he was 46; Paul Bourke publishing several papers per year about computer vision and image analysis since the 1980s.
Outside those few exceptional minds, who clearly populate all age ranges, the 99.9999% of programmers falling into the "commonly dumb" category (according to my pub-talk data survey agency) will pick their champion based on age proximity with great ease. Unfortunately for the youngest ones, there is at least one study which doesn't seem to be in their favor. A dataset with one single point is worth what it is, so I'll avoid joining the confirmation bias herd and leave the search for other serious studies to the reader as homework.
I must nonetheless share the personal opinion I promised. My most cherished language is 50 years old and I'm nearer to retirement than to graduation, which clearly puts me in the dinosaur cluster. So some may be surprised to read that since graduation I have been learning, I am still learning, and I intend to keep learning all along my career. It has been, for example: on-the-job training about Oracle databases and rocket physics while working on the Ariane 5 space launcher at the beginning of my career; self-study of web programming later on; or, more recently, Python (cursed may it be, I swear I was forced), and control systems for various industrial cameras, lights, and actuators. Currently I'm reading "C Interfaces and Implementations" by David Hanson, and I regularly watch videos from channels such as the GOTO conferences, Jacob Sorber, and Coding Tech. In the near future I'd like to add yet another string to my bow with another programming language. All along, the examples and advice from my elders have been as precious as the pressure from my juniors to stay up to date. For this I'm grateful to all of them.
Computer science is fascinating and I'm glad to have chosen a career in that field. If I could, I would study everything about it; unfortunately it's such a wide field that this is absolutely impossible. Choices must be made about what to study and what to ignore. It's even worse given the tremendous pace at which new technologies appear in our field. We have no choice but to be honest and admit our ignorance about almost everything. Except for those who call themselves programmers after "learning" a new language from a 30-minute YouTube video, and for project managers who are knee-deep in the shit because they've accepted a project no one in the company has the skills for, anyone with a minimum of honesty should agree that it takes time to become an expert.
Students, or those at the beginning of their career, should in my view learn the basics of as many technologies as possible: to make up their mind about what exists in the field, to discover what they love and want to do, and to open as many doors as possible when looking for their first job. But from mid-career on, choosing a very small skillset and cultivating it over the years until one can call oneself an expert, while slowly acquiring a basic level in a few other skills, becomes the right strategy in my personal opinion. Which one to choose is a matter of personal preference and there is definitely no right or wrong choice. There is a need for all of them anyway; whichever one you choose will be labeled "has-been" sooner or later, and changing focus every two weeks will doom you to being an eternal beginner. Which brings me to my last thoughts.
I think there is also some dirty truth about some of those who embrace and promote change at high frequency. When a new technology appears, everybody is a complete beginner. For someone with no competence, jumping onto a brand-new technology about which no one else has had time to build experience is a very convenient and easy way to get artificially on par with others. The trick won't last long, as others will eventually build up the new skills, but that's fine: at the pace at which new languages, frameworks, and the like appear, there will soon enough be another brand-new bandwagon to jump on. This behaviour is not restricted to our field, and I'm certain this psychological trait has a name, but I'm unable to find it. The attractive power of novelty (1, 2) should certainly also be taken into account when evaluating young vs. old devs and new vs. old technologies, in particular its abuse for marketing or social recognition purposes.
In conclusion, I find the arguments for the superiority of young devs over old devs rather dubious. Most devs are generally incompetent and ignorant regardless of age, except for some rare individuals. However, the advantage of old devs' experience shouldn't be underestimated, nor their will to keep learning; and they are the only ones who have had enough time to become expert at something, if anyone has. Conversely, the possible influence of cognitive biases associating quality with young devs calls for caution. Sound management should probably not put too much weight on the age criterion and should instead focus on building a diverse team, where the strengths of each individual cancel the weaknesses of the others and create a positive synergy profitable to everyone.