In response to a Slashdot article on ageism in the IT world, I wrote:
I’ve thought long and hard about this issue. The problem is not one of age; it’s one of experience and familiarity with the latest technology and buzzwords. In a college environment, students have the chance to play around with projects and to learn about their discipline full time. And if their degree plan is any good, they are probably learning concepts and languages that are not tied to any one vendor. Midcareer professionals, on the other hand, are basically too busy to learn. Sure, they pick up interesting things for the tasks specific to the job at hand, but that knowledge can be very domain-specific and not easily transferable to other companies. As for me, I am madly learning about new technologies, but my schedule and commitments limit me to part-time learning. When a part-time learner competes against a full-time learner, there is really no contest.
If you took a 40-year-old programmer, gave him two years to go to school full time (paid for by parents) with use of the university’s great network, there would be no difference between the older and younger candidate. Having full time to learn at college provides an undeniable boost to any learner’s marketability, old or young.
Another thought: HR and recruiters use the “must have 5 years experience” rule, where they won’t consider candidates with less than 5 years of experience. I actually wrote an essay about the must-have-5-years-experience fallacy, but this fallacy actually works in favor of older workers.