Of late, there's this weird notion being propagated by "Human" Resources and Upper Management that money, in the form of salaries, bonuses, etc., is not important.
I think this started about two decades ago when we started hearing bushwa like "salaries are being frozen," or "there are no bonuses this year."
The trend accelerated when they started hiring interns at either no salary whatsoever or for minimum wage.
Almost every day you hear crap like people don't leave their jobs because of money, that other things, like compatibility with one's boss and satisfaction with one's accomplishments, are more important.
In fact, the nationwide trend against unions (which basically functioned to get their workers more money) baffles me.
I'll be blunt.
Money is fucking important.
I have kids in college and grad school.
A retirement to fund.
And then there's everything else I like to do.
They all take money.
If you think you can gull me with the promise of awards and satisfying assignments to accept starvation wages, you're wrong.
You've already stripped away office amenities.
You've nixed all perks.
You seat us like veal in a pen or sardines in a can (though you're too cheap to pack us in oil).
And now you expect us to think money doesn't matter.
It does to me.