I have a question for everyone in every country, and I'd like to get the most honest answers...
Is having a respectable job more important than the wages you are paid?
For example... I went to college for 4 years, studied and trained as an animal health tech, and received my certification. In my area those positions were filled, so I ventured elsewhere because I needed to pay my bills. After all this time of on-the-job training, I've moved up to management at a Walmart Neighborhood Market, and I'm making a higher salary than I would as an animal health tech. A tech position eventually opened up and I declined it, staying at my current job. But when I say I'm a manager at a Walmart, I don't get any respect lol, the way I would if I said I'm an animal health tech. Another example: a trash man telling an insurance agent he works for a trash company. The trash man makes more money, has a steady and secure income, and is job-protected through his union. The insurance agent has the respectable job, but the trash man's pay is double his...
What's more important to you?