I have a degree in Sociology, and I am trying to go back to school for Nursing. However, everybody around me is somewhat discouraging about the whole 'Obamacare' situation, since it is supposedly going to lower nurses' incomes. I also keep hearing about new grads having trouble finding jobs. I am certainly not going into the field for the money, but I would not want to end up in a field that pays a meager income while I carry the burden of student loans.
Would love to know your views on it! Thank you very much in advance.