With most careers, you can't get a job without experience. But where do you get that experience?
For a long time, heading to college right after high school was the obvious path. But in today's economic climate, is that still true? There's something to be said for jumping straight into the workforce, and some companies will pay for, or provide stipends toward, higher education for employees who have stayed a certain length of time.
Considering the ever-increasing cost of higher education, and the student loans many college students amass while earning their degrees, do you feel the economy has actually turned the tide and made job experience just as valuable as a degree, or even more so?