During President Obama’s speech to Congress this week, I was heartened to hear his emphasis on the importance of education and his plans to reform it and make college more accessible and affordable. In general, I think more education is always a step in the right direction, and I’m sure not a day goes by on Skepchick and similar sites without someone saying “this is a problem that can be solved with better education.”
But it also got me thinking about my own education and its value. With a Bachelor’s and a Master’s degree, I went pretty far up the higher education ladder, and I do have a good, well-paying job now. But I don’t actually believe those two things are causally related. My own experience was that college didn’t really prepare me for the real world, and much of what I’ve accomplished has come from skills and experience I picked up on the job. Plus, since I went to a private university, college was extremely expensive for my family.
Is college worth it? What are the factors that make it worthwhile?
Note: Both my degrees are non-science and non-technical – I’m a liberal arts major (English &amp; Corporate Communication) all the way. I would never dispute that, for example, med school was worth it. I’m talking here about degrees in softer subjects. I’m looking at you, journalism and philosophy majors! :)