I've always thought college was a sham. It's been almost two full years since I graduated, and where is the benefit? Where is the glorious life I have made for myself? Where is the job security I was supposed to gain with a college degree? Where is the better paycheck I was promised?
I have seen too many people, since I started college seven years ago, wrapped up in those lies. Not so much lies as misunderstandings. People stay in a major for some sort of future financial benefit or security. People assume they can't get a job unless they take this 'one class' that is supposed to be the clincher on the resume. People believe one major has a sociological benefit over another, or some kind of hiring advantage. And if that weren't enough, we have the great cosmic educational joke of graduate school.
Here is a great plan: dumb down education to the point where you are barely educated enough to get a part-time job after you graduate college. When it comes time to find a real job, you are pointed back towards graduate school. Your university rejoices as it sees more money coming its way. You go through graduate school, become overeducated, and again can't get a job in the real world. They point you back towards the college and jump up and down with glee, because now they have a subsidized employee who threw years of cash at the university.
Sadly, part of me feels a necessity to go back to college, if only for the missed socialization, which is basically the only thing college is good for. I never understood the concept of majoring in something, and I don't know what graduate program to pursue. For those who will jump in and tell me to 'pursue what I love,' I'll stop you right there and say I really don't 'love' anything. I never liked what I got my degree in, in the first place. And I think throwing another $40,000 at a school just to make more friends and meet more women in some useless graduate program or post-bacc education is a major step towards misguidedness and a waste of money.
I frankly would rather not waste that much money, but I am being forced to go.