
So everyone always says college changes you. Yes or no? If so, how?

2007-07-18 05:10:27 · 8 answers · asked by §å§k¡ . 1 in Education & Reference Higher Education (University +)

8 answers

College does not change you. You change yourself. You are in a new phase of your life when you leave home, whether or not you go to college.

Your hormones are shifting, and your mind is exploring and expanding from a whole new set of inputs that you don't get living at home, surrounded by the same routine and surroundings day in and day out.

You will find that your parents no longer shape the way you think; instead, what they have given you becomes the foundation for finding your true self. There are a million forks in the road ahead, and which way you go is your choice. Reflect before taking a road that you think will lead you in a direction you may not want to go. Trust your gut feeling if something does not feel right, and go the other way. The intuition of a woman is strong; trust that. (I'm assuming you are female.) It's okay to say no to failure and yes to success. Your creed can be that failure is not an option.

Wonderful things await. I hope you find the most beautiful experiences in your life and appreciate them to their fullest.

2007-07-18 05:21:53 · answer #1 · answered by mim 6 · 2 0

If it doesn't, you're doing something wrong.

You go to college to learn not only about academic subjects, but about life. As you learn, you grow; as you grow, you inevitably change a little bit. You may try things you thought you would never try, discover things you never knew existed, become friends with people you never dreamed you'd even speak to. You learn to see the big picture, examine issues critically, and relate to people in a new way. Just as the person you are at 18 is different from the person you were at 12, so too will the person you will be at 22 be different from the person you are now.

It's up to you, of course, whether that change is for the better, or the worse.

2007-07-18 05:19:36 · answer #2 · answered by teresathegreat 7 · 0 0

Most definitely, but in the best way possible. College makes you a more well-rounded and knowledgeable person. You meet new people, learn new things, and have experiences that last a lifetime (sometimes bad, but mostly good). College has been a great experience for me so far, and I'm looking at some leadership classes to help improve my leadership skills, which I'm going to need if I want to do advertising. In the end it's all worth it :)

2007-07-18 07:17:38 · answer #3 · answered by Anonymous · 0 0

Each level of the educational ladder is geared to change you, not just college. As you mature and learn the issues of life and how you fit into society and the community in which you live, you will be changed. You will learn how your opinions and feelings fit in the world. You will learn what is important to you and what you can toss out the window. You are stretching, growing, bending, yielding, finding out more about yourself and others. Your ideas and beliefs are being fashioned, and your attitudes about issues are being formulated. Life is taking hold of you, and whether you are an extrovert or an introvert, you are learning who you are. I loved college. Not only was it preparation for my life's work, but it taught me how I would respond in the world I live in. I learned to reach out, travel, explore, and take chances and risks. I learned how to get along with the people I would eventually be working with as an adult: teachers, colleagues, workers in the world.

2007-07-18 05:19:27 · answer #4 · answered by THE SINGER 7 · 0 0

I don't think it changes you. I think you are still maturing and finding out who you are when you go to college. Up until then, you live in a very controlled environment. College is the first opportunity you really have to be yourself and discover who you really are. This isn't a process that ends in college; it's one that continues your entire life.

2007-07-18 05:18:43 · answer #5 · answered by Scott A 1 · 0 0

You're still young when you enter college (usually), so you still have a lot of growing up to do. You change naturally at that age because of that.

Also, if you live away at college, you get the opportunity to live on your own and to make your decisions entirely by yourself. You'll change with that responsibility and with that experience.

For the better, though.

2007-07-18 05:13:14 · answer #6 · answered by misscarinne 4 · 0 0

You learn so much about yourself. Generally, you move away when you attend college, and you are on your own. Your parents are not around to make decisions for you, etc. You mature so much during your college years.

2007-07-18 05:15:07 · answer #7 · answered by Serenity 3 · 0 0

Absolutely... or it should. You shouldn't pay for four years of stuffing your head with knowledge and come back home the same as when you left... you've wasted your money then!
Mostly it helps you expand your horizons... meet different people, make your own decisions, define your identity...

2007-07-18 05:37:12 · answer #8 · answered by coquinegra 5 · 0 0
