They make me feel like crap, but mostly they make me really angry. When they say these things, I just want to show them how wrong they are. You know what? Their opinion doesn't matter, and you shouldn't be around these people because they will drag you down. God made you the way you are for a reason, and it is not so everyone can dump on you. Just hold your head high and tell them all to f*** off.
2006-11-23 12:24:05 · answer #1 · answered by crzygal 3 · 0⤊ 0⤋
Well, I don't think it really makes anyone feel especially great about themselves! It diminishes self-esteem and pride. My advice is to stay away from people who hurt you intentionally and tell them to just f*** off! Or, you know, something along those lines, haha.
2006-11-23 23:37:27 · answer #2 · answered by Mickey 1 · 0⤊ 0⤋
It used to bother me, and then I started to believe it. But one day I finally realized that the people who told me this were people who had been told that themselves; they're called toxic people, and I started to avoid them. When I did, I began to see that there were people who really loved me, and I began to believe that, truly, in my heart. Don't let anyone hurt you like that. They are only telling you how they are when they say stuff like that. Keep your chin up.
2006-11-23 23:32:01 · answer #3 · answered by Not In Kansas? 3 · 0⤊ 0⤋
It makes me feel good, because you know that you're beautiful.
2006-11-23 21:53:18 · answer #4 · answered by ivan_ayarza10 1 · 0⤊ 0⤋
It used to bother me, but as I got older I stopped giving a flip...
2006-11-23 20:22:00 · answer #5 · answered by va8326 5 · 0⤊ 0⤋
I would smack him in the face and say "kiss my ***!" This happened to me before, and I did say this! OMG, I was so freaking mad at him that I even kicked his *** afterwards! It was fun too, hehe.
2006-11-23 22:36:25 · answer #6 · answered by jessica o 1 · 0⤊ 0⤋