
9 answers

Never. No law requires an employer to give employees a raise. However, good employers normally give one each year.

2007-01-17 07:58:53 · answer #1 · answered by ninecoronas2000 5 · 0 2

In the USA, you are not entitled to a pay raise unless you are making minimum wage and the minimum wage goes up.

2007-01-17 07:59:06 · answer #2 · answered by Feeling Mutual 7 · 1 0

Employers don't have to give raises, paid sick leave, paid vacations, or time off to take your kids to the doctor. They just have to pay at least the current minimum wage. If you can pick and choose where you work, you'd better check these things out before you hire on.

2007-01-17 08:00:54 · answer #3 · answered by abfmilan 1 · 0 2

No one is automatically entitled to a pay raise; you have to earn it by becoming more valuable to your employer.

2007-01-17 07:58:55 · answer #4 · answered by Anonymous · 0 2

The BBB says every 3 months.

2015-04-19 15:03:55 · answer #5 · answered by Terrence 1 · 0 0

You have to earn the raise. There is nothing that says an employer has to give you one.

2007-01-17 08:02:00 · answer #6 · answered by watch_out 3 · 0 2

Your employer usually has a pay rate schedule. Ask them.

2007-01-17 07:59:55 · answer #7 · answered by Eva 5 · 0 2

Once a year.

2007-01-17 08:01:00 · answer #8 · answered by Anonymous · 0 2

Mine is every 90 days.

2007-01-17 08:02:18 · answer #9 · answered by Anonymous · 0 1
