Hello,
My sister started working in the UK two weeks ago, and her employer has not asked her to sign an employment contract.
She finds it a bit strange. Her boss says he will send her a letter later (the contract?), but that it's not urgent and not mandatory.
In France (where I come from), it is mandatory for the employee and the employer to sign an employment contract.
Is that not the case in the UK? Does this situation seem normal? Any advice?
Thanks