In answer to your question: it is your employer's responsibility to teach you the things that have to do with your work. There are work ethics you must follow, and there are many job rules at any job site or business that all employees must follow. You learn the basics at home; the workplace is totally different.
2007-01-22 04:17:33 · answer #1 · answered by lydiamygirl 1 · 0⤊ 0⤋
It is more than OK. An employer is paying for a service to be performed. If a child isn't responsible enough to perform it, then the employer has every right to step in and decide whether he wants to teach it or let the child go. A lack of responsibility goes a long way in holding someone back.
2007-01-22 12:18:45 · answer #2 · answered by Connie D 2 · 0⤊ 0⤋
It is very difficult to teach responsibility at work if some responsibility hasn't been learned at home. There isn't anything wrong with trying, though... you just have to be careful not to step on the parents' toes.
2007-01-22 16:34:15 · answer #3 · answered by dai_nite 3 · 0⤊ 0⤋
Yes, it is absolutely necessary for an employer to teach responsibility. Most parents nowadays don't do it, so somebody else has to!!
(and I am in my 20s)
2007-01-22 12:14:38 · answer #4 · answered by betatesterwood 3 · 0⤊ 0⤋
It's bad, but it's true: parents nowadays don't teach responsibility to their kids, and because of that, we as employers have to teach them some responsibility at work.
2007-01-22 14:34:59 · answer #5 · answered by ~Vale~ 2 · 0⤊ 0⤋
Well, when it comes to a job, things have to be done a certain way. So yeah, we all have to be taught on a job how to do certain things, which boils down to responsibility. But it would help out a lot if parents would encourage this.
2007-01-22 12:16:39 · answer #6 · answered by koko 6 · 0⤊ 0⤋