ERA = (earned runs × 9) / (innings pitched)
conventionally rounded to two decimal places.
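The formula is simple enough to sketch in a few lines of Python (a minimal illustration; the function name and the two-decimal rounding are my own choices, matching the convention above):

```python
def era(earned_runs: float, innings_pitched: float) -> float:
    """Earned run average: earned runs scaled to a 9-inning game."""
    return round(earned_runs * 9 / innings_pitched, 2)

# 2 earned runs over 6 innings pitched
print(era(2, 6))  # 3.0
```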
2007-07-20 03:50:52
·
answer #1
·
answered by Chipmaker Authentic 7
·
2⤊
1⤋
You take the earned runs, multiply by nine, and divide by the number of innings pitched. So if a pitcher throws five innings and gives up five runs, you multiply the five runs by nine. That comes out to 45. You divide that by the five innings pitched. The ERA is 9.00.
EDIT: How can you give a thumbs down when you provide the right answer? I mean, I'm just askin'.
2007-07-20 10:53:02
·
answer #2
·
answered by Toodeemo 7
·
1⤊
1⤋
ERA = Earned Runs / ( innings pitched/9 )
2007-07-20 10:52:17
·
answer #3
·
answered by smerc72 3
·
0⤊
0⤋
(# earned runs X 9)/(# innings pitched)
for example, if a pitcher gives up 2 earned runs in 6 innings
(2 X 9)/6 = 3.00 ERA
2007-07-20 11:25:21
·
answer #4
·
answered by TheSandMan 5
·
0⤊
0⤋
It depends on what level you are talking about. Total number of earned runs, times the number of innings in a complete regulation game, divided by the number of innings pitched.
Pro -- 9 innings. HS and most amateur levels -- 7 innings. Little League -- 6 innings.
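This level-dependent version is easy to express by making the regulation game length a parameter (a sketch; the level names and the lookup table are just labels I chose for illustration, based on the innings counts above):

```python
# Regulation game lengths by level, per the answer above.
GAME_LENGTH = {"pro": 9, "high_school": 7, "little_league": 6}

def era(earned_runs: float, innings_pitched: float, level: str = "pro") -> float:
    """ERA scaled to the regulation game length for the given level."""
    return round(earned_runs * GAME_LENGTH[level] / innings_pitched, 2)

# Same line (5 earned runs in 5 innings) at different levels:
print(era(5, 5))                         # pro: 9.0
print(era(5, 5, level="little_league"))  # 6.0
```

The same raw line yields a different ERA at each level because the runs are scaled to a shorter complete game.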
2007-07-20 17:08:45
·
answer #5
·
answered by david w 6
·
0⤊
0⤋
ER divided by innings pitched, then multiplied by 9.
2007-07-20 10:57:48
·
answer #6
·
answered by red4tribe 6
·
0⤊
0⤋
You multiply earned runs by 9, and divide by innings pitched.
Say Roger Clemens pitched 74 innings, and gave up 13 earned runs.
(13x9=117) >>> 117/74 = 1.58 ERA
Make sense?
2007-07-20 10:55:14
·
answer #7
·
answered by Anonymous
·
1⤊
1⤋
(Earned Runs/Innings Pitched) x 9
2007-07-20 10:52:21
·
answer #8
·
answered by rognbiz 1
·
1⤊
1⤋