The forces acting on that ball once it leaves the hand are the weight of the ball (the force of gravity) and drag (the friction force due to air). [NB: There may be some lift forces as well, depending on the kind of pitch thrown. For example, curve balls and sliders generate some lift from the spin of the ball (the Magnus effect). But that really complicates the issue, so I'm going to ignore those forces.]
Weight acts downward, increasing the ball's vertical velocity. Drag acts along the ball's velocity vector, slowing it in both the vertical and horizontal directions. The issue boils down to whether gravity speeds the ball up enough to offset the drag force that is slowing it down.
Given that a fastball travels with only a slight drop in height from the point of release, we can guess that the vertical velocity increase is only slight, from v = u + gt; where v is the final vertical velocity, u = 0 is the initial vertical velocity off the hand, t is the time from the mound to the catcher about S = 60.5 feet away (the rubber-to-plate distance; 90 feet is the base-to-base spacing), and g is the acceleration due to gravity.
For a V = 90 mph = 132 ft/sec pitch, t is a matter of t = S/V = 60.5/132 ≈ 0.46 seconds; so the vertical velocity just does not have much time to build up. The drop velocity will be about v = gt = 32(0.46) ≈ 15 ft/sec by the time the ball reaches the catcher, compared to the 132 ft/sec release velocity. Not a very significant increase.
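To make that arithmetic concrete, here is a quick sketch in Python. It assumes, as above, a constant 132 ft/s over the full 60.5 ft; drag makes that only approximate, so treat the output as a rough estimate.

```python
# Rough flight-time and drop-velocity estimate (feet and seconds).
# Assumes constant horizontal speed, which drag makes only approximate.
S = 60.5   # rubber-to-plate distance, ft
V = 132.0  # release speed of a 90 mph pitch, ft/s
g = 32.0   # acceleration due to gravity, ft/s^2

t = S / V       # time of flight, ~0.46 s
v_drop = g * t  # vertical speed gained from gravity, ~15 ft/s
print(f"flight time ~ {t:.2f} s, vertical speed gained ~ {v_drop:.1f} ft/s")
```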
In fact, g - a, not g alone, will be the rate of increase in the vertical velocity, because the drag force has a vertical component whose deceleration (a) pulls back against gravity. This means the vertical increase in velocity will be even less than the 15 ft/sec shown above. Let's look at the drag force next.
So what about the drag force D = (1/2) rho Cd A v^2 = ma; where rho is the air density, Cd the drag coefficient, A the ball's cross-sectional area, m the baseball's mass, and a its deceleration along the direction of travel? A 90 mph release gives the ball an initial velocity v = 132 ft/sec for the 60.5-foot trip from the mound to the plate. As the drag force is proportional to the square of the velocity, there is significant drag force at the moment of release. This means a = D/m will produce significant deceleration from the moment of release.
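To put a rough number on that release deceleration, here is a sketch in SI units. The drag coefficient Cd ≈ 0.35 and the ball dimensions are assumed ballpark figures, not measured values.

```python
import math

rho = 1.225         # air density at sea level, kg/m^3
Cd = 0.35           # drag coefficient (assumed ballpark value)
r = 0.037           # baseball radius, m (~2.9 in diameter)
A = math.pi * r**2  # cross-sectional area, m^2
m = 0.145           # baseball mass, kg
v = 40.2            # 90 mph in m/s

D = 0.5 * rho * Cd * A * v**2  # drag force at release, N
a = D / m                      # deceleration along the flight path, m/s^2
print(f"drag ~ {D:.2f} N, deceleration ~ {a:.1f} m/s^2 (~{a / 9.81:.1f} g)")
```

With these assumed numbers, the drag deceleration at release comes out close to a full g, which is why drag dominates the speed profile.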
And that drag force will continue, in diminishing amount, as the ball travels to the catcher at slower and slower velocity. Given that the vertical velocity gained from gravity is small because of the short travel time, we can conclude that the fastest velocity the ball will ever have is at release from the pitcher's hand. From then on, the ball is slowing down due to the continuous drag on it. There is no bell curve.
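As a check on that conclusion, here is a minimal 2-D simulation sketch using the same assumed drag numbers (Cd ≈ 0.35, horizontal release, no spin or lift). Simple Euler integration is plenty for this purpose.

```python
import math

rho, Cd, m = 1.225, 0.35, 0.145  # air density, drag coeff (assumed), mass (kg)
A = math.pi * 0.037**2           # cross-sectional area, m^2
k = 0.5 * rho * Cd * A / m       # drag deceleration = k * speed^2
g = 9.81
plate = 18.44                    # 60.5 ft in metres

x, y = 0.0, 1.8                  # release point ~6 ft above the ground
vx, vy = 40.2, 0.0               # 90 mph, released horizontally
dt = 1e-3
speeds = []

while x < plate:
    v = math.hypot(vx, vy)
    speeds.append(v)
    vx += -k * v * vx * dt        # drag opposes the velocity vector
    vy += (-g - k * v * vy) * dt  # gravity plus the vertical drag component
    x += vx * dt
    y += vy * dt

print(f"speed at plate ~ {speeds[-1]:.1f} m/s vs 40.2 m/s at release")
print("monotonically decreasing:",
      all(a >= b for a, b in zip(speeds, speeds[1:])))
```

Under these assumptions the speed falls the whole way to the plate; the maximum really is at release, with no bell curve.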
answer #1 · answered by oldprof 7 · 2007-09-19 08:25:00
If air resistance is neglected - a questionable assumption for something traveling as fast and with as little mass as a thrown baseball - the ball will be traveling fastest at the lowest point of its trajectory, just before being interrupted by something else, usually a baseball bat or the catcher's glove.
Under this assumption, the horizontal component of the ball's velocity is constant from the time the ball leaves the pitcher's hand until it contacts an object (or a person, in the case of a wild pitch) near home plate. The vertical component changes at a constant rate: slightly upward initially, zero at the apex of the trajectory, then steadily greater in the downward direction as the ball falls toward the ground. The vertical and horizontal components add in quadrature, meaning they combine like the legs of a right triangle in the Pythagorean theorem to produce the hypotenuse. Because the vertical component is much smaller than the horizontal one, adding in quadrature gives it an even smaller effect on the total speed than it would have if the components added directly. The upshot is that, neglecting air resistance, the speed of the ball varies by no more than a couple of percent over the course of its trajectory.
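Here is a small sketch of that quadrature addition, assuming a 90 mph pitch released horizontally over the 60.5 ft to the plate:

```python
import math

vh = 132.0     # horizontal speed of a 90 mph pitch, ft/s (constant with no drag)
t = 60.5 / vh  # flight time to the plate, s
vv = 32.0 * t  # vertical speed gained from gravity, ft/s

speed = math.hypot(vh, vv)  # components add in quadrature
print(f"speed at plate ~ {speed:.1f} ft/s, "
      f"up {(speed / vh - 1) * 100:.1f}% from release")
```

With these numbers the increase is around half a percent, comfortably inside the "no more than a couple of percent" bound above.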
In practice, the trajectory is usually so flat that the extra speed the ball picks up from the downward component of the velocity is more than offset by the deceleration due to air resistance. Once air resistance is taken into account, the ball is traveling fastest just after it leaves the pitcher's hand.
The statistical "bell curve" has nothing to do with the speed change of a thrown baseball as it progresses to the catcher, though it may (or may not) have something to do with the distribution of initial speeds with which the pitcher releases the ball or where the ball crosses home plate.
answer #2 · answered by devilsadvocate1728 6 · 2007-09-19 08:55:33
Too many things to consider.
The ball picks up speed as it 'falls': an 80 mph ball takes about half a second to travel from the pitcher to the plate, which adds a downward component of speed equal to gt = 32 × 0.5 = 16 ft/sec.
The 80 mph pitch has an initial speed of 80 × 5280/3600 = 117.3 ft/sec, so the 16 ft/sec change is not insignificant.
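A quick check of those figures (a sketch assuming the standard 60.5 ft rubber-to-plate distance and ignoring drag):

```python
import math

vh = 80 * 5280 / 3600  # 80 mph in ft/s, ~117.3
t = 60.5 / vh          # flight time, ~0.52 s
vv = 32.0 * t          # downward speed gained, ~16.5 ft/s

print(f"vh = {vh:.1f} ft/s, t = {t:.2f} s, vv = {vv:.1f} ft/s, "
      f"total speed (no drag) = {math.hypot(vh, vv):.1f} ft/s")
```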
But the ball slows down from drag as it travels, and a curve ball or drop pitch adds velocity components perpendicular to the initial direction. So the fastest horizontal speed will be right after the pitcher lets go, but the fastest absolute speed might point in a direction off the initial axis, much closer to the batter.
answer #3 · answered by Piglet O 6 · 2007-09-19 08:13:05
No bell curve. Just one that starts high, decreases, and takes a nose dive when it hits the catcher's glove.
You're right when you say that it's going the fastest when it leaves the pitcher's hand.
answer #4 · answered by Anonymous · 2007-09-19 08:00:25