Let's take the coordinates of a rectangle centered at the origin with its sides parallel to the axes. Those would be:
(-x, -y), (-x, y), (x, -y), (x, y)
One diagonal joins the opposite corners (x, y) and (-x, -y); the other joins (-x, y) and (x, -y). Use the distance formula with each pair.
d = sqrt( (x2 - x1)^2 + (y2 - y1)^2 )
Using it with (-x, -y) and (x, y), we get
d = sqrt( (x - (-x))^2 + (y - (-y))^2 )
d = sqrt( (2x)^2 + (2y)^2 )
d = sqrt( 4x^2 + 4y^2 )
d = sqrt(4(x^2 + y^2) )
d = 2sqrt(x^2 + y^2)
Using the distance formula with (-x, y) and (x, -y), we get
d = sqrt( (x - (-x))^2 + (-y - y)^2 )
d = sqrt( (2x)^2 + (-2y)^2 )
d = sqrt( 4x^2 + 4y^2 )
d = 2sqrt(x^2 + y^2)
As you can see, the distance is the same in both cases, which means the diagonals of a rectangle are congruent.
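Here is a minimal symbolic check of the derivation above, assuming the sympy library is available (the variable names are mine, not from the original answer):

```python
# Symbolically verify that both diagonals of the rectangle with vertices
# (-x, -y), (-x, y), (x, -y), (x, y) have the same length 2*sqrt(x^2 + y^2).
from sympy import symbols, sqrt, simplify

x, y = symbols('x y', positive=True)

# Diagonal from (-x, -y) to (x, y)
d1 = sqrt((x - (-x))**2 + (y - (-y))**2)
# Diagonal from (-x, y) to (x, -y)
d2 = sqrt((x - (-x))**2 + (-y - y)**2)

print(simplify(d1))       # expect 2*sqrt(x**2 + y**2)
print(simplify(d2))       # expect 2*sqrt(x**2 + y**2)
print(simplify(d1 - d2))  # expect 0, i.e. the diagonals are congruent
```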
2007-04-03 18:00:59 · answer #1 · answered by Puggy 7
This can be done in purely Euclidean ways too.
Let the rectangle be ABCD, with AC and BD the diagonals.
Look at the triangles ABC and BAD. They have two pairs of sides known to be equal (AB is common, and BC = AD since opposite sides of a rectangle are equal), and the included angles ABC and BAD are also equal because both are right angles. So the triangles are congruent by SAS, and hence the diagonals AC and BD are equal.
So that's one congruent-triangles proof.
I'd go with that as one proof, and for the other one use the Pythagorean Theorem to calculate the diagonal lengths explicitly.
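A small numeric illustration of the SAS setup, with the rectangle placed at coordinates chosen here purely for illustration (they are not part of the original argument, which is coordinate-free):

```python
# Rectangle ABCD with A=(0,0), B=(l,0), C=(l,w), D=(0,w).
import math

l, w = 5.0, 2.0  # arbitrary length and width
A, B, C, D = (0.0, 0.0), (l, 0.0), (l, w), (0.0, w)

def dist(P, Q):
    """Euclidean distance between points P and Q."""
    return math.hypot(P[0] - Q[0], P[1] - Q[1])

# SAS data: AB is shared by both triangles, BC = AD, and the included
# angles at B and A are right angles (adjacent sides are perpendicular).
print(dist(B, C) == dist(A, D))  # True
print(dist(A, C), dist(B, D))    # the diagonals AC and BD, equal lengths
```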
2007-04-04 16:08:45 · answer #2 · answered by Curt Monash 7
The easiest way is this:
Let a and b be the diagonals, l the length, and c the breadth.
a^2 = l^2 + c^2 by the Pythagorean theorem
b^2 = l^2 + c^2 by the Pythagorean theorem
so a^2 = b^2,
and since lengths are positive, a = b.
Another way is with trigonometry, using the angle each diagonal makes with the sides of the rectangle.
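A quick numeric check of the a^2 = l^2 + c^2 argument above, using an arbitrarily chosen length and breadth (my values, for illustration only):

```python
import math

l, c = 6.0, 2.5                # length and breadth, arbitrary values
a = math.sqrt(l**2 + c**2)     # one diagonal, by the Pythagorean theorem
b = math.sqrt(l**2 + c**2)     # the other diagonal has the same legs
print(a, b, a == b)            # both diagonals come out equal
```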
2007-04-04 02:32:04 · answer #3 · answered by Anonymous
Use Pythagoras' theorem: the square of the hypotenuse equals the sum of the squares of the other two sides.
2007-04-04 05:52:12 · answer #4 · answered by the sweet escape <3 1
The Pythagorean theorem is one way. Another would be a geometric argument using congruent triangles and angle relationships.
2007-04-04 00:54:38 · answer #5 · answered by Jack 3