An arrow is shot with an initial velocity of 100 m/s at an angle of 50 degrees above the ground. The target is placed on the side of a mountain, 10 m above the ground. Assume the arrow reaches its maximum height before hitting the target. A. How far from the target should the arrow be shot? B. How long will it take the arrow to reach the target?
Answers & Comments
Verified answer
Answer: see attached solution.
Answer+Explanation:
A. The range of a projectile that returns to its launch height is R = (V² sin 2θ)/g, where V is the initial speed, θ is the launch angle, and g is the acceleration due to gravity. Plugging in the given values gives R = (100² sin 100°)/9.8 ≈ 1005 m. Because the target sits 10 m above the launch level, the arrow actually meets it slightly earlier on its way down, at a horizontal distance of about 996 m. So the arrow should be shot from roughly 996 m (about 1 km) away from the target.
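If you want to sanity-check part A yourself, here is a minimal Python sketch (the variable names v0, theta, and target_height are my own, not from the problem) that evaluates both the level-ground range formula and the corrected distance to a target 10 m above the launch level:

```python
import math

g = 9.8                     # acceleration due to gravity, m/s^2
v0 = 100.0                  # initial speed, m/s
theta = math.radians(50)    # launch angle
target_height = 10.0        # target height above launch level, m

# Level-ground range: R = v0^2 * sin(2*theta) / g
R_level = v0**2 * math.sin(2 * theta) / g

# Time to come back down to y = 10 m: take the later root of
# 0 = v0*sin(theta)*t - (g/2)*t^2 - target_height
vy = v0 * math.sin(theta)
t_hit = (vy + math.sqrt(vy**2 - 2 * g * target_height)) / g
R_target = v0 * math.cos(theta) * t_hit

print(f"Level-ground range: {R_level:.1f} m")              # ~1005 m
print(f"Distance to 10 m-high target: {R_target:.1f} m")   # ~996 m
```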
B. The total time of flight back to the launch height is t = (2V sin θ)/g, where V is the initial speed, θ is the launch angle, and g is the acceleration due to gravity. Plugging in the given values gives t = (2 × 100 × sin 50°)/9.8 ≈ 15.6 s. Accounting for the 10 m target height, the arrow reaches the target slightly sooner, after about 15.5 s.
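And the same kind of quick check for part B (same assumed setup as above), comparing the level-ground flight time with the time to reach a point 10 m above the launch level on the way down:

```python
import math

g, v0, theta, h = 9.8, 100.0, math.radians(50), 10.0

t_level = 2 * v0 * math.sin(theta) / g              # full flight time back to launch height
vy = v0 * math.sin(theta)
t_target = (vy + math.sqrt(vy**2 - 2 * g * h)) / g  # later root: arrow hits on the way down

print(f"Level-ground flight time: {t_level:.1f} s")   # ~15.6 s
print(f"Time to reach the target: {t_target:.1f} s")  # ~15.5 s
```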
Hope it helps ^^.