Hi all, I have a little problem that is mostly solved except for one small piece.
I have a line defined in 3D, and I want to determine the azimuth of the line relative to its start point, measured in the plane through the point (0,0,0) with normal (0,0,1), i.e. the xy plane. The azimuth is measured from the vector (0,1,0) in the xy plane.
Here is how I get the angle:
N is the normal of the plane: (0,0,1).
P is the line I am working with, with start point Ps = (Psx, Psy, Psz) and end point Pe = (Pex, Pey, Pez).
V is a vector in the direction of P: Vx = Pex - Psx; Vy = Pey - Psy; Vz = Pez - Psz.
Now, project V onto the xy plane:
U = V - (V.N)*N, where V.N is the dot product.
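In case it helps, here is a minimal Python/numpy sketch of the steps so far (the start and end points are made-up values, just for illustration):

```python
import numpy as np

N = np.array([0.0, 0.0, 1.0])   # normal of the xy plane
Ps = np.array([2.0, 5.0, 1.0])  # made-up start point
Pe = np.array([4.0, 9.0, 3.0])  # made-up end point

V = Pe - Ps               # vector in the direction of the line
U = V - np.dot(V, N) * N  # project V onto the xy plane
print(U)                  # [2. 4. 0.]
```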
Now that we have a vector in the xy plane, we can find the angle between this vector and the vector M = (0,1,0), which defines north.
To determine the angle:
theta = arccos(U.M/(|U|*|M|))
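Continuing the sketch (U is carried over from the snippet above):

```python
import numpy as np

U = np.array([2.0, 4.0, 0.0])  # projected vector from the previous step
M = np.array([0.0, 1.0, 0.0])  # north

cos_theta = np.dot(U, M) / (np.linalg.norm(U) * np.linalg.norm(M))
theta = np.degrees(np.arccos(cos_theta))
print(theta)  # about 26.57 degrees
```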
The problem I am having is that arccos only returns the unsigned angle between the two vectors, in the range 0 to 180 degrees. In other words, if I have a line whose azimuth should be 270 degrees, the calculation reports back 90 degrees.
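To make the failure concrete, a made-up line pointing due west (azimuth should be 270 degrees) comes back as 90:

```python
import numpy as np

N = np.array([0.0, 0.0, 1.0])
M = np.array([0.0, 1.0, 0.0])
V = np.array([-1.0, 0.0, 0.0])  # due west; azimuth should be 270 degrees

U = V - np.dot(V, N) * N  # projection (V is already in the xy plane)
theta = np.degrees(np.arccos(np.dot(U, M) / (np.linalg.norm(U) * np.linalg.norm(M))))
print(theta)  # 90.0, not the 270.0 I want
```

How do I recover the full 0-360 azimuth from this?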