
Re: [Orekit Developers] FieldOfView Detector




Evan Ward <evan.ward@nrl.navy.mil> wrote:

On 01/19/2016 09:01 AM, MAISONOBE Luc wrote:

Evan Ward <evan.ward@nrl.navy.mil> wrote:

On 01/18/2016 09:27 AM, Luc Maisonobe wrote:
On 18/01/2016 15:16, Luc Maisonobe wrote:
Hi Evan,

On 15/01/2016 09:25, Luc Maisonobe wrote:
On 14/01/2016 22:11, Evan Ward wrote:
Hi Luc,
Hi Evan,

I noticed your recent work on the new field of view classes. I think
these are a great addition. Would it be possible to model an antenna
attached to a ground station? I think the current implementation of
FieldOfView requires a spacecraft state.
You caught me red-handed!

I am also not really comfortable with the spacecraft state in the
offsetFromBoundary method. It induces a dependency that is not
right. It is currently used to convert the input position into
a line of sight, and this computation would be better performed
at the call site, not within the method.

I'll fix this in a few minutes.
I think I will reintroduce a method dedicated to the case where the Field Of
View is spacecraft-centered. It will be something like:

 List<GeodeticPoint> getFootprint(SpacecraftState state,
                                  OneAxisEllipsoid body);

I think it will be better here than for example in
FootPrintOverlapDetector because there is no predefined geographic
zone
here and the method is expected to be called typically from a
StepHandler throughout propagation.
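
As a rough illustration (the fov and earth objects below are assumed to
have been built beforehand, and FastMath is the commons-math class), the
call at each step would be something like:

     // "state" is the SpacecraftState received at the current step,
     // e.g. in the handleStep method of a step handler
     List<GeodeticPoint> footprint = fov.getFootprint(state, earth);
     for (GeodeticPoint point : footprint) {
         System.out.format("lat = %8.3f deg, lon = %8.3f deg%n",
                           FastMath.toDegrees(point.getLatitude()),
                           FastMath.toDegrees(point.getLongitude()));
     }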

Do you agree with this new method?
Also, if we add this method, do you think FieldOfView should remain in
the events package? Perhaps it should be moved to utils (but utils
is already quite crowded and unstructured).

best regards,
Luc

I presume the getFootprint method would project a region of the
satellite's celestial sphere to the Earth and then return a set of
points along that boundary.

Yes.

I think it would be a very useful method to
have and I can see the logic in keeping it where it is. As for the
parameters, if you're using the ellipsoidal Earth assumption to make the
computation quicker then I think OneAxisEllipsoid is the right type. On
the other hand, if the algorithm uses ray tracing or something similar,
then I think using a BodyShape would add flexibility, so users
could include the effects of terrain by providing a terrain-based
BodyShape.

I use the ellipsoidal body assumption, and I do take its flatness into
account. For the parts of the fov boundary that really hit the
ground, it is in fact some ray tracing, and it calls the general
getIntersectionPoint method defined in the BodyShape interface.
However, this cannot be done if parts of the fov skim over the horizon,
as we then need to use the points of the limb to wrap around the
visible area on ground. This is where the ellipsoidal body assumption
is used. There is also a special degenerate case: if the fov is large
enough to contain the whole Earth, none of the boundary points
meet the ground, and we return the full limb.
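
Very roughly, the idea is something like the following sketch (this is not
the actual implementation; carrier is the carrier position in body frame,
boundaryDirections the sampled fov boundary directions in body frame, and
limbPoint a mere placeholder for the ellipsoid-based limb computation):

     List<GeodeticPoint> loop = new ArrayList<GeodeticPoint>();
     for (Vector3D direction : boundaryDirections) {
         Line ray = new Line(carrier, carrier.add(direction), 1.0e-10);
         GeodeticPoint gp = body.getIntersectionPoint(ray, carrier,
                                                      body.getBodyFrame(), date);
         if (gp != null) {
             // this part of the boundary really hits the ground
             // (the real code also checks the intersection is on the proper
             //  side of the carrier, since the Line is unbounded)
             loop.add(gp);
         } else {
             // boundary skims over the horizon: use the limb point that
             // sees the carrier at zero elevation (ellipsoid-specific part)
             loop.add(limbPoint(carrier, direction));
         }
     }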

As per your previous message about the ground antenna pattern,
I will change the signature to:


    List<List<GeodeticPoint>> getFootprint(Transform fovToBody,
                                           OneAxisEllipsoid body,
                                           double angularStep)

With this javadoc:

    /** Get the footprint of the Field Of View on ground.
     * <p>
     * This method assumes the Field Of View is centered on some carrier,
     * which will typically be a spacecraft or a ground station antenna.
     * The points in the footprint boundary loops are all at altitude zero
     * with respect to the ellipsoid; they correspond either to the
     * projection on ground of the edges of the Field Of View, or to
     * points on the body limb if the Field Of View goes past the horizon.
     * The points on the limb see the carrier origin at zero elevation.
     * If the Field Of View is so large that it contains the body entirely,
     * all points will correspond to points on the limb. If the Field Of
     * View looks away from the body, the boundary loops will be an empty
     * list. The points within the footprint loops are sorted in
     * trigonometric order as seen from the carrier. This implies that
     * someone traveling on ground from one point to the next one will have
     * the points visible from the carrier on his left hand side, and the
     * points not visible from the carrier on his right hand side.
     * </p>
     * <p>
     * If the carrier is a spacecraft, then the {@code fovToBody} transform
     * can be computed from a {@link org.orekit.propagation.SpacecraftState}
     * as follows:
     * </p>
     * <pre>
     * Transform inertToBody = state.getFrame().getTransformTo(body.getBodyFrame(), state.getDate());
     * Transform fovToBody   = new Transform(state.getDate(),
     *                                       state.toTransform().getInverse(),
     *                                       inertToBody);
     * </pre>
     * <p>
     * If the carrier is a ground station, located using a topocentric frame
     * and managing its pointing direction using a transform between the
     * dish frame and the topocentric frame, then the {@code fovToBody}
     * transform can be computed as follows:
     * </p>
     * <pre>
     * Transform topoToBody = topocentricFrame.getTransformTo(body.getBodyFrame(), date);
     * Transform topoToDish = ...
     * Transform fovToBody  = new Transform(date,
     *                                      topoToDish.getInverse(),
     *                                      topoToBody);
     * </pre>
     * <p>
     * Only the raw zone is used, the angular margin is ignored here.
     * </p>
     * @param fovToBody transform between the frame in which the Field Of View
     * is defined and the body frame
     * @param body body surface the Field Of View will be projected on
     * @param angularStep step used for boundary loops sampling (radians)
     * @return list of footprint boundary loops (there may be several
     * independent loops if the Field Of View shape is complex)
     * @throws OrekitException if some frame conversion fails or if the
     * carrier is below the body surface
     */

So you will also be able to use this method for your ground station
(or an airborne antenna, or anything like that).
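
For a ground station, a minimal sketch could be (the fov object is assumed
to exist already, and the dish is arbitrarily kept aligned with the
topocentric frame just for the example):

     OneAxisEllipsoid earth =
         new OneAxisEllipsoid(Constants.WGS84_EARTH_EQUATORIAL_RADIUS,
                              Constants.WGS84_EARTH_FLATTENING,
                              FramesFactory.getITRF(IERSConventions.IERS_2010, true));
     GeodeticPoint stationLocation =
         new GeodeticPoint(FastMath.toRadians(48.85), FastMath.toRadians(2.35), 100.0);
     TopocentricFrame stationFrame =
         new TopocentricFrame(earth, stationLocation, "station");
     AbsoluteDate date = new AbsoluteDate(2016, 1, 19, TimeScalesFactory.getUTC());

     Transform topoToDish = Transform.IDENTITY; // dish aligned with topocentric frame
     Transform topoToBody = stationFrame.getTransformTo(earth.getBodyFrame(), date);
     Transform fovToBody  = new Transform(date,
                                          topoToDish.getInverse(),
                                          topoToBody);

     List<List<GeodeticPoint>> footprint =
         fov.getFootprint(fovToBody, earth, FastMath.toRadians(1.0));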

Looks good to me. I think supporting an airborne sensor is a good
additional use case, as you point out.

The method is available. It still needs some polishing, as one
test case does not work, but it seems to give very good results
when the sensor is properly configured.

You can update the javadoc if you want.

best regards,
Luc


Regards,
Evan


best regards,
Luc


Best Regards,
Evan


best regards,
Luc

I did not foresee using this for ground stations! Once the method
signature is fixed, it may be used this way too. Would you mind
updating the javadoc to explain this other use? I also don't know
whether the name FieldOfView is well suited to this case, so maybe it
should be renamed at the same time.

best regards,
Luc

Best Regards,
Evan