Description
When I compute the angular separation between the Moon and a target, I get two different values depending on which coordinate I call `separation` on (the Moon or the target).
Expected behavior
The two separations should be exactly the same.
Actual behavior
The separation between the Moon and the target differs depending on the order of the call.
Steps to Reproduce
```python
import astropy.units as u
from astropy import coordinates as c
from astropy.time import Time

# get a position on Earth
obs = c.EarthLocation.of_site('Keck Observatory')

# and a time
start_time = '2018-01-23T01:01:05.813'
starttime = Time(start_time, format='isot', scale='utc')

# get the Moon's position
moon = c.get_moon(starttime, location=obs)

# and get a target
RA = 97.00
DEC = 4.9
c1 = c.SkyCoord(RA, DEC, frame='fk5', unit='deg')

# and compute the separation both ways
moon_sep = moon.separation(c1)
moon_sep2 = c1.separation(moon)
```
Printing `moon_sep` and `moon_sep2` gives 86.05756420797839 deg and 30.615793847554087 deg, respectively.
Incidentally, if I create a `SkyCoord` object manually from the Moon's coordinates:

```python
c2 = c.SkyCoord(moon.ra.value, moon.dec.value, frame='fk5', unit='deg')
sep = c1.separation(c2)
sep2 = c2.separation(c1)
```

I get sep = sep2 = 86.05242730221599 deg, which differs slightly from the first value above, but this time the order of the call does not matter.
System Details
Linux-5.9.16-1-MANJARO-x86_64-with-glibc2.33
Python 3.9.2 (default, Mar 3 2021, 20:02:32)
[GCC 7.3.0]
Numpy 1.21.2
astropy 4.3.1
Scipy 1.7.1
Matplotlib 3.4.3