Calculate lat/lon of 4 corners of rectangle using Python

Question:

I need to find the latitude and longitude coordinates of the four corners of a rectangle in a Python script, given the center coordinate, length, width, and bearing of the shape. Length and width are in statute miles, but honestly converting those to meters is probably one of the easiest parts. I have some examples of how to use haversine to calculate distance between 2 points, but I’m at a loss on this one. Would anyone be able to at least point me in the right direction?

Picture of rectangle

Update

This is the formula I came up with for my Python script, based on the link that @MBo provided:

lat2 = asin(sin(lat1)*cos(length2/(_AVG_EARTH_RADIUS_KM)) + cos(lat1) * sin(length2/(_AVG_EARTH_RADIUS_KM)) * cos(bearing1))

lon2 = lon1 + atan2(sin(bearing1) * sin(length2/(_AVG_EARTH_RADIUS_KM)) * cos(lat1), cos(length2/(_AVG_EARTH_RADIUS_KM)) - sin(lat1) * sin(lat2))

Unfortunately the results don’t make sense. I used a center point of 32° N, 77° W, a length of 20 miles, a width of 10 miles, and a bearing of 0 degrees, and I’m getting the result 0.586599511812, -77.0.

When I plot it out in my mapping application, it tells me that the coordinate for the new point should be 32.14513, -77.0.

Edit to add: I converted length1 and width1 to kilometers, and converted bearing1 to radians, before using them in the formulas above.

Asked By: Charlie


Answers:

A rectangle on the Earth’s sphere is a somewhat dubious thing.

Anyway, look at this page.

Using the formula from the section Destination point given distance and bearing from start point, calculate the two side midpoints at distance length/2 and bearings bearing and bearing + 180.

From each midpoint, do the same with width/2 and bearings bearing + 90 and bearing - 90 to get the corner points.

(Note that the corner-to-corner distances will be slightly inexact under this spherical approximation.)
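This two-step construction can be sketched with the spherical destination-point formula. A minimal sketch (the function names are mine, length is assumed to run along the bearing, and a mean Earth radius of 6371 km is used):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def destination(lat_deg, lon_deg, bearing_deg, dist_km):
    """Destination point given a start point, bearing, and distance
    (spherical Earth model)."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    brng = math.radians(bearing_deg)
    d = dist_km / EARTH_RADIUS_KM  # angular distance
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(brng))
    lon2 = lon1 + math.atan2(math.sin(brng) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

def rectangle_corners(center_lat, center_lon, length_km, width_km, bearing_deg):
    """Walk length/2 to each short-side midpoint, then width/2 sideways
    from each midpoint to reach the four corners."""
    corners = []
    for along in (bearing_deg, bearing_deg + 180):   # toward each short side
        mid_lat, mid_lon = destination(center_lat, center_lon,
                                       along, length_km / 2)
        for across in (along + 90, along - 90):      # out to each corner
            corners.append(destination(mid_lat, mid_lon,
                                       across, width_km / 2))
    return corners
```

For the question’s inputs this would be called as `rectangle_corners(32.0, -77.0, 32.19, 16.09, 0)` after converting miles to kilometers.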

Answered By: MBo

With pyproj you have the tooling to make the calculation.

fwd() supports calculating the end point, given a start point, bearing, and distance.

You still need basic geometry to calculate the necessary distance/angles.
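For example, a sketch using pyproj’s Geod class with the question’s test point (10 statute miles ≈ 16093.4 m, i.e. half the 20-mile length, due north):

```python
from pyproj import Geod

geod = Geod(ellps="WGS84")  # geodesic calculations on the WGS84 ellipsoid

lat1, lon1 = 32.0, -77.0
bearing_deg = 0.0
dist_m = 16093.4  # 10 statute miles in metres

# fwd() takes lon/lat (in that order), a forward azimuth in degrees,
# and a distance in metres; it returns lon2, lat2, and the back azimuth.
lon2, lat2, back_az = geod.fwd(lon1, lat1, bearing_deg, dist_m)
print(lat2, lon2)  # close to the 32.14513, -77.0 the OP expected
```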

Answered By: Willem Hendriks

I ended up finding 2 answers.

The first, from python – Get lat/long given current point, distance and bearing, was simple. The Destination point given distance and bearing from start point formula works after all; I had just forgotten to convert the start point’s lat/long to radians, and then convert the answer back to degrees at the end.

The resulting code looks like:

import math

R = 6378.1   # Equatorial radius of the Earth in km
brng = 1.57  # Bearing: 90 degrees converted to radians (approx.)
d = 15       # Distance in km

# lat2  52.20444 - the lat result I'm hoping for
# lon2  0.36056  - the long result I'm hoping for

lat1 = math.radians(52.20472)  # Current lat point converted to radians
lon1 = math.radians(0.14056)   # Current long point converted to radians

lat2 = math.asin(math.sin(lat1) * math.cos(d / R) +
                 math.cos(lat1) * math.sin(d / R) * math.cos(brng))

lon2 = lon1 + math.atan2(math.sin(brng) * math.sin(d / R) * math.cos(lat1),
                         math.cos(d / R) - math.sin(lat1) * math.sin(lat2))

lat2 = math.degrees(lat2)
lon2 = math.degrees(lon2)

print(lat2)
print(lon2)

The second answer I found, from python – calculating a gps coordinate given a point, bearing and distance, uses geopy and is much simpler, so I ended up going with this as my preferred solution:

from geopy import Point
from geopy.distance import VincentyDistance

# given: lat1, lon1, bearing, distMiles
dest = VincentyDistance(miles=distMiles).destination(Point(lat1, lon1), bearing)
lat2, lon2 = dest.latitude, dest.longitude
Answered By: Charlie