Convert Latitude and Longitude Coordinates to Country Name in R

Thanks for the carefully constructed question.
It required just a couple of line changes to use rworldmap (which contains up-to-date country boundaries); see below. I'm not an expert on CRS, but I don't think the change I had to make to the proj4string makes any difference. Others might like to comment on that.

This worked for me and gave:

> coords2country(points)
[1] United Kingdom Belgium Germany Austria
[5] Republic of Serbia

All the best,
Andy

library(sp)
library(rworldmap)

# The single argument to this function, points, is a data.frame in which:
# - column 1 contains the longitude in degrees
# - column 2 contains the latitude in degrees
coords2country = function(points)
{
  countriesSP <- getMap(resolution = 'low')
  #countriesSP <- getMap(resolution = 'high') # you could use the high-res map from rworldxtra if you are concerned about detail

  # convert our list of points to a SpatialPoints object

  # pointsSP = SpatialPoints(points, proj4string = CRS(" +proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs +towgs84=0,0,0"))

  # setting the CRS directly to that from rworldmap
  pointsSP = SpatialPoints(points, proj4string = CRS(proj4string(countriesSP)))

  # use 'over' to get indices of the Polygons object containing each point
  indices = over(pointsSP, countriesSP)

  # return the ADMIN names of each country
  indices$ADMIN
  #indices$ISO3      # returns the ISO3 code
  #indices$continent # returns the continent (6-continent model)
  #indices$REGION    # returns the continent (7-continent model)
}
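
For example, calling the function on a small data.frame of approximate test points (these particular coordinates are just an illustration, roughly matching the output above) would look like this:

# approximate points in the UK, Belgium, Germany, Austria and Serbia
points <- data.frame(lon = c(-2, 4.5, 10, 14, 20.5),
                     lat = c(54, 50.5, 51, 47.6, 44))
coords2country(points)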

How to get the longitude and latitude coordinates from a city name and country in R?

With the following code I have successfully solved the problem.

library(RJSONIO)

nrow <- nrow(test)
counter <- 1
test$lon[counter] <- 0  # initialise the lon/lat columns
test$lat[counter] <- 0
while (counter <= nrow) {
  CityName <- gsub(' ', '%20', test$CityLong[counter]) # encode spaces for the URL
  CountryCode <- test$Country[counter]
  url <- paste(
    "http://nominatim.openstreetmap.org/search?city="
    , CityName
    , "&countrycodes="
    , CountryCode
    , "&limit=9&format=json"
    , sep = "")
  x <- fromJSON(url)
  if (is.vector(x)) {  # a result was returned for this city
    test$lon[counter] <- x[[1]]$lon
    test$lat[counter] <- x[[1]]$lat
  }
  counter <- counter + 1
}

As this calls an external service (openstreetmap.org), it can take a while for larger datasets. However, you probably only run it once in a while, when new cities have been added to the list.
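
The loop assumes a data.frame called test with (at least) a CityLong and a Country column. A minimal sketch of such an input (hypothetical values), together with a pause that keeps within Nominatim's roughly one-request-per-second usage policy, might look like this:

# Hypothetical input; real data would replace this
test <- data.frame(CityLong = c("New York", "Sao Paulo"),
                   Country  = c("us", "br"),
                   stringsAsFactors = FALSE)

# Inside the while loop, before incrementing the counter:
# Sys.sleep(1)  # be polite to the Nominatim servers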

R - Converting Coordinates to Countries? (Large quantity)

Although it's possible to do this with several different online APIs, the free reverse geocoding APIs tend to handle only one point at a time, and a large data frame like yours could take hours to work through.

I would be tempted to use an R package with country info and map the points to countries using sf or sp.

The following function returns a vector of country names given a vector of latitudes and a vector of longitudes:

get_countries <- function(long, lat) {
  points <- cbind(long, lat)
  countriesSP <- rworldmap::getMap(resolution = 'low')
  pointsSP <- sp::SpatialPoints(points, sp::CRS(sp::proj4string(countriesSP)))
  sp::over(pointsSP, countriesSP)$ADMIN
}
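
A quick sanity check with two hand-picked points (coordinates chosen here purely for illustration, roughly London and Paris):

get_countries(long = c(-0.13, 2.35), lat = c(51.51, 48.86))
## expected: United Kingdom, France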

So your code could be something like:

library(tidyverse)

meteor.original <- read_csv("../meteorite-landings.csv")

# Get meteor counts per country
meteor <- meteor.original %>%
  filter(year >= 860 & year <= 2016) %>%
  filter(reclong >= -180 & reclong <= 180 & (reclat != 0 | reclong != 0)) %>%
  mutate(latitude = as.numeric(gsub("^\\((.*), .*$", "\\1", GeoLocation)),
         longitude = as.numeric(gsub("^.*, (.*)\\)$", "\\1", GeoLocation)),
         country = get_countries(longitude, latitude)) %>%
  group_by(country) %>%
  count()

# Get a world map, left join the meteor counts and plot
sf::st_as_sf(rworldmap::getMap(res = "li")) %>%
  rename(country = ADMIN.1) %>%
  left_join(meteor, by = "country") %>%
  ggplot() +
  geom_sf(aes(fill = log(n)), colour = NA) +
  scale_fill_viridis_c(na.value = "black",
                       breaks = log(c(1, 10, 100, 1000, 20000)),
                       labels = exp,
                       name = "Meteorite\nCount") +
  theme_void()

[Resulting plot: a world map with countries shaded by log meteorite count]

Latitude Longitude Coordinates to State Code in R

Here are two options, one using sf and one using sp package functions. sf is the more modern (and, here in 2020, recommended) package for analyzing spatial data, but in case it's still useful, I am leaving my original 2012 answer showing how to do this with sp-related functions.


Method 1 (using sf):

library(sf)
library(spData)

## pointsDF: A data.frame whose first column contains longitudes and
## whose second column contains latitudes.
##
## states: An sf MULTIPOLYGON object with 50 states plus DC.
##
## name_col: Name of a column in `states` that supplies the states'
## names.
lonlat_to_state <- function(pointsDF,
                            states = spData::us_states,
                            name_col = "NAME") {
  ## Convert points data.frame to an sf POINTS object
  pts <- st_as_sf(pointsDF, coords = 1:2, crs = 4326)

  ## Transform spatial data to some planar coordinate system
  ## (e.g. Web Mercator) as required for geometric operations
  states <- st_transform(states, crs = 3857)
  pts <- st_transform(pts, crs = 3857)

  ## Find names of state (if any) intersected by each point
  state_names <- states[[name_col]]
  ii <- as.integer(st_intersects(pts, states))
  state_names[ii]
}

## Test the function with points in Wisconsin, Oregon, and France
testPoints <- data.frame(x = c(-90, -120, 0), y = c(44, 44, 44))
lonlat_to_state(testPoints)
## [1] "Wisconsin" "Oregon" NA

If you need higher resolution state boundaries, read in your own vector data as an sf object using sf::st_read() or by some other means. One nice option is to install the rnaturalearth package and use it to load a state vector layer from rnaturalearthhires. Then use the lonlat_to_state() function we just defined as shown here:

library(rnaturalearth)
us_states_ne <- ne_states(country = "United States of America",
                          returnclass = "sf")
lonlat_to_state(testPoints, states = us_states_ne, name_col = "name")
## [1] "Wisconsin" "Oregon" NA

For very accurate results, you can download a geopackage containing GADM-maintained administrative borders for the United States from the GADM website. Then load the state boundary data and use it like this:

USA_gadm <- st_read(dsn = "gadm36_USA.gpkg", layer = "gadm36_USA_1")
lonlat_to_state(testPoints, states = USA_gadm, name_col = "NAME_1")
## [1] "Wisconsin" "Oregon" NA

Method 2 (using sp):

Here is a function that takes a data.frame of lat-longs within the lower 48 states, and for each point, returns the state in which it is located.

Most of the function simply prepares the SpatialPoints and SpatialPolygons objects needed by the over() function in the sp package, which does the real heavy lifting of calculating the 'intersection' of points and polygons:

library(sp)
library(maps)
library(maptools)

# The single argument to this function, pointsDF, is a data.frame in which:
# - column 1 contains the longitude in degrees (negative in the US)
# - column 2 contains the latitude in degrees

lonlat_to_state_sp <- function(pointsDF) {
  # Prepare SpatialPolygons object with one SpatialPolygon
  # per state (plus DC, minus HI & AK)
  states <- map('state', fill = TRUE, col = "transparent", plot = FALSE)
  IDs <- sapply(strsplit(states$names, ":"), function(x) x[1])
  states_sp <- map2SpatialPolygons(states, IDs = IDs,
                                   proj4string = CRS("+proj=longlat +datum=WGS84"))

  # Convert pointsDF to a SpatialPoints object
  pointsSP <- SpatialPoints(pointsDF,
                            proj4string = CRS("+proj=longlat +datum=WGS84"))

  # Use 'over' to get _indices_ of the Polygons object containing each point
  indices <- over(pointsSP, states_sp)

  # Return the state names of the Polygons object containing each point
  stateNames <- sapply(states_sp@polygons, function(x) x@ID)
  stateNames[indices]
}

# Test the function using points in Wisconsin and Oregon.
testPoints <- data.frame(x = c(-90, -120), y = c(44, 44))

lonlat_to_state_sp(testPoints)
[1] "wisconsin" "oregon" # IT WORKS

Fastest way to determine COUNTRY from millions of GPS coordinates [R]

There are two similar questions, linked in my comments above, that ask how to get country names from coordinates. Here the OP is asking which is the fastest way to do the task.

Based on the posts, we have three options.

  1. to use the custom function in this question;
  2. to use the geonames package (a rough sketch follows this list); or
  3. to use map.where() in the maps package.
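
For reference, the geonames route queries a web service one point at a time and requires a registered username, so I did not time it; a rough sketch (with a placeholder username) would be:

library(geonames)
options(geonamesUsername = "your_username")  # placeholder; register at geonames.org
GNcountryCode(lat = 52.52, lng = 13.40)      # returns the country code/name for one point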

The second option needs a bit of setup (see the sketch above), so I just tested map.where(). The following is the result. As the OP said, this function works much faster.

library(maps)
library(data.table)  # provides data.table() and :=

set.seed(111)
data <- data.table(latitude  = sample(seq(47, 52, by = 0.001), 1000000, replace = TRUE),
                   longitude = sample(seq(8, 23, by = 0.001), 1000000, replace = TRUE))

system.time(data[, country := map.where(x = longitude, y = latitude)])

#   user  system elapsed
#   7.20    0.05    7.29
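
If you want to compare this against an sf-based route, a sketch along the following lines could be timed the same way (untested here; it assumes the rnaturalearth and sf packages are installed and reuses the data object built above):

library(sf)
world <- rnaturalearth::ne_countries(returnclass = "sf")
pts   <- st_as_sf(data, coords = c("longitude", "latitude"), crs = 4326)
system.time(
  joined <- st_join(pts, world["name"], join = st_intersects)
)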

Find in which country each point-coordinate belongs to

Try using map.where()

library(maps)

COORD$Country <- map.where(database="world", COORD$LON, COORD$LAT)
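
Here COORD is assumed to be a data.frame with LON and LAT columns in decimal degrees, for example (points chosen purely for illustration):

COORD <- data.frame(LON = c(2.35, 13.40), LAT = c(48.86, 52.52))  # Paris, Berlin
map.where(database = "world", COORD$LON, COORD$LAT)
## e.g. "France" "Germany"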

Retrieving latitude/longitude coordinates for cities/countries that have since changed names?

This might not be what you had in mind, but if you use the exact same code with only the city names (and not the countries), at least the two cases that you mentioned (Sarajevo and Leningrad) seem to work fine. You could try to run the function with a modified locations vector including just the city names, and see if you still get errors. Something like this:

(cities <- gsub(',.*', '', locations))

## [1] "Paris" "Sarajevo" "Rome" "Leningrad" "St Petersburg"

cbind(ggmap::geocode(cities, source = 'dsk'), cities)

##        lon      lat        cities
## 1  2.34880 48.85341         Paris
## 2 18.35644 43.84864      Sarajevo
## 3 12.48390 41.89474          Rome
## 4 30.26417 59.89444     Leningrad
## 5 30.26417 59.89444 St Petersburg

