Let’s load in the packages that we will use today. Note that while we will be using the terra package throughout our analysis today, we will not attach it with library(). Instead, when we want to use a function from the terra package we will use the double-colon notation; for example, terra::plot() calls the plot() function from the terra package. We do this because a number of the functions in the terra package have names that are identical to functions in the tidyverse packages. This can create conflicts/issues with your code, and we can avoid these issues by calling terra functions using the double-colon notation.
library(tidyverse)
library(tmap)
library(sf)
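As a quick illustration of the double-colon notation, here is a minimal sketch (the object name r_demo is just for illustration, and it assumes terra is installed even though we never attach it with library()):
# Build a tiny 10 x 10 demo raster and plot it with terra's plot method
r_demo <- terra::rast(nrows = 10, ncols = 10, vals = 1:100)
terra::plot(r_demo)  # unambiguously terra's plot(), not base R's plot()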
Land surface elevation (topographic) data is frequently useful/required for environmental and geoscience projects. Obtaining research-quality topographic data can be somewhat time consuming and cumbersome; however, there are packages that allow you to seamlessly download this data directly into R. Today you’ll be introduced to the elevatr package, which allows you to seamlessly download elevation rasters for anywhere in the world. This package pulls data from several research-quality datasets. More information on the package and the underlying datasets is available here.
First let’s load in the elevatr package. Let’s also load in the tigris package, which we’ve used in the past. The tigris package will be handy for getting borders (e.g., US state, US county) that we will use to define the area of interest for which we would like to download elevation rasters.
Note: If you haven’t already done so, you will need to first install both the elevatr and tigris packages.
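If you need to install them, a single call in your console will do it (this is only needed once per machine):
install.packages(c("elevatr", "tigris"))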
library(elevatr)
library(tigris)
Let’s get the county boundaries for NY state and then use filter() to get just Schenectady County.
NY_counties <- counties(state = "NY",
cb = TRUE)
county_schenectady <- NY_counties %>%
filter(NAME == "Schenectady")
Now that we have the area of interest, let’s obtain the elevation raster for that area. We will use the get_elev_raster() function from the elevatr package. The first argument in the function is an sf object that represents our area of interest (i.e., Schenectady County). The second argument, z, specifies the resolution of the raster data that we would like to download. A value of 1 specifies the lowest resolution and a value of 14 is the highest resolution available. Note that downloading a very high resolution raster for a large area will take much longer (and be a much larger file size). The clip argument specifies whether you want to clip the raster to the spatial object. By specifying clip = "locations" the elevation raster will be clipped to the boundaries of Schenectady County. FYI, when you run the code you will see a download progress bar in your console.
raster_elev <- get_elev_raster(county_schenectady,
z = 12,
clip = "locations")
## Mosaicing & Projecting
## Clipping DEM to locations
## Note: Elevation units are in meters.
Let’s quickly plot the data to confirm that we’ve downloaded what we expected.
terra::plot(raster_elev)
Just to highlight how the raster resolution makes a difference, below I am showing the raster for Schenectady County where z = 6 (low resolution). You can see it has much less detail than the raster above where z = 12 (high resolution).
Low resolution
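For reference, here is a sketch of how the low-resolution version above could be produced; it is the same call as before, just with z = 6 (the object name raster_elev_lowres is only for illustration):
# Same area of interest, but at a much coarser resolution
raster_elev_lowres <- get_elev_raster(county_schenectady,
                                      z = 6,
                                      clip = "locations")
terra::plot(raster_elev_lowres)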
Pretty cool! We’ve quickly and seamlessly obtained raster elevation data for our area of interest. You can try modifying the code above to test out the get_elev_raster() function some more and to try it on different areas.
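For example, a hypothetical variation that grabs a moderate-resolution raster for a different county (Albany County is just one possibility):
# Hypothetical example: elevation raster for Albany County, NY
county_albany <- NY_counties %>%
  filter(NAME == "Albany")
raster_elev_albany <- get_elev_raster(county_albany,
                                      z = 10,
                                      clip = "locations")
terra::plot(raster_elev_albany)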
The get_elev_raster() function can be used for any location around the world. To demonstrate this, let’s obtain raster elevation data for Lesotho (a country in southern Africa).
To obtain high-resolution border data we will use the rnaturalearth package. This package lets you easily download a range of spatial data into R. More info on the package can be found here. FYI, you will need to install rnaturalearth if you haven’t yet installed it.
library(rnaturalearth)
You should also install the rnaturalearthhires package. To do this you will need to first run the following line of code in your console.
install.packages("remotes")
Then you should run the following line of code in your console.
remotes::install_github("ropensci/rnaturalearthhires")
Now let’s download the country border for Lesotho.
borders_hires <- ne_countries(country = "Lesotho",
scale = "large",
returnclass = "sf")
Following the same approach as we did with Schenectady County, we will now get elevation data for the country of Lesotho.
raster_elev_lesotho <- get_elev_raster(borders_hires,
z = 9,
clip = "locations")
## Mosaicing & Projecting
## Clipping DEM to locations
## Note: Elevation units are in meters.
Let’s use tmap to make a quick map of the topography of Lesotho.
raster_elev_lesotho %>%
tm_shape() +
tm_raster(style = "cont", palette = terrain.colors(n = 10)) +
tm_shape(borders_hires) +
tm_borders()
## stars object downsampled to 1080 by 926 cells. See tm_shape manual (argument raster.downsample)
In addition to elevation data, we frequently need gridded (raster) climatic/meteorological data for environmental projects. Once again, R comes to the rescue and allows you to seamlessly query and download a range of research-quality gridded climatic/meteorological datasets. One package that is particularly useful in this area is climateR. The climateR package gives you access to data from a host of datasets including gridMET (US meteorological data) and TerraClim (global meteorological data). Each of these datasets has its own set of variables (e.g., precipitation, temperature, barometric pressure,…) and its own characteristics (e.g., spatial and temporal resolution, underlying data used to generate the gridded data product). When conducting research you would want to be aware of these technical details and choose the best available data for your application. To learn more about the climateR package and the data it can query, you can go to the following website, which provides a general overview.
Once you have the remotes package installed (which you should have done earlier in today’s lecture), you should then install the climateR package by running the following code in your console.
remotes::install_github("mikejohnson51/climateR")
After the climateR package has been installed, you can load it in using the code below.
library(climateR)
Let’s use the ne_states() function from the rnaturalearth package, which will allow us to download high-resolution borders for the US. We already installed and loaded this package earlier in today’s lecture.
border_NY <- ne_states(country = "United States of America", returnclass = "sf")
border_NY <- border_NY %>%
filter(name == "New York")
gridMET is a gridded meteorological dataset for the United States. It is a high-quality dataset that is widely used in climate and environmental research applications. More details on the data are available here.
Using the climateR package you can quickly and efficiently load gridMET data right into R. Let’s first see what variables are available for download.
catalog %>%
filter(id == "gridmet") %>%
select(varname, units, description)
Let’s go ahead and download the gridded data for the daily minimum air temperature for the state of NY. The gridMET data is available at daily temporal resolution.
To download the data we will use the climateR package. Let’s just download one day of data (FYI, you can easily download multiple days if you want to). Since Feb 4th, 2023 was an extremely cold day in NY, let’s get data for that day.
To get gridMET data we use the aptly named getGridMET() function. The first argument (AOI = border_NY) specifies a spatial polygon (e.g., a border) that defines the area over which to get the data. The second argument specifies the gridMET variable(s) to download. The third argument specifies the date for which to download data. FYI, to download data over more than one day, you would add another argument that specifies the endDate to use (see the sketch just after the code below).
p <- getGridMET(AOI = border_NY,
varname = c("tmmn"),
startDate = "2023-02-04"
)
p <- p$daily_minimum_temperature
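As noted above, requesting more than one day only requires adding an endDate. Here is a sketch of what a one-week request might look like (the object name p_week is just for illustration):
# Hypothetical multi-day request: daily minimum temperature for Feb 1-7, 2023
p_week <- getGridMET(AOI = border_NY,
                     varname = c("tmmn"),
                     startDate = "2023-02-01",
                     endDate = "2023-02-07")
p_week <- p_week$daily_minimum_temperature  # one raster layer per day
We will continue below with the single-day raster p.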
Since the gridMET data is obtained as tiles (large rectangular areas that cover your area of interest), we often want to crop and mask the data to keep just the pixels that fall within our specified area. Let’s go ahead and do this.
p <- terra::crop(p, border_NY)
p <- terra::mask(p, border_NY)
## Warning: [mask] CRS do not match
p
## class : SpatRaster
## dimensions : 108, 189, 1 (nrow, ncol, nlyr)
## resolution : 0.04166669, 0.04166669 (x, y)
## extent : -79.74583, -71.87083, 40.50417, 45.00417 (xmin, xmax, ymin, ymax)
## coord. ref. : +proj=longlat +ellps=WGS84 +no_defs
## source(s) : memory
## name : tmmn_2023-02-04
## min value : 234.8
## max value : 261.8
## unit : K
## time : 2023-02-04 UTC
You can see above that the temperature units are kelvin (K). To convert to degrees Celsius we simply need to subtract 273.15 from the raster.
p <- p - 273.15
Now that we have our data and have cropped/masked it to our area of interest, let’s make a map of the data.
p %>%
tm_shape() +
tm_raster(palette = "Blues", n = 10) +
tm_shape(border_NY) +
tm_borders()
gridMET provides excellent daily meteorological data for the US; however, if we want gridded met data for outside of the US we will need to rely on another data product. TerraClim provides two sets of gridded climate data, namely climate normals and climate monthlies.
Climate normals are gridded monthly average values for a given time period (e.g., 1981-2010). Thus the January normal of precipitation would be a gridded raster of average precipitation, where the average was taken across all of the Januaries between 1981 and 2010.
The climate monthlies are gridded monthly data for a given year-month. Thus you could obtain the gridded precipitation data for January 2020 (or February 2020, or March 1995,…).
Let’s go ahead and get the climate normals for precipitation for the country of Lesotho.
First, we will use the ne_countries() function from the rnaturalearth package to get the border of the country of Lesotho.
borders_hires <- rnaturalearth::ne_countries(country = "Lesotho",
scale = "large",
returnclass = "sf")
The variables available from the TerraClim normals are the following:
catalog %>%
filter(id == "terraclim_normals") %>%
select(varname, units, description)
Now we will use the getTerraClimNormals() function from climateR to get the TerraClim precipitation normals. The AOI argument specifies the spatial object that describes the area of interest. The varname argument specifies the variable to download. The scenario argument specifies the 30-year normal period to use. The month argument specifies which months to obtain. Below we specified month = 1:12, so we will download all twelve months of average precipitation (averaged over the time period of 1981-2010).
climate_raster <- getTerraClimNormals(
AOI = borders_hires,
varname = "ppt",
scenario = "19812010",
month = 1:12,
verbose = T,
dryrun = FALSE
)
## source: http://thredds.northwestknowledge.net:8080/thredds/dodsC/TER...
## varname(s):
## > ppt [mm] (precipitation_amount)
## ==================================================
## diminsions: 59, 51, 12 (names: lon,lat,time)
## resolution: 0.042, 0.042, 1 month
## extent: 27, 29.46, -30.67, -28.54 (xmin, xmax, ymin, ymax)
## crs: +proj=longlat +a=6378137 +f=0.00335281066474748 +p...
## time: 1961-01-01 to 1961-12-01
## ==================================================
## values: 36,108 (vars*X*Y*T)
Since climateR returns the data as a list of rasters, we will run the following code to obtain the raster from the list object. This just makes the data easier to deal with in the following steps, as we now have climate_raster as a raster object.
climate_raster <- climate_raster$ppt
Now let’s crop and mask the raster.
climate_raster <- terra::crop(climate_raster, borders_hires)
climate_raster <- terra::mask(climate_raster, borders_hires)
## Warning: [mask] CRS do not match
And then make a map.
climate_raster %>%
tm_shape() +
tm_raster(palette = "RdBu", n = 10) +
tm_shape(borders_hires) +
tm_borders()
Pretty cool! We have a faceted map showing the monthly average precipitation (for the period of 1981-2010) for the country of Lesotho. You could easily modify the above code to obtain precipitation (or other meteorological data) for a different area of interest.
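As one hypothetical variation, swapping in a different variable from the catalog listed above is a one-argument change (check the catalog output to confirm the exact variable name; tmax is assumed here):
# Hypothetical: monthly maximum temperature normals for Lesotho
tmax_normals <- getTerraClimNormals(AOI = borders_hires,
                                    varname = "tmax",
                                    scenario = "19812010",
                                    month = 1:12)
tmax_normals <- tmax_normals$tmax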
Recalling what we learned in a past class lesson, we know that we can perform mathematical operations on rasters. It would be useful to sum up the monthly precipitation at each pixel of the raster to obtain the annual average precipitation for each pixel. This would then give us a raster of the annual average precipitation for Lesotho.
Let’s create this new raster by taking the sum across the twelve monthly rasters.
lesotho_tot_prcp <- sum(climate_raster)
Now let’s make a map to confirm that we did what we had expected to do. Note that within the lesotho_tot_prcp object, the default layer name is sum, and thus we need to refer to that layer in the code below.
lesotho_tot_prcp$sum %>%
tm_shape() +
tm_raster(palette = "RdBu", n = 10) +
tm_shape(borders_hires) +
tm_borders()
Now let’s use the climateR package to get climate monthlies for a given year-month. Let’s obtain the gridded total precipitation for January 2021 for the country of Lesotho.
climate_raster_monthly <- getTerraClim(AOI = borders_hires,
varname = "ppt",
startDate = "2021-01-01")
climate_raster_monthly <- climate_raster_monthly$ppt
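If you wanted several months rather than a single month, you could add an endDate, as in the sketch below (the object name climate_raster_2021 is only for illustration). We will continue with the single-month raster for the rest of this example.
# Hypothetical: monthly precipitation for all of 2021
climate_raster_2021 <- getTerraClim(AOI = borders_hires,
                                    varname = "ppt",
                                    startDate = "2021-01-01",
                                    endDate = "2021-12-01")
climate_raster_2021 <- climate_raster_2021$ppt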
Now let’s crop and mask the raster.
climate_raster_monthly <- terra::crop(climate_raster_monthly, borders_hires)
climate_raster_monthly <- terra::mask(climate_raster_monthly, borders_hires)
## Warning: [mask] CRS do not match
Now let’s make a map of the precipitation data for Jan 2021.
climate_raster_monthly %>%
tm_shape() +
tm_raster(palette = "RdBu", n = 10) +
tm_shape(borders_hires) +
tm_borders()
Explore the elevatr and climateR packages. Try to download additional variables for different areas of interest. Make some new/modified meteorological and/or topographic maps. Come up with some interesting questions that you can use these packages to help you answer.
We will see some additional topics in spatial and raster analysis in the coming lectures, though this is a large topic and we will just scratch the surface in this class. Let me know if you want to try something else but aren’t sure how to move forward. I can also point you to additional data. A great place to quickly find some interesting rasters is the NASA Earth Observations website. You can quickly view rasters and download data for preliminary analysis at this site. The site also provides information and links to the research-quality versions of the data if you are interested (though for most initial analyses the data at the link I provide are sufficient).