Let’s load in the packages that we will use today. Note that while we will be using the raster package throughout our analysis today, we will not load it with library(). Instead, when we want to use a function from the raster package, we will use the double-colon notation; for example, raster::plot() calls the plot() function from the raster package. We do this because a number of functions in the raster package have names that are identical to functions in the tidyverse packages. This can create conflicts in your code, and we can avoid these issues by calling raster functions with the double-colon notation.
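For example, here is a minimal sketch (assuming the raster package is installed) of why the notation matters: raster and the tidyverse both define functions named extract() and select(), so the double-colon form makes it unambiguous which one is called.
# Minimal sketch of the name clash: tidyr::extract() and raster::extract()
# share a name, so we spell out which package's function we mean.
r_demo <- raster::raster(nrows = 10, ncols = 10, vals = 1:100)  # small demo raster
raster::extract(r_demo, cbind(0, 0))  # raster's extract(): value at lon/lat (0, 0)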
library(tidyverse)
library(tmap)
library(sf)
Land surface elevation (topographic) data is frequently useful or required for environmental and geoscience projects. Obtaining research-quality topographic data can be time consuming and cumbersome; however, there are packages that allow you to seamlessly download this data directly into R. Today you’ll be introduced to the elevatr package, which allows you to download elevation rasters for anywhere in the world. This package pulls data from several research-quality datasets. More information on the package and the underlying datasets is available here.
First let’s load in the elevatr package. Let’s also load in the tigris package, which we’ve used in the past. The tigris package is handy for getting borders (e.g., US state and US county boundaries) that we will use to define the area of interest for which we would like to download elevation rasters.
library(elevatr)
library(tigris)
## To enable caching of data, set `options(tigris_use_cache = TRUE)`
## in your R script or .Rprofile.
Let’s get the county boundaries for NY state and then use
filter()
to get just Schenectady county.
NY_counties <- counties(state = "NY",
                        cb = TRUE)
county_schenectady <- NY_counties %>%
  filter(NAME == "Schenectady")
Now that we have the area of interest, let’s obtain the elevation raster for that area. We will use the get_elev_raster() function from the elevatr package. The first argument in the function is an sf object that represents our area of interest (i.e., Schenectady county). The second argument, z, specifies the resolution of the raster data that we would like to download. A value of 1 specifies the lowest resolution and a value of 14 the highest resolution available. Note that downloading a very high resolution raster for a large area will take much longer (and result in a much larger file). The clip argument specifies whether you want to clip the raster to the spatial object. By specifying clip = "locations" the elevation raster will be clipped to the boundaries of Schenectady county. FYI, when you run the code you will see a download progress bar in your console.
raster_elev <- get_elev_raster(county_schenectady,
                               z = 12,
                               clip = "locations")
## Mosaicing & Projecting
## Clipping DEM to locations
## Note: Elevation units are in meters.
Let’s quickly plot the data to confirm that we’ve downloaded what we expected.
raster::plot(raster_elev)
Pretty cool - we’ve quickly and seamlessly obtained raster elevation data for our area of interest. Try modifying the code above to test out the get_elev_raster() function some more and to try it on different areas.
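For instance, here is one possible modification (a sketch using the objects created above): download a coarser raster (z = 7) for all of the NY counties rather than just Schenectady county.
# Sketch: a lower-resolution elevation raster for all NY counties
raster_elev_NY <- get_elev_raster(NY_counties,
                                  z = 7,
                                  clip = "locations")
raster::plot(raster_elev_NY)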
The get_elev_raster() function can be used for any location around the world. To demonstrate this, let’s obtain raster elevation data for the country of Lesotho (located in southern Africa).
To obtain high-resolution border data we will use the rnaturalearth package. This package lets you easily download a range of spatial data into R. More info on the package can be found here. FYI, you will need to install rnaturalearth if you haven’t yet installed it.
library(rnaturalearth)
Now let’s download the country border for Lesotho.
borders_hires <- ne_countries(country = "Lesotho",
                              scale = "large",
                              returnclass = "sf")
Following the same approach as we did with Schenectady county, we will now get elevation data for the country of Lesotho.
raster_elev <- get_elev_raster(borders_hires,
                               z = 9,
                               clip = "locations")
## Mosaicing & Projecting
## Clipping DEM to locations
## Note: Elevation units are in meters.
Let’s use tmap
to make a quick map of the topography of
Lesotho.
raster_elev %>%
  tm_shape() +
  tm_raster(style = "cont", palette = terrain.colors(n = 10)) +
  tm_shape(borders_hires) +
  tm_borders()
## stars object downsampled to 1080 by 926 cells. See tm_shape manual (argument raster.downsample)
In addition to elevation data, we frequently need gridded (raster) climatic/meteorological data for environmental projects. Once again, R comes to the rescue and allows you to seamlessly query and download a range of research-quality gridded climatic/meteorological datasets. One package that is particularly useful in this area is climateR. The climateR package gives you access to data from a host of datasets, including PRISM (US meteorological data) and TerraClim (global meteorological data). Each of these datasets has its own set of variables (e.g., precipitation, temperature, barometric pressure, …) and its own characteristics (e.g., spatial and temporal resolution, underlying data used to generate the gridded data product). When conducting research you would want to be aware of these technical details and choose the best available data for your application. To learn more about the climateR package and the data it can query, you can go to the following website, which provides a general overview.
To download the climateR package you will first need the remotes package, so go ahead and install it if you haven’t already. Once the remotes package is installed, run the following code in your Console.
remotes::install_github("mikejohnson51/climateR")
After the climateR
package has been installed, you can
load it in using the code below.
library(climateR)
##
## Attaching package: 'climateR'
## The following object is masked from 'package:readr':
##
## parse_date
Let’s run this code to change a setting that will allow us to
properly query data when using the climateR
package.
sf::sf_use_s2(FALSE)
## Spherical geometry (s2) switched off
Let’s also load in the USAboundaries
package, which will
allow us to download high-resolution borders for the US. You will likely
need to install this package first.
library(USAboundaries)
PRISM is a gridded meteorological dataset for the United States. This dataset is produced and managed by a research group at Oregon State University and is a high-quality dataset that is widely used in climate and environmental research applications. More details on the data are available here.
Using the climateR
package you can quickly and
efficiently load PRISM data right into R. Let’s first see what variables
are available for download.
params %>%
  filter(id == "prism_daily") %>%
  select(varname, units, description)
Let’s go ahead and download the gridded data for the daily minimum air temperature for the state of NY. The PRISM data is available at daily temporal resolution.
First let’s use the USAboundaries
package to get the NY
state border. We will later use this border to specify the area over
which we would like to download the PRISM data.
border_NY <- us_states(states = "NY")
Ok, now let’s use the climateR package to get the data. Let’s just download one day of data (FYI, you can easily download multiple days if you want to). Since Feb 4th, 2023 was an extremely cold day in NY, let’s get data for that day.
To get PRISM data we use the aptly named getPRISM() function. The first argument (AOI = border_NY) specifies a spatial polygon (e.g., a border) that defines the area over which to get the data. The second argument specifies the PRISM variable(s) to download. The third argument specifies the date for which to download data. FYI, to download data over more than one day, you would add another argument that specifies the endDate to use.
p <- getPRISM(AOI = border_NY,
              varname = c('tmin'),
              startDate = "2023-02-04")
p <- p$tmin
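As a sketch of the multi-day case mentioned above (dates chosen just for illustration), adding the endDate argument pulls a range of days, and the returned object then contains one layer per day.
# Sketch: same query over a multi-day window (Feb 3-4, 2023) using endDate
p_multi <- getPRISM(AOI = border_NY,
                    varname = c('tmin'),
                    startDate = "2023-02-03",
                    endDate = "2023-02-04")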
Since the PRISM data is obtained as tiles (large rectangular area that covers your area of interest), we often want to crop and mask the data to keep just the pixels that fall within our specified area. Let’s go ahead and do this.
p <- raster::crop(p, border_NY)
p <- raster::mask(p, border_NY)
Now that we have our data and have cropped/masked it to our area of interest, let’s make a map of the data.
p %>%
  tm_shape() +
  tm_raster(palette = "Blues", n = 10) +
  tm_shape(border_NY) +
  tm_borders()
PRISM provides excellent daily meteorological data for the US; however, if we want gridded met data for outside of the US we will need to rely on another data product. TerraClim provides two sets of gridded climate data, namely climate normals and climate monthlies.
Climate normals are gridded monthly average values for a given time period (e.g., 1981-2010). Thus the January normal of precipitation would be a gridded raster of average precipitation where the average was taken across all of the Januaries between 1981-2010.
The climate monthlies are gridded monthly data for a given year-month. Thus you could obtain the gridded precipitation data for January 2020 (or February 2020, or March 1995,…).
Let’s go ahead and get the climate normals for precipitation for the country of Lesotho.
First, we will use the ne_countries()
function from the
rnaturalearth
package to get the border of the country of
Lesotho.
borders_hires <- rnaturalearth::ne_countries(country = "Lesotho",
                                             scale = "large",
                                             returnclass = "sf")
The variables available from the TerraClim normals are the following:
params %>%
  filter(id == "terraclim_normals") %>%
  select(varname, units, description)
Now we will use the getTerraClimNormals() function from climateR to get the TerraClim precipitation normals. The AOI argument specifies the spatial object that describes the area of interest. The varname argument specifies the variable to download. The scenario argument specifies the 30-year normal period to use. The month argument specifies which months to obtain. Below we specified month = 1:12, so we will download all twelve months of average precipitation (averaged over the time period of 1981-2010).
climate_raster <- getTerraClimNormals(
  AOI = borders_hires,
  varname = "ppt",
  scenario = "19812010",
  month = 1:12,
  verbose = FALSE,
  dryrun = FALSE
)
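If you are not sure what the call returned, a quick optional check (a minimal sketch) is names(), which lists the elements of the returned object so you know which one to extract in the next step.
# Optional sanity check: list the elements climateR returned
names(climate_raster)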
Since climateR
returns the data as a list of rasters, we
will run the following code to obtain the raster from the list object.
This just makes the data easier to deal with in the following steps, as
we now have climate_raster
as a raster object.
climate_raster <- climate_raster$ppt_19812010
Now let’s crop and mask the raster.
climate_raster <- raster::crop(climate_raster, borders_hires)
climate_raster <- raster::mask(climate_raster, borders_hires)
And then make a map.
climate_raster %>%
  tm_shape() +
  tm_raster(palette = "RdBu", n = 10) +
  tm_shape(borders_hires) +
  tm_borders()
Pretty cool! We have a faceted map showing the monthly average precipitation (for the period of 1981-2010) for the country of Lesotho. You could easily modify the above code to obtain precipitation (or other meteorological data) for a different area of interest.
Recalling what we learned in a past class lesson, we know that we can perform mathematical operations on rasters. It would be useful to sum up the monthly precipitation at each pixel of the raster to obtain the annual average precipitation for each pixel. This would then give us a raster of the annual average precipitation for Lesotho.
Let’s create this new raster by taking the sum across the twelve monthly rasters.
lesotho_tot_prcp <- sum(climate_raster)
Now let’s make a map to confirm that we did what we had expected to
do. Note that within the lesotho_tot_prcp
object, the
default layer name is sum
, and thus we need to refer to
that layer in the code below.
lesotho_tot_prcp$sum %>%
  tm_shape() +
  tm_raster(palette = "RdBu", n = 10) +
  tm_shape(borders_hires) +
  tm_borders()
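As another example of raster math (a sketch, assuming mean() behaves like sum() did on this object), you could instead take the mean across the twelve monthly layers to get the average monthly precipitation at each pixel; checking names() tells you what the resulting layer is called before you map it.
# Sketch: mean monthly precipitation per pixel across the twelve layers
lesotho_mean_prcp <- mean(climate_raster)
names(lesotho_mean_prcp)  # confirm the layer name before mapping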
Now let’s use the climateR
package to get climate
monthlies for a given year-month. Let’s obtain the gridded total
precipitation for January 2021 for the country of Lesotho.
climate_raster_monthly <- getTerraClim(AOI = borders_hires,
                                       varname = "ppt",
                                       startDate = "2021-01-01")
climate_raster_monthly <- climate_raster_monthly$ppt_total
Now let’s crop and mask the raster.
climate_raster_monthly <- raster::crop(climate_raster_monthly, borders_hires)
climate_raster_monthly <- raster::mask(climate_raster_monthly, borders_hires)
Now let’s make a map of the precipitation data for Jan 2021.
climate_raster_monthly %>%
  tm_shape() +
  tm_raster(palette = "RdBu", n = 10) +
  tm_shape(borders_hires) +
  tm_borders()
Explore the elevatr and climateR packages. Try to download additional variables for different areas of interest. Make some new or modified meteorological and/or topographic maps. Come up with some interesting questions that you can use these packages to help answer.
We will see some additional topics in spatial and raster analysis in the coming lectures, though this is a large topic and we will just scratch the surface in this class. Let me know if you want to try something else but aren’t sure how to move forward. I can also point you to additional data. A great place to quickly find some interesting rasters is the NASA Earth Observations website. You can quickly view rasters and download data for preliminary analysis at this site. The site also provides information and links to the research-quality versions of the data if you are interested (though for most initial analyses the data at the link I provide are sufficient).