
Module 5 Script

Module 5: Applying the argoFloats R Package
Module 5 Overview
Slide 1: Title Slide
In this section, you will use Argo data to determine whether the ocean temperature near Australia is warming. You will then consider why studies like this are important.

M5 Lesson 1: Argo Data Analysis


Slide 1: Title Slide
Slide 2: Analysis Using oce
Until this point, just to do a quick recap before we move on to the final section of this workshop: we have used getIndex() to download the index of available Argo profiles; we subsetted that index by both rectangle and polygon, using the help pages to guide us along the way; we then moved on to getProfiles() to create a list of profiles to download, followed by readProfiles() to download and read those profiles. We then moved on to our quality control workflow, which was to plot the quality of the data (in our case we just looked at one specific ID); then we used showQCTests() to show why certain data were flagged bad; and finally we cleaned up the data using applyQC(), which says that regardless of why a datum is flagged bad, we want to get rid of it so that it no longer affects our future plots or calculations.
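
In rough outline, that workflow might look like the following sketch. The polygon coordinates here are placeholders for illustration, not the exact region used in the demonstration:

library(argoFloats)
indexAll <- getIndex()                            # index of available Argo profiles
index <- subset(indexAll,                         # subset by polygon (illustrative coordinates)
                polygon=list(longitude=c(144, 150, 150, 144),
                             latitude=c(-45, -45, -40, -40)))
profiles <- getProfiles(index)                    # list of profiles to download
argos <- readProfiles(profiles)                   # download and read the profiles
plot(argos, which="QC", parameter="temperature")  # visual check of data quality
argosClean <- applyQC(argos)                      # remove data flagged as bad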

So now that we have cleaned up our data in our polygon around Australia, the goal is to use the Argo data to determine if the upper 200 meters of the ocean near Australia is warming. Then we're going to compare our result to a study that has already been completed.

Slide 3: Demonstration
Our first step, in line 61, is to extract the time from all of the Argo profiles. We do this using lapply(), which we discussed when working with the BATS data. You'll notice that this time the lapply() includes a [1]. This is because in some cases Argo profiles have multiple cycles, which is beyond the scope of this course. In line 62, we're simply unlisting the list of times. In line 66 we're defining our parameter to be "temperature", as our goal is to look at trends in temperature over time. It's important to remember that, depending on what type of analysis you do in the future, you could define your parameter to be anything you're interested in.
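
As a rough sketch, the steps in lines 61 through 66 might look like this; the variable argosClean and the [["argos"]] accessor carry over from the earlier sketch and are assumptions, not a reproduction of the demonstration code:

cycles <- argosClean[["argos"]]                             # list of individual argo objects
time <- unlist(lapply(cycles, function(x) x[["time"]][1]))  # [1] keeps the first cycle only
time <- as.POSIXct(time, origin="1970-01-01", tz="UTC")     # unlist() drops the POSIXct class
parameter <- "temperature"                                  # the variable we will average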

In line 68, we created storage for the output of the for loop. For the first profile, if we wanted to
determine the mean temperature of the upper 200 m, we would do the following:

keep <- which(cycles[[1]][["depth"]][,1] < 200)   # depths shallower than 200 m
param <- cycles[[1]][[parameter]][,1][keep]       # temperatures at those depths
meanUpperParam <- mean(param, na.rm=TRUE)         # mean temperature, removing NA values
upper[1] <- meanUpperParam                        # add the mean temperature to the storage

The for loop saves us from repeating this operation over 5000 times, and instead loops through all of our profiles. This means that the variable upper holds the mean temperature of the upper 200 m for every Argo profile in our subset. In lines 77 and 78 we're removing any time or mean upper temperature that has NA values associated with it. In line 79 we're determining the chronological time order, and in lines 80 and 81, for good practice, we're putting our time and mean temperature in chronological order. In line 83, we're plotting the upper mean temperature as a function of time, changing our symbol type, changing our symbol size, labelling our x and y axes, and adjusting the colours of our symbols to make them transparent (a consolidated sketch of these steps follows below). Next, we determine the rolling average to remove the noise.
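
Putting lines 68 through 83 together, the loop and plotting steps might look like the following sketch; the variable names mirror the narration above, and the plotting arguments are illustrative rather than a reproduction of the demonstration code:

upper <- rep(NA, length(cycles))                  # line 68: storage for the loop output
for (i in seq_along(cycles)) {
    keep <- which(cycles[[i]][["depth"]][,1] < 200)
    param <- cycles[[i]][[parameter]][,1][keep]
    upper[i] <- mean(param, na.rm=TRUE)
}
ok <- is.finite(upper) & !is.na(time)             # lines 77-78: remove NA values
time <- time[ok]
upper <- upper[ok]
o <- order(time)                                  # lines 79-81: chronological order
time <- time[o]
upper <- upper[o]
plot(time, upper, pch=20, cex=0.5,                # line 83: temperature vs time
     xlab="Time", ylab="Upper 200 m mean temperature [degC]",
     col=rgb(0, 0, 1, alpha=0.3))                 # transparent symbols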

Slide 4: Rolling Average


As a reminder, the rolling or boxcar average removes noise from the data to display overall trends. In
other words, it smooths the data for us.

Slide 5: Demonstration
In our case, we specified n=5, which means that 5 points to the right and 5 points to the left of each point will be averaged to remove noise. We're then drawing this rolling-average line on our plot in line 86. On line 87, we're computing the linear regression, or linear model, of temperature as a function of time, followed by line 88, which draws the trend on top of our graph.

Now, if we type “trend” in our console, we see that we are given an intercept and a slope.
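
As a sketch of those steps, using base R's stats::filter() as a stand-in boxcar smoother (the demonstration's exact smoothing function is not reproduced here):

n <- 5                                            # points averaged on each side
smoothed <- stats::filter(upper, rep(1, 2*n + 1)/(2*n + 1), sides=2)
lines(time, smoothed, lwd=2)                      # line 86: draw the rolling average
trend <- lm(upper ~ time)                         # line 87: temperature as a function of time
abline(trend, col="red")                          # line 88: draw the trend line
trend                                             # prints the intercept and the slope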

Slide 6: POSIXct Time in R


And as a reminder, this is an R convention: POSIXct time is expressed in seconds since the origin, 1970-01-01 00:00:00 UTC. You can type ?POSIXct, again making use of the help page, which tells you that it is the number of seconds since the beginning of 1970.
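
A quick check in the console makes this concrete:

t0 <- as.POSIXct("1970-01-01 00:01:00", tz="UTC")
as.numeric(t0)                                    # 60: one minute = 60 seconds after the origin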

Slide 7: Demonstration
So what that tells us is that this slope is the change in temperature of the upper 200 m per second. To determine the change in temperature per decade, we have to do a small calculation. First, we extract the slope with coef(trend)[[2]]; then we multiply by 86400, because there are 86400 seconds in a day; then by 365, because there are 365 days in a year; and finally by 10, because there are 10 years in a decade. After this calculation, we determine that the change in the upper 200 m temperature is 0.54 degrees Celsius per decade.
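
The same conversion, written out (trend is the linear model from the previous step):

slopePerSecond <- coef(trend)[[2]]                # degrees Celsius per second
slopePerDecade <- slopePerSecond * 86400 * 365 * 10
slopePerDecade                                    # about 0.54 degrees Celsius per decade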

Slide 8: End of Lesson



M5 Lesson 2: Results and Summary


Slide 1: Title Slide
Slide 2: Results
In conclusion, based on our analysis, the mean temperature of the upper 200 m of the ocean near Australia is warming at a rate of about 0.54 degrees Celsius per decade.

Slide 3: Comparing Results


So now the question is: does that agree with the other study that we mentioned? You can see now why I chose the area in and around Tasmania. The other study determined that the particular area we looked at was warming at a rate of 0.2 degrees Celsius per decade. So the question is: why did we arrive at such different information, and such different conclusions?

Slide 4: Limitations
Well of course, just like when we were looking at the BATS data, there are some things to consider that would make our analysis more accurate.

The first is the temporal coverage. For example, if one year all of our Argo data came from the summer and the next year it all came from the winter, there could be some bias. Again, we should remember that we have no control over where Argo floats go, because they drift with the currents.

The second thing to consider is pressure coverage. For example, if one Argo float only went down as far as 50 m while another went down as far as 200 m, then averaging them together could bias the results.

There's also the question of taking the median versus the mean; perhaps we should have looked at the median temperature of the upper 200 m instead.

There could also be differences between Argo measurements. We could be looking at data from as far back as the year 2000, so perhaps we should be calibrating our data.

There's also quality control. Remember that we only inspected the quality of the data visually for one ID, but I mentioned we should have done that for all of the IDs, so we really should go in and do our own quality control. We could make different plots to determine whether we think something else should be flagged bad.

The other thing to consider is that the study we are comparing against uses a much longer data set, and it is also a different style of study: theirs is based on modelling, whereas we're looking at actual measurements.

And the last thing to consider is the study region. We can't say for sure that the polygon we selected matches exactly the region they were looking at.

Slide 5: Why is this Important?


These types of studies are important because ocean warming can have permanent impacts on marine ecosystem health, including:

§ Depleting kelp forests and seagrass: Temperature is one of the physical parameters, along with salinity, waves, currents, and depth, that affect the existence of seagrass (Short et al., 2001). Seagrass provides key ecological processes such as organic carbon production and export, nutrient cycling, sediment stabilisation, and enhanced biodiversity (Orth et al., 2006). An increase in temperature places additional stress on seagrass.
§ Poleward migration of marine species: It is well known that marine species thrive at specific temperatures. Rapid changes in temperature can be detrimental to marine species, which are beginning to migrate poleward to reach their optimal temperatures. This type of migration poses risks to marine organisms, including more heavily populated seas, restriction to smaller areas, extinction, and increased susceptibility to disease.
§ Increase in disease rate: An increase in temperature can raise the rate of disease spread as well as lengthen the transmission season of an infection (Karvonen et al., 2010). Along with temperature, many other factors contribute to increased disease rates, including pollution, harvesting, and introduced species, and it is therefore difficult to determine which parameter has the biggest effect (Lafferty et al., 2004).

This information can be relevant for informing decision making and policy related to:

§ Coastal community planning
§ Tourism
§ Biodiversity management
§ Aquaculture
§ Fisheries management

Slide 6: End of Lesson

M5 Lesson 3: Course Recap


Slide 1: Title Slide
Slide 2: Course Recap
In summary, during this workshop we identified that the goal was to take ocean data and turn it into ocean information. First, we used the oce package to address the first barrier, which allowed us to analyze a variety of different data types.

Slide 3: Application
We began by learning the tools of oce and then applied them to a real-world example using the Bermuda Atlantic Time-series Study (BATS) data. We determined that ocean temperature near Bermuda was increasing at a rate of about 0.4 degrees Celsius per decade.

Slide 4: Course Recap


We then addressed the next barrier using the argoFloats package, which gave us access to Argo data, providing long-term measurements with large spatial coverage.

Slide 5: Application
Then we applied our skills to a real-world example and determined that, based on our study and acknowledging its limitations, the ocean temperature near Australia was increasing at a rate of about 0.5 degrees Celsius per decade.

Slide 6: Implications
We mentioned that an ocean temperature increase could have several implications for ocean resource management, decision making, and policy, mainly for resources impacted by temperature changes, like fisheries and aquaculture, as well as for safety and security, and infrastructure.

Slide 7: End of Lesson
