Yesterday my Twitter feed was stuffed with multiple retweets of the Trans Mountain Pipeline Expansion Proposal (TMEP) Summary of Evidence (SoE) prepared for Vancouver City Council on 27 May 2015 (ref).
I was first directed to the document by Jeff Lee (@SunCivicLee on Twitter). The
following blog post represents my initial impression of the SoE with particular
emphasis on the air quality analysis component.
My initial response, upon scanning the SoE, was to challenge some of the statements at the front end of the report. Specifically, the report makes a number of what I feel are questionable, but commonly held, assumptions about the future of oil sands development/production and the use of fossil fuels in Canada. The
SoE presents an analysis that suggests that the TMEP is an unnecessary project that
will become a “stranded asset”. I
strongly disagree with that statement and said so on Twitter. In my opinion the
SoE ignores oil sands projects currently operating and/or under construction,
that on their own are sufficient to keep TMEP fully subscribed. As I described in my post On the economic and environmental folly of trying to “strangle the oil sands”, low oil prices alone will not stop the production of oil sands crude. To explain, a company that has sunk multiple billions of dollars into an oil sands project is not going to shut it down simply because it is insufficiently profitable. The income from those projects is used to pay for the sunk costs on
those projects. Profits are calculated only after the capital and debt costs
are accounted for. If a project stops producing, the oil company will still have to pay the banker for those capital and debt costs, but will have to do so without any income. Imagine that your job only pays you enough to pay the car bill, the mortgage and groceries, but doesn’t pay you enough to put money aside. Would you just up and quit your job, knowing that once you quit you would still have to pay the car bill, the mortgage and groceries, only without a salary?
Dr. Leach from the University of Alberta explains this concept very well in this
Maclean’s article. He also points out that any project with substantial
investment already in the ground will complete that investment rather than
abandon it for exactly the same reason.
In my opinion, the
SoE also makes some faulty assumptions about the future market for fossil fuels
in Canada. As I wrote in my post “Starting a Dialogue - Can we really get to a 'fossil fuel-free BC'?”, even using the sunniest forecasts, Canada is not going to be a fossil fuel-free country in the next several decades. Until that blessed day comes we will still have a need for fossil fuels. As I wrote in that post, given our dependence on fossil fuels, I would prefer they travel in pipelines and via double-hulled tankers rather than on trains, barges or tanker trucks.
What really jumped out at me, however, came later in the report: a series of slides (starting on page 35/42) presenting an air quality analysis created by “Metro Vancouver”. The slides are incredibly compelling, showing how much of Vancouver would be exposed to dangerously high benzene concentrations in the case of a spill. Based on the models, in the event of a major spill, most of the City of Vancouver would be exposed to benzene concentrations ranging from 4,000 µg/m3 to 166,019 µg/m3, while a portion of the central core would potentially be exposed to concentrations ranging from 166,019 µg/m3 to 2,554,137 µg/m3.
For the non-chemically-inclined, the US EPA provides a conversion factor for benzene from mg/m3 to ppm (ref) of 1 ppm = 3.19 mg/m3. Doing the math, the Metro Vancouver numbers translate to 1.25 ppm – 52 ppm for the City and the high values range from 52 ppm to 800 ppm. Now these numbers caused my chemist’s antennae to shoot skyward. If correct, these
numbers would represent a devastating risk to the City of Vancouver in the case
of a spill. Any regulator seeing these numbers would have to seriously
reconsider the risks of an oil spill. The problem is that these numbers are
completely out of whack with the numbers you see in the academic
literature. Consider this practical
experiment where the US Navy simulated an oil spill and then took measurements
(ref).
The Navy scientists measured benzene concentrations ranging from 7 ppm under a
simulated wharf (a location sheltered on three sides) to below the detection
limit in open areas. In open areas the maximum recorded benzene concentration was 0.4
ppm. Further testing carried out by the Navy and reported in a different paper
(ref)
using ultra-light crude (API Gravity 36.0, see below for an explanation of API Gravity)
and sampled from a mere 2.5 cm above the oil surface resulted in immediate
benzene concentrations ranging from 80.4 ppm to 3.5 ppm. Over the first hour
the benzene concentrations went down to a range from 68 ppm to 23 ppm. The
two-hour time-weighted average was 15.8 ppm at 2.5 cm above the oil slick. The
conclusion of the paper indicated that a crude oil spill with an API Gravity under 25 (dilbit has an average API Gravity around 21) would be expected to result in negligible benzene exposure except under ideal conditions, and even the worst-case scenarios came nowhere close to the numbers presented in the SoE.
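For readers who want to check the arithmetic, here is a minimal sketch of the unit conversion used above. This is my own illustration, not something taken from the SoE or the Levelton report; the 3.19 factor is simply benzene's molar mass divided by the molar volume of a gas at 25°C.

```python
# A minimal sketch (my own illustration) of the ug/m3 -> ppm conversion used above.
# The EPA factor of 3.19 mg/m3 per ppm follows from benzene's molar mass
# (78.11 g/mol) divided by the molar volume of an ideal gas at 25 C (24.45 L/mol).

BENZENE_MG_PER_M3_PER_PPM = 78.11 / 24.45  # ~3.19 mg/m3 per ppm

def ug_per_m3_to_ppm(ug_per_m3: float) -> float:
    """Convert a benzene concentration in ug/m3 to ppm by volume."""
    mg_per_m3 = ug_per_m3 / 1000.0
    return mg_per_m3 / BENZENE_MG_PER_M3_PER_PPM

# The Metro Vancouver figures quoted in the SoE:
for value in (4_000, 166_019, 2_554_137):
    print(f"{value:>9,} ug/m3  ->  {ug_per_m3_to_ppm(value):8.2f} ppm")
# prints roughly 1.25, 52 and 800 ppm respectively
```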
To be fair, the SoE presents the outputs of a modelling exercise, and everyone knows that while modelling is a useful way to get data, models are often overly conservative and do not always accurately reflect real-world scenarios. That being said, these numbers jump off the page even for a modelling exercise. Consider one of the more famous modelling exercises of this type, one where the scientists conducted a detailed modelling exercise based on the conditions during the Exxon Valdez spill (ref). In that report the maximum calculated hourly-average concentration of benzene was 4.86 ppmv, less than 1% of the maximum benzene concentration reported in the SoE.
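A quick check of that comparison (my own arithmetic, not taken from either report):

```python
# Quick check (my own arithmetic) of the Exxon Valdez comparison above.
exxon_valdez_max_ppm = 4.86              # maximum hourly-average benzene from the modelling paper
soe_max_ppm = 2_554_137 / 1000 / 3.19    # the SoE maximum of 2,554,137 ug/m3, roughly 800 ppm

ratio = exxon_valdez_max_ppm / soe_max_ppm
print(f"Exxon Valdez model maximum is {ratio:.1%} of the SoE maximum")  # ~0.6%
```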
So you can understand
my confusion. The numbers presented in the SoE are not even in the same ballpark as those reported in the literature. Because the SoE was only a summary
document, I sought the underlying data and was rewarded when the City of
Vancouver generously provided me with a copy of a technical report prepared by Levelton
Consultants Ltd (the Levelton report).
Time for some
conflict of interest info: in my dealings with Levelton in my professional life
(outside this blog) I have found them to be a very competent consulting
company. My company does not currently do modelling of this sort in Canada and
to the best of my knowledge nothing I write hereafter is in conflict of
interest or will help me in my professional or private life. Okay, back to our regularly scheduled blog posting.
As I said, Levelton
is a very reputable company and my examination of their report indicates that
they are using state-of-the-art modelling programs. So how did they end up with numbers so completely out of the mainstream with respect to oil spill benzene concentrations? Well, as a chemist, I know that modelled benzene vapour concentrations are very sensitive to the inputs into the model, in particular the initial concentration of benzene in the originating crude, the time from spill initiation, the outside air temperature, the spill thickness and the wind speed (ref). Being an inquisitive sort, I went looking for the numbers the modellers used to derive their assumptions and was completely befuddled. The authors reported that they used data from crudemonitor.ca, which I understand to be an industry-supported web site; however, when I looked at the data they used, the numbers did not jibe. The
report indicates that the data for the modelled crude was for Cold Lake Blend Crude (a dilbit blend) and, looking at the summary data, that seems about right. The density, specific gravity and viscosity presented in Table 2.1 looked pretty standard. But when I looked at Table 2.2 all I saw was a mess. Rather than using the accepted concentrations for the various components of the Cold Lake Blend, they used a feature called “pseudo-components/surrogate chemicals”, which broke the dilbit into “15 Pseudo-Components”. As they put it:
Each of the
pseudo-components was represented by a single surrogate chemical, which was
modelled in CALPUFF and compared directly to corresponding ambient air quality
objectives and/or human health exposure thresholds. The use of surrogate
chemicals is consistent with the approach taken with the Human Health Risk
Assessment (HHRA) conducted by Intrinsik as additional supplemental information
for the Project application, where Intrinsik associated surrogate chemicals
with the pseudo-components modelled by Tetra Tech EBA. A listing of the
speciated components for Cold Lake Blend crude oil has been obtained from the
available crude oil speciation data on www.crudemonitor.ca, and each of these
components was assigned a surrogate chemical and corresponding chemical
properties. In order to prepare a distillation curve for OilWx, the boiling
points of the surrogate chemicals were sorted in ascending order and the cut
percentage of these surrogate chemicals.
To explain, they took the dilbit mixture and pretended that, instead of being made up of thousands of components, it was made up of only 15. Now I am not entirely sure how they established how much of each of the pseudo-components to use in the model (the helpful City of Vancouver representative did not have this information), but the simplest approach (which I will use here) would be to simply split the pseudo-components down the middle. If I read Table 2.2 correctly, the middle of the “hexane” cut would appear to be at 6.37% and the middle of the “benzene” cut is at 7.45%. The mean of these two would be 6.91%, so presumably everything from 6.91% to 7.45% was treated as “benzene”. On the other side, the mean of the “heptanes” (9.24%) and “benzene” (7.45%) would be 8.34%. By my back-of-the-envelope calculation using this method, their cut of “benzene” would therefore represent as much as 1.4% of the total volume of the spill. Now I know from their definition that my approach is not exactly what they did, but the point is the
same. The “benzene” reported in the technical report (and thus the SoE) is not
really the chemical benzene used in all the toxicity testing; it is “pseudo-benzene”.
The funny thing is that if we go back to crudemonitor.ca we find that the five-year average benzene concentration in Cold Lake Blend is 0.23% +/- 0.03%, so every value should be off by roughly a factor of six. The bigger problem with using this “pseudo-benzene” in the subsequent calculations is
that benzene is very much more toxic than the other components in the dilbit “cut”.
The vast majority of the materials in a dilbit blend are substantially less toxic than benzene and are less soluble. So in this report they use the most rotten apple in the barrel as a representative of the entire barrel. Moreover, throughout the remainder of the report (and in the SoE) they then continue to refer to this “pseudo-benzene” as “benzene” and use that surrogate value in all the calculations for toxicity. But as I have described, it is not “benzene”; it is several hundred hydrocarbon compounds, 99.99% of which have much lower toxicity and are less volatile than benzene. The entire page 35/42 of the SoE talks about “benzene”, but that is not what the model is talking about. I don’t think I can repeat this enough: the benzene in this report is not the benzene known to chemists and toxicologists around the world, and the toxicological calculations and plumes are not those for benzene. How can anyone be expected to form an informed opinion when the data they are presented has no relationship with reality? At least it does explain why the numbers presented
differ so much from every other literature value I could uncover in my
research.
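For what it is worth, here is a minimal sketch of that back-of-the-envelope calculation: my own reconstruction of the cut boundaries from Table 2.2, not the method Levelton actually used, compared against the crudemonitor.ca benzene average.

```python
# A rough reconstruction (my own, not Levelton's method) of the back-of-the-
# envelope estimate above: split the cuts at the midpoints between adjacent
# pseudo-components and compare the resulting "benzene" cut with the actual
# benzene content of Cold Lake Blend.

# Cumulative cut positions read from Table 2.2 of the Levelton report (percent).
hexane_cut = 6.37    # middle of the "hexane" cut
benzene_cut = 7.45   # middle of the "benzene" cut
heptanes_cut = 9.24  # middle of the "heptanes" cut

lower_bound = (hexane_cut + benzene_cut) / 2     # ~6.91 %
upper_bound = (benzene_cut + heptanes_cut) / 2   # ~8.34 %
pseudo_benzene_pct = upper_bound - lower_bound   # ~1.4 % of the spill volume

actual_benzene_pct = 0.23  # crudemonitor.ca five-year average for Cold Lake Blend (+/- 0.03 %)

print(f'"Pseudo-benzene" cut: {pseudo_benzene_pct:.1f}% of the blend')
print(f"Actual benzene:       {actual_benzene_pct:.2f}% of the blend")
print(f"Overstatement factor: ~{pseudo_benzene_pct / actual_benzene_pct:.0f}x")  # roughly 6x
```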
Well, this blog post has gone overlong, so I don’t have enough time to continue to critique the model. I won’t go into how they appear to fail to incorporate the solubility of benzene in seawater (I think they might ignore it even though some benzene will dissolve in the sea water) or any of the other areas where I would differ with the authors’ choice of assumptions. To be clear here, any major oil spill (in this case a spill that would be unprecedented in any harbour in the world since double-hulled tankers were made mandatory, and one that is incredibly unlikely) will cause negative air quality effects. If such a huge spill were to occur directly in First or Second Narrows there would undoubtedly be some risk to the public, but regulators and decision-makers would be better served by looking at papers that actually model benzene based on its reported concentrations in dilbit (or a comparable API crude, ref or ref) rather than making the untenable assumption that dilbit is made up of only 15 compounds and that the toxicology should be calculated using only those compounds’ toxicological characteristics as surrogates.
Author's note: To be
completely clear here, I hold the modellers from Levelton in the highest regard and
do not in any way suggest that their work is underhanded. They were
commissioned to do a challenging modelling exercise and the model they use
appears to be a standard one. Moreover, they fully document all their choices
and decisions in their report and include provisos and limitations on the
interpretation of their work. Unfortunately these provisos failed to make it to
the SoE, as often happens when reports are summarized by people who did not write the report themselves. Absent those provisos,
the results reported to council (and subsequently in the press) are completely
misleading.
Crude oils are described based on their API gravity. API gravity is the standard density measure used by the oil industry. As described in an earlier post (More on Oil Spills: Some Toxicological Calculations and What if it were Dilbit?), specific gravity simply refers to the relative density of a liquid versus water. API gravity, however, is calculated from an oil's specific gravity. To borrow from a useful web site (ref):
Specific gravity for API calculations is
always determined at 60 degrees Fahrenheit.
API gravity is found as follows:
API gravity = (141.5/Specific Gravity) – 131.5
Though API
values do not have units, they are often referred to as degrees. So the API
gravity of West Texas Intermediate is said to be 39.6 degrees. API gravity
moves inversely to density, which means the denser an oil is, the lower its API
gravity will be. An API of 10 is equivalent to water, which means any oil with
an API above 10 will float on water while any with an API below 10 will sink.
The API
gravity is used to classify oils as light, medium, heavy, or extra heavy. As
the “weight” of an oil is the largest determinant of its market value, API
gravity is exceptionally important. The API values for each “weight” are as
follows:
• Light – API > 31.1
• Medium – API between 22.3 and 31.1
• Heavy – API < 22.3
• Extra Heavy – API < 10.0
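As a worked example of the formula quoted above (my own illustration, not part of the quoted source):

```python
# A small sketch (my own illustration) of the API gravity formula and the
# light/medium/heavy/extra heavy classification quoted above.

def api_gravity(specific_gravity: float) -> float:
    """API gravity from specific gravity (determined at 60 degrees Fahrenheit)."""
    return 141.5 / specific_gravity - 131.5

def classify(api: float) -> str:
    """Classify a crude oil by 'weight' using the API cut-offs listed above."""
    if api > 31.1:
        return "light"
    if api >= 22.3:
        return "medium"
    if api >= 10.0:
        return "heavy"
    return "extra heavy"

# West Texas Intermediate, specific gravity ~0.827, gives the ~39.6 degrees API cited above.
wti_api = api_gravity(0.827)
print(f"WTI: {wti_api:.1f} degrees API ({classify(wti_api)})")

# Dilbit, with an average API gravity around 21, falls into the heavy category.
print(f"Dilbit at 21 degrees API: {classify(21)}")
```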
Comment:
I think you have shown that something is seriously wrong, but let me add some confirmation.
“Doing the math, the Metro Vancouver numbers translate to 1.25 ppm – 52 ppm for the City and the high values range from 52 ppm to 800 ppm. Now these numbers caused my chemist’s antennae to shoot skyward.”
They also raised a flag for my air quality modeling background. I never recall coming up with concentrations that high for any power plant stack I modeled over the years. But wait, there’s more. I checked the concentrations inside a coal-fired power plant stack, and the biggest number in the hourly sulfur concentrations measured over an entire quarter was 418 ppm.
Seriously, there is no way contaminant concentrations downwind of an oil slick should approach, let alone exceed, contaminant concentrations inside the stack of a coal-fired power plant (albeit one burning PRB coal, which is relatively low in sulfur, but still).