sensors

Article

Observing Spring and Fall Phenology in a Deciduous Forest with Aerial Drone Imagery

Stephen Klosterman 1,* and Andrew D. Richardson 1,2,3

1 Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA; Andrew.Richardson@nau.edu
2 School of Informatics, Computing and Cyber Systems, Northern Arizona University, Flagstaff, AZ 86011, USA
3 Center for Ecosystem Science and Society, Northern Arizona University, Flagstaff, AZ 86011, USA
* Correspondence: steve.klosterman@gmail.com

Received: 29 September 2017; Accepted: 5 December 2017; Published: 8 December 2017

Abstract: Plant phenology is a sensitive indicator of the effects of global change on terrestrial ecosystems and controls the timing of key ecosystem functions including photosynthesis and transpiration. Aerial drone imagery and photogrammetric techniques promise to advance the study of phenology by enabling the creation of distortion-free orthomosaics of plant canopies at the landscape scale, but with branch-level image resolution. The main goal of this study is to determine the leaf life cycle events corresponding to phenological metrics derived from automated analyses based on color indices calculated from drone imagery. For an oak-dominated, temperate deciduous forest in the northeastern USA, we find that plant area index (PAI) correlates with a canopy greenness index during spring green-up, and a canopy redness index during autumn senescence. Additionally, greenness and redness metrics are significantly correlated with the timing of budburst and leaf expansion on individual trees in spring. However, we note that the specific color index for individual trees must be carefully chosen if new foliage in spring appears red, rather than green, which we observed for some oak trees. In autumn, both decreasing greenness and increasing redness correlate with leaf senescence.
Maximum redness indicates the beginning of leaf fall, and the progression of leaf fall correlates with decreasing redness. We also find that cooler air temperature microclimates near a forest edge bordering a wetland advance the onset of senescence. These results demonstrate the use of drones for characterizing the organismic-level variability of phenology in a forested landscape and advance our understanding of which phenophase transitions correspond to color-based metrics derived from digital image analysis.

Keywords: phenology; Harvard Forest; leaf color; plant area index; drone; UAV

Sensors 2017, 17, 2852; doi:10.3390/s17122852 www.mdpi.com/journal/sensors

1. Introduction

Phenology, the study of recurrent biological events, has been a focus of plant science for centuries [1]. Phenology responds to interannual and spatial variability in environmental conditions, particularly temperature [2,3], and also mediates key ecosystem functions, including carbon assimilation and evapotranspiration [4–8]. As global environmental change becomes an increasingly public issue, the value of plant phenology as a sensitive indicator of the effects of change has kindled interest in creating and interpreting phenology records, such as those from digital repeat photography. Tower- or building-mounted phenocams, typically located at positions just above the canopy [9–11], preserve a valuable visual record of vegetation phenology in forests and other ecosystems. “Near-surface” methods such as phenocams complement the phenology records of satellite remote sensing, which extensively observe entire landscapes, but at a spatial resolution that typically makes it impossible to discern individual plants [12]. Unlike other near-surface methods based on radiometric sensors, e.g., [13,14], images from phenocams have the advantage that they can be visually interpreted by direct examination.
With phenocam imagery, it is also possible to objectively quantify phenology from time series of vegetation “greenness” or “redness”. Discrete phenophase transition dates can be derived from those time series, using techniques developed for analyzing satellite remote sensing data [15,16]. These methods have been successfully applied to the evaluation of satellite remote sensing phenology products [17,18] and exploration of the connections between ecosystem function, e.g., carbon assimilation, and the greenness metrics of phenology [5,6]. However, the oblique angle of tower-mounted cameras has been suggested to result in biased estimations of the timing of canopy maturity, compared to the vertically integrated plant area index (PAI), as well as in situ observations of leaf expansion [19]. Phenocam imagery does make it possible to easily integrate across multiple organisms in the camera field of view and thus characterize landscape-level phenology. But, in mixed-species stands, these estimates may be biased because the organisms located closest to the camera dominate the field of view and hence are over-represented, while more distant organisms are under-represented.

Drones, also called UAVs, open up an exciting new possibility for the near-surface remote sensing of plant phenology. With drones, researchers can obtain aerial photography with a similar spatial resolution to tower-mounted phenocams, but at the landscape scale, similar to satellite remote sensing. Compared to traditional aircraft, drones can be operated at a fraction of the cost, making more frequent observations feasible. Additionally, using photogrammetry techniques with drone images facilitates significant advances over tower-mounted cameras. Orthomosaics simulate an undistorted nadir perspective of the canopy, with a consistent spatial resolution over landscapes [20].
Because of this feature, orthomosaics enable the identification and analysis of larger numbers of individual organisms than is typically possible using tower-mounted camera imagery. Previous research has begun to explore the potential of drones for the study of forest phenology. The first study to use aerial drone imagery to explore forest structural and color properties demonstrated that the changing color of leaves at different times of the growing season can be quantified by applying digital image processing techniques to georeferenced orthomosaics [20]. Subsequently, researchers leveraged the high spatial resolution of near-surface aerial drone photography to delineate crowns and study the phenology of individual trees. It was found that the color of individual tree crowns could be used to identify their species, based on knowledge of their expected phenological status during different times of the season [21]. In another study, high temporal frequency aerial images taken during springtime were used to show that the timing of budburst of individual trees, observed in situ, appears to coincide with the beginning of an increase in a canopy greenness metric calculated from the images of these trees [22]. In our previous work, which used an earlier year of imagery from the same study site reported here, we used drone imagery to show how landscape scale variance in phenology could be attributed to plant species, and explored the nature of the spatial variability of phenology at different spatial resolutions [23]. Our goal in this study is to extend previous efforts to determine the leaf life cycle events of trees that correspond to digital image analysis metrics of those same trees in aerial drone imagery. We use imagery of the complete growing season to go beyond budburst and consider leaf expansion in spring, as well as color change and abscission in autumn. 
We also compare metrics derived from aerial images with the progression of PAI, to interpret color indices with reference to canopy structural development. Finally, we examine how contrasting air temperature regimes at microsites within the study area correlate with spatial variation in landscape phenology, after accounting for spatial variance induced by differences in tree species. These results demonstrate the use of drones for observing the complete seasonal cycle of deciduous canopy development at the landscape scale, with a high enough spatial resolution to discern organism-level phenomena.

2. Materials and Methods

2.1. Study Site

We conducted our study at Harvard Forest in Petersham, MA, USA (42.5377° N, 72.1715° W). The climate is temperate, with a mean annual precipitation of 110 cm and mean annual temperature of 7.1 °C. The study area is a mixed deciduous-evergreen forest, with some woody wetlands. Deciduous trees in the study area include predominantly red oak (Quercus rubra) and red maple (Acer rubrum), as well as yellow birch (Betula alleghaniensis), American beech (Fagus grandifolia), and black oak (Quercus velutina). The study area is 2.8 ha in size and contains approximately 1900 trees with a diameter at breast height ≥10 cm (Figure 1).

Figure 1. Study area at Harvard Forest on 5/21/17 (DOY 141). Locations of microsite temperature loggers indicated as blue “×” symbols.

2.2. Drone Image Acquisition and Processing

We used methods described in an earlier study [23] to obtain and process aerial photography. Briefly, we used a drone (3DR ArduCopter Quad-C Frame, 3D Robotics, Berkeley, CA, USA) equipped with a Canon Powershot A3300 camera (35 mm film equivalent focal length 28 mm, approx. 16 million pixels). We used the same camera and image settings for all flights, and took pictures of a gray reference square (ColorChecker classic, X-rite, Grand Rapids, MI, USA) before each flight. Flight frequency was roughly every five days during spring leaf out and every week during autumn leaf color change, depending on the weather conditions (dates shown in Figure 2). Images were taken at a minimum shutter speed of 1/1000 s, with constant exposure during each flight. The same color balance was used for all acquisition dates, because consistent color balance is necessary for reliable digital camera observations of phenology [11]. We conducted flights at mid-day (between 10 a.m. and 3 p.m.) on either clear or evenly overcast days, and never during periods of variable cloud cover, as exposure was constant during flights. For each image acquisition date, we created orthophotos of the study area using about 100 JPEG photos taken with an intervalometer script (Canon Hack Development Kit, http://chdk.wikia.com/wiki/CHDK), with the PhotoScan photogrammetry software (Agisoft, St. Petersburg, Russia), and performed final georeferencing in ERDAS IMAGINE AutoSync (Intergraph, Huntsville, AL, USA). The orthophotos used in this study are available in the Harvard Forest Data Archive [24].

Figure 2. (A–L): Close-up (30 m by 30 m) of aerial images for a selection of dates from 2015. Each close-up shows the ROIs used to analyze two red oak trees. Solid magenta ROIs were used to calculate the solid magenta GCC and RCC values (circle symbols) and curves; similarly with the dashed cyan ROIs (triangle symbols). Images have been equalized to have the same brightness (sum of R, G, and B digital numbers) for visualization purposes in this figure; (M,N): the resulting GCC and RCC time series (symbols); curve fits or interpolations; and estimated phenophase transition dates (vertical lines). The vertical lines show six annual dates for both GCC time series, while spring dates calculated from RCC are only shown for the tree with red spring leaves. RCC-EOF occurred on the same date for both trees.

We calculated green chromatic coordinate (GCC) [25] and red chromatic coordinate (RCC) [11] time series from orthophotos using Matlab (R2017a), to account for changes in scene illumination due to differences in atmospheric conditions or times of day between flights:

GCC = G/(R + G + B), (1)

RCC = R/(R + G + B), (2)

where R, G, and B are the mean red, green, and blue digital numbers, respectively, in regions of interest from image files. Example GCC and RCC time series and additional processing details are available in Figure S1.
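The two chromatic coordinates in Equations (1) and (2) can be computed directly from ROI pixel values. A minimal Python/NumPy sketch (our own illustration, not the authors' Matlab code; the array contents are hypothetical):

```python
import numpy as np

def chromatic_coordinates(rgb_pixels):
    """Compute GCC and RCC for a region of interest (ROI).

    rgb_pixels: (N, 3) array of red, green, blue digital numbers for
    the N pixels inside the ROI. Following Eqs. (1) and (2), each
    index is a channel mean divided by the sum of the channel means.
    """
    r, g, b = np.asarray(rgb_pixels, float).mean(axis=0)
    total = r + g + b
    return g / total, r / total  # (GCC, RCC)

# Example: a small, slightly green-dominated ROI (hypothetical values)
roi = np.array([[80, 120, 60],
                [90, 130, 70]], dtype=float)
gcc, rcc = chromatic_coordinates(roi)
```

Because both indices are ratios of channel means, they are less sensitive than raw digital numbers to overall brightness differences between flights.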
We note that the phenology signal observed in the GCC and RCC time series of vegetation was approximately 10 times greater in amplitude than the noise observed in the analysis of the gray reference square, which arose due to factors such as varying illumination conditions and the different times of day of flights. We also explored other indices, including GCC + RCC, GRVI, ExG, and Hue, as described in the caption of Figure S2, but found the most reliable results from GCC and RCC, and used them in our analysis.

Color indices were calculated using different regions of interest (ROIs) depending on the goal of the analysis. For comparison between image-based metrics and in situ observations of individual tree phenology, we drew ROIs around the crowns of the trees that we observed, as shown in Figure 2. For comparison to upward photo-based estimates of PAI, we created ROIs representing the canopy projection of the image field of view from the ground perspective. To examine phenology across the landscape, we used a square grid of 10 m ROIs (grid cells).

2.3. Estimating Phenology Dates from Time Series Data

We used curve fitting methods detailed in Klosterman et al. [18] to estimate phenology dates from time series data, including GCC and additional data described below.
Sigmoid models are commonly used to approximate the seasonal trajectory of vegetation indices and to facilitate the determination of phenological transition dates. The greendown sigmoid is a variant of the standard dual sigmoid model [16]; a key difference is that the greendown sigmoid allows for a linear decrease in greenness over the course of the growing season. We used the greendown sigmoid model to approximate the seasonal trajectory of GCC for each ROI. Phenology dates were calculated from curve fit parameters by finding the dates of extrema in the curvature change rate (CCR) [15]. We used these methods to calculate dates for the start, middle, and end of canopy development in spring (GCC-SOS, GCC-MOS, GCC-EOS), and similar dates for canopy senescence in fall (GCC-SOF, GCC-MOF, GCC-EOF). We estimated uncertainties using the Jacobian matrix of curve fit parameters to generate Monte Carlo ensembles of phenology dates, and calculated the inner 95% confidence interval of these ensembles. To estimate phenology dates from RCC time series, we calculated the dates of crossing the 10th, 50th, and 90th percentiles of a linear interpolation of RCC values, while RCC increased to its spring maximum, to determine RCC-SOS, RCC-MOS, and RCC-EOS. We used the date of the maximum redness value in fall to represent RCC-EOF. We determined uncertainties as the interval between the nearest two observation dates, or for transitions that occurred on an observation date, as the interval between the midpoints of that date and the nearest two dates. In addition to these automated curve fitting and interpolation approaches, we also visually identified the day of the first observable leaves in aerial images of the individual trees discussed below in Section 2.4.1.

2.4. In Situ Measurements

We identified eight microsites within the study area based on earlier results, using imagery from a previous year at this site.
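The percentile-crossing approach for RCC spring dates described in Section 2.3 can be sketched as follows. This is a simplified illustration under our own assumptions, in particular that RCC rises monotonically to its spring maximum, so the rising limb can be inverted by linear interpolation:

```python
import numpy as np

def rcc_crossing_dates(doy, rcc, percentiles=(10, 50, 90)):
    """Estimate RCC-SOS/MOS/EOS as the days of year (DOY) when a
    linearly interpolated RCC series crosses the given percentiles of
    its rise from the dormant value to the spring maximum."""
    doy = np.asarray(doy, float)
    rcc = np.asarray(rcc, float)
    i_max = int(np.argmax(rcc))       # date of the spring maximum
    rise = rcc[: i_max + 1]           # rising limb only
    lo, hi = rise.min(), rise.max()
    targets = [lo + p / 100.0 * (hi - lo) for p in percentiles]
    # invert the (assumed monotonic) rising limb: RCC value -> DOY
    return [float(np.interp(t, rise, doy[: i_max + 1])) for t in targets]

# Synthetic observation dates (DOY) and RCC values (hypothetical)
dates = [100, 110, 120, 130, 140]
rcc_series = [0.30, 0.32, 0.36, 0.40, 0.42]
sos, mos, eos = rcc_crossing_dates(dates, rcc_series)
```

Note that `np.interp` requires its second argument to be increasing, which the monotonic-rise assumption guarantees here.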
We had previously found that species composition explained most of the spatial variance in phenology across the Harvard Forest landscape, using a multiple linear regression [23]. However, we identified locations within the study area where regression residuals were relatively large; these were the same areas in regressions using both the 2013 data from the previous study and the 2015 data from the present study. We located microsites in these places, as well as locations where residuals were relatively small, to better understand the drivers of phenology throughout the study area. To provide context for the image-derived phenological transition dates, we made direct observations of phenology on individual trees, and measured canopy-level PAI using digital cover photography. To characterize the microsite environment, we measured air temperature.

2.4.1. Direct Observation of Trees

We observed the phenological status of 30 trees spread across the eight microsites within the study area during each drone photography acquisition date (Figure 1, three to five trees per microsite), and one additional date when the drone was not flown due to weather constraints (5/17/15, DOY 137). Similar to a protocol of established observations of tree phenology at Harvard Forest [26], we estimated the percentage of buds burst on trees, and average leaf size (length) as a percentage of mature leaf size. In autumn, we estimated the percentage of leaves that had changed color due to senescence, and that had fallen. From these observations, we created time series using linear interpolation, and found the point in time corresponding to the deciles of phenological status for each phenology event: day of year for 10%, 20%, . . . 90% of budburst, leaf size, leaf color, and leaf fall. We calculated uncertainties in the same way used for RCC time series.
We determined the leaf life cycle events corresponding to image-derived metrics of individual tree ROIs by finding the decile of progress in a specific event that had the lowest RMSD with respect to a given metric across the 30 trees under observation. In other words, we compared all deciles (10%, 20%, . . . ) of all observed events (e.g., budburst, leaf size) to each of the image-derived transitions (SOS, MOS, . . . ), and identified the decile that had the minimum RMSD across all trees (e.g., 10% budburst was closest to SOS). We also examined correlations of GCC and RCC values on each date with percentages of progress in each life cycle event on the same date, within and across trees. Since color indices and leaf transitions typically trend in the same direction, for example, increasing GCC and leaf size in spring, an ordinary least squares regression may yield spuriously low coefficient standard errors. Therefore, we used an econometric approach to time series regression: heteroscedasticity and autocorrelation consistent (HAC) regression. HAC regression calculates the variances of regression coefficients (i.e., square of standard error, used to calculate regression p-value) based on the inferred autocorrelation structure of regression residuals; the coefficient estimates themselves are the same as those of ordinary least squares. We used the ‘hac’ function in Matlab, with pre-whitening at a lag of one time step and the quadratic spectral weighting scheme, as we found that these options led to the most conservative estimates of coefficient variances in a sensitivity analysis [27,28]. We used t-tests to see if there were significant differences in regression parameters between trees. We also calculated ordinary least squares regressions of pooled values across all trees.

2.4.2. Digital Cover Photography (DCP)

Upward photos were taken with the same model of camera used for drone photography (non-fisheye lens).
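The idea behind the HAC regression described in Section 2.4.1 can be illustrated with a minimal NumPy sketch. Note this uses the simpler Newey-West/Bartlett kernel rather than the pre-whitened quadratic spectral scheme of Matlab's 'hac', so the standard errors differ in detail; the coefficients are identical to ordinary least squares either way:

```python
import numpy as np

def ols_hac(X, y, maxlags=1):
    """OLS coefficients with HAC (Newey-West, Bartlett-kernel)
    standard errors. Coefficients match ordinary least squares; only
    the standard errors change, to account for autocorrelated
    residuals (illustration only, not the study's exact estimator)."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    u = y - X @ beta                      # residuals
    score = X * u[:, None]                # x_t * u_t
    S = score.T @ score                   # lag-0 term
    for lag in range(1, maxlags + 1):
        w = 1.0 - lag / (maxlags + 1.0)   # Bartlett weight
        G = score[lag:].T @ score[:-lag]  # lag-l autocovariance
        S += w * (G + G.T)
    cov = XtX_inv @ S @ XtX_inv           # sandwich covariance
    return beta, np.sqrt(np.diag(cov))

# Hypothetical example: a linear trend with a smoothly varying
# (autocorrelated-looking) error pattern
t = np.arange(10.0)
X = np.column_stack([np.ones_like(t), t])
y = 2.0 + 3.0 * t + np.array([0.5, 0.4, 0.3, 0.1, -0.1,
                              -0.3, -0.4, -0.5, -0.3, -0.1])
beta, se = ols_hac(X, y, maxlags=2)
```

The Bartlett weights keep the long-run covariance estimate positive semidefinite, so the standard errors are always real and non-negative.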
We took images manually for upward photos, on the same dates we acquired drone photography, using the same image settings (i.e., automatic exposure), except for white balance, which was set to “auto” for DCP [29]. We used a level to aim the camera directly upward and positioned the camera on the posts holding the temperature data loggers (1 m height, below most mid-canopy trees), in the same orientation on each date. The projected camera field of view had an area of 533 m2 in the canopy (at the average Harvard Forest canopy height of 22 m). We estimated PAI from upward photos at each microsite using a freely available DCP software tool obtained from Craig MacFarlane (contact information in references [29,30]). We calculated sigmoid curve fits of PAI values to estimate transition dates as with GCC time series. However, we observed that PAI values were generally stable from June to September, so we used the traditional sigmoid model as opposed to the greendown sigmoid, effectively setting the slope of the summertime greendown to zero. We also directly compared GCC and RCC values with PAI values, by performing HAC regressions at the microsite level, as both PAI and color indices typically trend in the same direction at each microsite. We also performed ordinary least squares regressions of data pooled across microsites for spring and fall, similar to the analysis described in Section 2.4.1.

2.4.3. Air Temperature Measurements and Effects of Microclimate on Phenology

Because the microsite locations included places with the largest residuals in a regression of species composition and phenology dates, we suspected that additional factors may contribute to the timing of microsite phenology. Temperature effects on phenology have been widely studied [18,31–33]. Therefore, we recorded air temperatures half hourly from 11 April 2015, approximately one month before the beginning of leaf out, through December 2015, after completion of leaf fall.
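The sigmoid fits to PAI described in Section 2.4.2 can be sketched as follows. Rather than the study's actual fitting routine, this illustration fits a traditional (non-greendown) sigmoid by a coarse grid search over the nonlinear parameters, with the baseline and amplitude solved by linear least squares at each grid point:

```python
import numpy as np

def fit_sigmoid(t, y):
    """Fit y ~ a + c / (1 + exp(-b * (t - m))) to a time series.

    Grid search over rate b and midpoint m; baseline a and amplitude c
    are solved by linear least squares at each grid point. The fitted
    midpoint m serves as a mid-transition date (illustration only)."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    best = None
    for b in np.linspace(0.05, 1.0, 40):
        for m in np.linspace(t.min(), t.max(), 200):
            s = 1.0 / (1.0 + np.exp(-b * (t - m)))
            A = np.column_stack([np.ones_like(t), s])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ coef
            sse = resid @ resid
            if best is None or sse < best[0]:
                best = (sse, coef[0], coef[1], b, m)
    _, a, c, b, m = best
    return a, c, b, m  # baseline, amplitude, rate, midpoint

# Synthetic PAI series with leaf out centered near DOY 130 (hypothetical)
doy = np.arange(100, 181, 5)
pai = 0.5 + 4.0 / (1.0 + np.exp(-0.2 * (doy - 130)))
a, c, b, m = fit_sigmoid(doy, pai)
```

A production fit would use a nonlinear least squares solver and, for spring GCC, the greendown variant with a linear summer term; the grid search keeps the sketch dependency-free.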
To do this, we installed a HOBO U23-004 temperature data logger at a 1 m height with an RS-1 radiation shield (Onset Computer Corp, Bourne, MA, USA) at each microsite. To determine the effect of microclimate on phenology, we correlated spring and fall temperatures with residuals from a statistical model that accounted for the effects of species variation and land cover type, but not microclimate, on the aggregated canopy phenology of 10 m grid cells. A 10 m grid was chosen based on previous analysis, taking into account spatial inaccuracies in the orthophotos and the nature of the species data [23]. More specifically, the model used tree species from a map of all woody stems ≥1 cm diameter at breast height [34]. This map covered 89% of the land area monitored in drone imagery. Woody species composition was determined using all species that appeared in at least 10 of 245 grid cells, and had at least a 20% basal area fraction in any one grid cell. The remaining species were lumped into one predictor to ensure that the fractional species composition of each grid cell summed to 1. Any grid cell that had <50% coverage in the aerial images was eliminated from this analysis, removing partially covered grid cells on the edge of the imaged area and ensuring all data points had adequate spatial representation. We used a no-intercept regression since fractional species compositions are mixture components that sum to 1 [35]. We then calculated the average residual from these spring and fall regressions for the three by three windows of grid cells centered on the microsite locations (30 m by 30 m areas), and regressed these microsite-average residuals on the monthly means of daily minimum, mean, and maximum temperatures for April (MOS) and September (MOF), as temperatures preceding phenology events are commonly used to predict the timing of those events [18,31–33].
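The no-intercept regression of phenology dates on fractional species composition can be sketched as follows (hypothetical composition fractions and dates; because each row sums to 1, an intercept would be collinear with the predictors):

```python
import numpy as np

# Rows: grid cells; columns: fractional basal area per species group
# (hypothetical values; each row sums to 1, so no intercept is used)
composition = np.array([
    [0.7, 0.3, 0.0],
    [0.2, 0.5, 0.3],
    [0.0, 0.1, 0.9],
    [0.5, 0.25, 0.25],
])
mos_doy = np.array([128.0, 131.0, 136.0, 130.0])  # transition date per cell

# No-intercept least squares: each coefficient can be read as the
# expected transition date of a hypothetical pure stand of that species
coef, *_ = np.linalg.lstsq(composition, mos_doy, rcond=None)
residuals = mos_doy - composition @ coef  # unexplained by composition
```

In the study's workflow, residuals like these (averaged over 3-by-3 windows around each microsite) are what get regressed on microsite temperatures.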
We note that we used RCC-MOS for grid cells with red spring leaves, following a criterion discussed in the Results Section 3.1, as long as the amplitude in RCC was greater than the range of noise observed in the gray reference square (Figure S1). For all other grid cells, we used GCC-MOS. We used Bonferroni correction to account for making three temperature comparisons with both phenology transition dates.

3. Results

3.1. Choice of Color Index in Spring Time

While changes in canopy greenness have commonly been used to track spring budburst and leaf expansion in deciduous forests, we found that in some instances, these processes appeared to be more associated with changes in redness (Figures S2 and S3). For two of the 30 trees we observed in situ, both red oaks, we saw that leaves were various shades of red (pink, orange) in color during leaf expansion (Figure 1, close up in Figure 2D,E, and Figure S4). Additional oak trees, which were not under in situ observation at the microsites, can also be seen to display red spring leaves in Figure 1. We note that leaves higher in the canopy, i.e., those on the top of the crown and most visible from an aerial view, often appeared to be redder than leaves closer to the ground (Figure S4). For trees with red spring leaves, the springtime GCC profiles showed a delayed increase compared to trees with green spring leaves, including nearby conspecifics that we observed to have nearly the same leaf expansion phenology (Figure 2M,N). However, the springtime RCC time series exhibited a marked peak for trees with red spring leaves. Then, as RCC decreased, GCC increased as leaves became greener in late spring (Figure 2F,G). We found that for these trees, the springtime amplitude in RCC (the increase from the dormant value to the spring maximum) was more than 45% of the spring amplitude in GCC (46% and 64% for the two trees), while for all other trees, it was less than 35%.
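The resulting index-selection rule is simple to state in code; a hypothetical helper using the 40% amplitude cutoff applied in Section 3.2:

```python
def choose_spring_index(gcc_amp, rcc_amp, threshold=0.4):
    """Select the color index for a tree's spring transition dates:
    RCC if its spring amplitude exceeds the threshold fraction of the
    GCC amplitude (indicating red spring leaves), otherwise GCC.
    Amplitudes are the rise from the dormant value to the spring
    maximum (hypothetical values below)."""
    return "RCC" if rcc_amp > threshold * gcc_amp else "GCC"

choose_spring_index(gcc_amp=0.10, rcc_amp=0.046)  # red-leaved oak -> "RCC"
choose_spring_index(gcc_amp=0.10, rcc_amp=0.030)  # typical tree   -> "GCC"
```

Because the observed trees fell clearly on either side (ratios above 45% or below 35%), any cutoff in between separates the two groups identically.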
The springtime increase in RCC was closer in time to observed leaf expansion than GCC, as well as a range of other color indices we considered (GCC + RCC, GRVI, ExG, Hue, Figure S2). We also note that the leaves of oak trees with red spring leaves had no apparent color difference in autumn from those of trees with green spring leaves (Figure 2K).

3.2. Leaf Life Cycle Events of Trees: Correspondence to Image Metrics

We determined the leaf life cycle events corresponding to image metrics by finding the closest decile of life cycle event (i.e., 10% budburst) to each metric (SOS, MOS, etc.) for the 30 trees under observation. Following our observation that RCC was a better indicator of spring leaf phenology than GCC for trees with red spring leaves, we used RCC metrics for the two trees where the spring amplitude in RCC was greater than 40% of spring amplitude in GCC, and GCC metrics for all other trees.

We found that SOS was closest to 10% budburst (RMSD 4.7 days), although the date of 10% budburst was not significantly correlated to SOS across trees (r = −0.04, p > 0.05). This low (and not statistically significant) correlation is likely related to the fact that there was relatively little variability in the timing of budburst across trees: all trees under observation had 10% budburst within a period of about a week (DOY 121-129), while SOS date uncertainties were 11 days on average (inner 95% confidence interval, i.e., ±5.5 days). However, we found that our determinations of the first observable leaves in aerial image orthomosaics were correlated with SOS (r = 0.46, p < 0.05), indicating SOS dates derived from color-based vegetation indices were associated with biologically meaningful phenological transitions at the branch-to-canopy level.
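The decile-matching procedure described above can be sketched as follows (synthetic example dates; for a given image metric, we pick the life cycle decile minimizing the RMSD across trees):

```python
import numpy as np

def best_matching_decile(metric_doy, decile_doy):
    """metric_doy: (n_trees,) image-derived transition dates (e.g., SOS).
    decile_doy: dict mapping a decile (10, 20, ...) to an (n_trees,)
    array of observed dates for that decile of a life cycle event
    (e.g., budburst). Returns (decile, rmsd) minimizing the
    root-mean-square difference across trees."""
    metric_doy = np.asarray(metric_doy, float)
    return min(
        ((d, float(np.sqrt(np.mean((metric_doy - np.asarray(v, float)) ** 2))))
         for d, v in decile_doy.items()),
        key=lambda pair: pair[1],
    )

# Hypothetical dates for three trees
sos = [122.0, 124.0, 126.0]
budburst_deciles = {
    10: [121.0, 125.0, 125.0],  # observed 10% budburst dates
    50: [128.0, 130.0, 131.0],  # observed 50% budburst dates
}
decile, rmsd = best_matching_decile(sos, budburst_deciles)  # -> (10, 1.0)
```

In the full analysis this comparison runs over every decile of every observed event (budburst, leaf size, leaf color, leaf fall) against every image-derived transition.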
We found that MOS was closest to 40% leaf size (RMSD 3.6 days), and EOS was closest to 70% leaf size (RMSD 6.9 days; spring time GCC, budburst, and leaf size observations with associated transition dates shown in Figure S5 for an example tree). Later spring transitions were more variable across trees, with a range of 14 days for 40% leaf size and 18 days for 70% leaf size (Figure 3A,B).