I have a long-running Raspberry Pi project to make time-lapse movies of a countryside view (see previous posts). It’s been running for over two years and the results are great. The camera takes images every ten minutes and, by assembling movies with different time intervals, you can see seasons change, trees grow, the sun set and snow melt.
The sky is mesmerising to watch. Here are two movies, one in April 2020 and one in April 2021. The whole month is shown (each frame is ten minutes, daylight hours only).
These movies are fun to watch. There’s something quite soothing about watching the clouds rush by, change direction or clear out of the way.
Here in the UK, April brings plenty of cloudy days but also some blue skies. If we want to describe a patch of sky, how can we do it?

This is the output from some code I wrote to sample the sky on each day. The approach was to load the image taken at midday, take a section of sky, then average it.
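To make the approach concrete, here is a cut-down sketch of just the sampling step (the function, wave and path names here are placeholders; the full routine is in the Code section below):

Function SampleSkyPatch(fileName)
	String fileName
	// load the midday image (assumes a symbolic path called diskFolder has been created)
	ImageLoad/T=jpeg/Q/N=midday/P=diskFolder/Z fileName
	WAVE/Z midday
	if(!WaveExists(midday))
		return -1
	endif
	// copy a 1280 x 300 pixel patch of sky, starting at (1000, 50), into its own RGB wave
	Make/O/N=(1280,300,3) skyPatch
	skyPatch[][][] = midday[1000 + p][50 + q][r]
	KillWaves/Z midday
	return 0
End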
Averaging RGB values
I found this unintuitive, but RGB images need to be averaged differently from how I first assumed. In microscopy, an RGB image is usually a composite of three independent channels, and obtaining the average is as simple as averaging the red, the green and the blue channels separately (summing the pixel values and dividing by the number of pixels); the resulting triplet is the average view. In a photographic RGB image, however, each pixel describes the colour at a location rather than the colocation of three separate signals. How satisfying that is as an explanation, I don’t know. Either way, I needed a different approach: for each channel, take the sum of the squares of the pixel values, divide by the number of pixels, and take the square root (a root-mean-square average).
There are examples online showing that this approach (alternatives exist) gives a better description of the pixels being averaged, with colours that are less dull and washed out than those produced by plain averaging.
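As a minimal sketch of the difference for a single channel (assuming an 8-bit RGB image held in a wave called myImage, which is just a placeholder name):

Function CompareChannelAverages()
	WAVE myImage	// assumed 3D RGB image wave: rows x columns x 3 layers
	MatrixOp/O/FREE red = layer(myImage,0)	// pull out the red channel as a 2D wave
	Variable nPix = numpnts(red)
	// plain average: sum the pixel values and divide by the number of pixels
	MatrixOp/O/FREE plainW = sum(red)
	Variable plainAvg = plainW[0] / nPix
	// RMS average: sum the squares, divide by the number of pixels, take the square root
	MatrixOp/O/FREE sqW = sumsqr(red)
	Variable rmsAvg = sqrt(sqW[0] / nPix)
	Print plainAvg, rmsAvg	// the RMS value is never lower than the plain mean
End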
Using this method, the patch of sky can be described as a single colour and, although we are only looking at one time point, midday is probably a representative time to describe the sky on that day.
It’s striking how many blue sky days we had in March to May 2020 compared with the same period in 2021.
Code
This is how the averaging was done (the plotting was handled by a separate function). Most of this is file-import housekeeping; the magic is the part after the image is loaded in the for-loop.
Function GetAverageValues()
	// Images are 3280 x 2464
	// Sky patch is 1280 wide, 300 tall, starting at 1000, 50
	// build text waves of dates (YYYY-MM-DD) for 2019, 2020 and 2021
	Variable BaseDate = Date2secs(2019,1,1) + (12 * 60 * 60)
	Make/O/N=(365)/T date2019
	date2019[] = Secs2Date(BaseDate + (p * 24 * 60 * 60),-2)
	Duplicate/O/T date2019, date2020, date2021
	date2020[] = ReplaceString("2019-",date2019[p],"2020-")
	date2021[] = ReplaceString("2019-",date2019[p],"2021-")
	Concatenate/O/T/NP=0/KILL {date2019,date2020,date2021}, theDateW
	Variable nDays = numpnts(theDateW)
	// make a matrix to hold the excerpt in
	Make/O/N=(1280,300,3)/D/FREE excerpt
	// wave to hold results (one RGB triplet per day)
	Make/O/N=(nDays,3) resultW
	Variable pixels = 1280 * 300
	NewPath/O/Q/M="Please find disk folder" diskFolder
	if (V_flag != 0)
		DoAlert 0, "Disk folder error"
		return -1
	endif

	String fileName
	Variable i

	for (i = 0; i < nDays; i += 1)
		fileName = theDateW[i] + "_1200.jpg"
		if(mod(i,30) == 0)
			Print fileName	// progress report every 30 files
		endif
		ImageLoad/T=jpeg/Q/N=image/P=diskFolder/Z fileName
		WAVE/Z image
		if(WaveExists(image))
			// copy the patch of sky out of the full image
			excerpt[][][] = image[1000 + p][50 + q][r]
			// root-mean-square average of each channel
			MatrixOp/O/FREE rgbValue = sqrt(sumsqr(excerpt) / pixels)
			resultW[i][] = rgbValue[0][0][q]
			KillWaves/Z image
		else
			resultW[i][] = NaN	// no image found for this day
		endif
	endfor
	// store the daily averages as unsigned 8-bit RGB values and plot them
	Make/O/N=(dimsize(resultW,0),dimsize(resultW,1))/U/B rgbW
	rgbW[][] = resultW[p][q]
	ArraySpots()
End
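ArraySpots() is the separate plotting function mentioned above and is not shown here. As a rough sketch of one way to display the result (not the actual ArraySpots code), each day could be drawn as a marker coloured directly by its average sky colour; Igor’s direct RGB colours are 16-bit, so the 8-bit values need scaling:

Function PlotSkyColours()
	WAVE resultW	// nDays x 3 wave of RMS-averaged RGB values from GetAverageValues()
	Variable nDays = dimsize(resultW,0)
	// direct RGB colours run 0-65535, so scale the 8-bit averages up
	Make/O/N=(nDays,3)/U/W colorW
	colorW[][] = (numtype(resultW[p][q]) == 0) ? resultW[p][q] * 257 : 0	// missing days plot as black
	// one marker per day at a fixed height, coloured by that day's sky colour
	Make/O/N=(nDays) skyY = 1
	Display skyY
	ModifyGraph mode(skyY)=3, marker(skyY)=19
	ModifyGraph zColor(skyY)={colorW,*,*,directRGB}
End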
—
The title for this post comes from “Blue Monday” by New Order. I have ten versions of this track by New Order in my library, plus a further two cover versions. I’ll go with the version on Substance 1987.