Unknown Warming: A joy division inspired plot of Temperature Anomalies

A visualization of temperature anomalies in the form of a Joy Division plot.

Authors

Dimitri Masson

Yann Girard

Published

February 3, 2025

Abstract

My friend Yann showed me a cool graph (Poynting & Rivault, 2024) he wanted to replicate, in which temperature anomalies are plotted by frequency rather than chronologically, resembling a Joy Division plot (inspired by the album cover of Unknown Pleasures by Joy Division). The graph is created by counting the number of days each anomaly occurs per year and offsetting the resulting curve for each year.

Two transformations are applied to the raw data (C3S, 2018) to enhance its appearance. First, the anomalies are corrected (ReesCatOphuls, 2024) to correspond to the 1850-1900 baseline; second, a Gaussian blur is applied to smooth out the graph.

You can play with the parameters to see how the plot changes: the opacity, the vertical scale of the years, the smoothing factor, the color, and the option to start the plot from the bottom or to display only the anomalies.

Below are some technical details on how the data is processed and the plot is generated.

// Plot an area chart for each year using Plot.plot
chart = Plot.plot({
  marks: [
    Plot.areaY(
        joyData, 
        { x: "anomaly"
        , y1: !fromBottom ? 0 : "reference"
        , y2: "height"
        , z: "year"
        , axisY: null
        , fillOpacity: opacity
        , stroke: "white"
        , fill: color  }),

    Plot.axisY(Array(2025-1940).fill(1).map((e,i)=>(i+1)*yearOffset), {
      label: "Year",
      grid: true,
      text: Array(2025-1940).fill(1).map((e,i)=>(""+(2024-i))),
      domain: [maxYear, minYear]
    }),
    Plot.axisX({
      label: "Temperature Anomaly",
      interval: 0.5
    }),
    // showZero? Plot.ruleX([0], {stroke: "red", strokeOpacity: 0.5}):null,
  ],
  x: {label: "Temperature Anomaly"},
  y: {label: "year"},
  
  width: 800,
  height: 1200,
  padding: 0.1,
  fill: "black",
  title: "Joy Division Plot of Temperature Anomalies",
})
Figure 1: Joy Division Plot of Temperature Anomalies
viewof opacity = Inputs.range([0,1], {step : 0.1, value: 0.9, label: "Opacity"})
viewof yearOffset = Inputs.range([1,8], {step : 1, value: 2, label: "Y Scale"})
viewof smoothFactor = Inputs.range([0,4], {step : 1, value: 3, label: "Smooth Factor"})
viewof fromBottom = Inputs.toggle( {label: "wave only", value: false})
viewof color = Inputs.color( {label: "Color", value: "#004055"})

DOM.download(() => serialize(chart.children[1]), undefined, "Download the SVG")
temp = t => Math.round(t * 100) / 100 + "°C";
bidimensionalTable = joyData.reduce((anomalies, d) => {
  if (d.year in anomalies) {
    anomalies[d.year][temp(d.anomaly)] = d.count
  } else {
    anomalies[d.year] = {year: d.year}
    anomalies[d.year][temp(d.anomaly)] = d.count
  }
  return anomalies
}, []).filter(d => d.year >= 1940)
Inputs.table(bidimensionalTable)

Technical details

In this section, we provide some technical details on how the data is processed and the plot is generated. The process is roughly divided into five steps:

1. Parsing the raw data from Copernicus (C3S, 2018)
2. Correcting the anomalies to correspond to the 1850-1900 baseline
3. Counting the anomalies per year
4. Applying a Gaussian blur to smooth out the graph
5. Offsetting the curve for each year to produce the Joy Division effect
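Abstracting away from Observable, the pipeline can be sketched as a handful of plain functions. This is an illustrative sketch, not the notebook's actual cells; the function names are hypothetical, and step 2 is simplified to the constant part of the correction.

```javascript
// Illustrative sketch of the pipeline; names are hypothetical,
// not the notebook's actual cells.

// Step 1: strip comment lines before CSV parsing
const stripComments = (text) =>
  text.split("\n").filter((line) => !line.trim().startsWith("#")).join("\n");

// Step 2 (simplified): shift an anomaly toward the 1850-1900 baseline
// using only the constant part of the correction
const correctAnomaly = (anomaly) => anomaly + 0.88;

// Step 3: count how many days fall into each bin, per year
const countPerYear = (rows) => {
  const counts = {};
  for (const { year, anomaly } of rows) {
    const bin = anomaly.toFixed(2);
    counts[year] = counts[year] || {};
    counts[year][bin] = (counts[year][bin] || 0) + 1;
  }
  return counts;
};
```

Steps 4 (Gaussian blur) and 5 (per-year offset) are covered in their own sections below.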

Parsing the data

We start by downloading the daily global mean near-surface (2m) air temperature for 1940-2024 from the ERA5 dataset (C3S (2018)) on the Copernicus website. You can access the values used in this visualisation here. One tricky thing is that the CSV file contains comment lines (starting with #). D3 does not handle these by default, so we need to remove them before parsing the data.

rawData = FileAttachment("./era5_daily_series_2t_global.csv").text().then(processCSV);

// Function to preprocess and parse the CSV
function processCSV(content) {
  // Remove comment lines (lines starting with #) 
  const lines = content.split("\n");
  const filteredLines = lines.filter(line => !line.trim().startsWith("#"));
  const csvContent = filteredLines.join("\n");

  // Parse the CSV content
  return d3.csvParse(csvContent, d3.autoType).map( 
    d => ({ year: d.date.getFullYear()
          , day : d3.timeDay.count(d3.timeYear(d.date), d.date) + 1
          , anomaly: d["ano_91-20"] })
  )
}

maxYear = d3.max(rawData, d => d.year)
minYear = d3.min(rawData, d => d.year)

// Display the parsed data
Inputs.table(rawData)
1. Filter out the comment lines from the CSV content.
2. Anomalies will be grouped by year.
3. The correction formula (Equation 1) needs the day of the year.
4. The raw data contains the anomaly relative to the 1991-2020 average.
5. The min and max years are kept for the axes of the plot.

Figure 2: Raw Data

Data correction

The anomalies in the dataset are computed relative to the 1991-2020 average. The current convention in terms of global warming (Paris, …) is to compare against the pre-industrial era. ReesCatOphuls (2024) proposed three methods to correct the anomalies to correspond to the 1850-1900 baseline. In this visualisation we retained the second method (Equation 1), which offsets the temperature by about \(0.88°C\).

\[ \text{corrected} = \text{anomaly} + 0.88°C + 0.05°C \sin\left(\frac{2\pi \times (day - 0.5)}{days}\right) + 0.07°C \cos\left(\frac{2\pi \times (day - 0.5)}{days}\right) \tag{1}\]

Figure 3 shows the correction that is applied depending on the day of the year.
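As a quick sanity check, Equation 1 can be evaluated in plain JavaScript outside the notebook. This is a standalone re-implementation for illustration, not the notebook's own `correction` cell (which appears further below).

```javascript
// Standalone evaluation of Equation 1 (re-implemented for illustration).
function correctionFor(day, days = 366) {
  const angle = (2 * Math.PI * (day - 0.5)) / days;
  return 0.88 + 0.05 * Math.sin(angle) + 0.07 * Math.cos(angle);
}

// Near January 1st the cosine term dominates: roughly 0.88 + 0.07 = 0.95°C.
// Near mid-year the cosine flips sign: roughly 0.88 - 0.07 = 0.81°C.
```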

correctionData = d3.timeDay
  .range(new Date(2024, 0, 1, 12), new Date(2025, 0, 1), 1)
  .map( (e,i)=> (
    { date : e
    , year: 2024
    , day: i + 1 
    , correction: correction({year: 2024, day: i + 1})
    }
  )) 

// Line plot of the correction by day of the year
Plot.plot(
  { marks: 
    [ Plot.dot(correctionData, 
      { x: "date"
      , y: "correction"
      , fill: "black"
      , r: 1 
      })
    , Plot.ruleY([.88], 
      {stroke: "red"
      , strokeOpacity: 0.5
      })
    ]
  , x: {label: "Day of the year"}
  , y: {label: "Correction"}
  , width: 800
  , height: 400
  , padding: 0.1
  })
Figure 3: Visualisation of the temperature correction following method 2 (Equation 1)
correction = ({year, day}) => {
    // Leap-year check: valid for 1940-2024 (no century exception in range)
    const days = !(year % 4) ? 366 : 365
    const angle = (2 * Math.PI * (day - 0.5)) / days
    const correction = 0.88 + 0.05 * Math.sin(angle) + 0.07 * Math.cos(angle)
    // Truncate to two decimal places
    return Math.floor(100 * correction) / 100
}

correctedData = rawData.map( 
    d => (
        { year:  d.year
        , anomaly: d.anomaly
        , corrected : d.anomaly + correction(d) } ))

// Keep the min and max for discretization later on
offset = 0.1
minAnomaly = d3.min(correctedData, d => d.corrected) - offset
maxAnomaly = d3.max(correctedData, d => d.corrected) + offset
Inputs.table(correctedData)
1. Checking whether the year is a leap year; a straightforward calculation for 1940-2024.
2. Formula 2 from ReesCatOphuls (2024) is used to correct the anomalies to the 1850-1900 baseline temperature.
3. The offset is used in the discretization of the anomalies to get an empty bin for the first and last intervals.

Figure 4: Corrected Data

Grouping the anomalies by year

For each year we count the number of days falling into each temperature-anomaly bin. The anomalies are discretized into 0.01°C intervals.

length = (maxAnomaly - minAnomaly) * 100
indexF = (x) => Math.round((x - minAnomaly) * 100)
indexF_1 = (x) => (x / 100) + minAnomaly

anomalies = correctedData.reduce((anomalies, d) => {
    if (d.year in anomalies) {
      anomalies[d.year][indexF(d.corrected)] += 1;
    } else {
      anomalies[d.year] = Array(length).fill(0);
      anomalies[d.year][indexF(d.corrected)] = 1;
    }
    return anomalies;
  }, []);

Inputs.table(anomalies.filter( (e,i) => i >= 1940))
1. The indexF function discretizes the anomalies into 0.01°C intervals.
2. The indexF_1 function recovers the original value from the discretized index.
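Outside the notebook, the two helpers can be round-tripped to check that they are inverses up to the rounding. Here minAnomaly is an illustrative stand-in; the notebook derives it from the data.

```javascript
// Round-trip check of the discretization helpers.
// minAnomaly is an illustrative stand-in for the data-derived value.
const minAnomaly = -0.5;
const indexF = (x) => Math.round((x - minAnomaly) * 100);
const indexF_1 = (i) => i / 100 + minAnomaly;
// indexF_1(indexF(x)) recovers x to the nearest 0.01°C
```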

Smoothing the Data with a Gaussian Blur

To smooth the data, we apply a Gaussian blur to the anomaly counts by computing a weighted average of neighboring cells. The kernels are precomputed for 5 different blur sizes using a binomial distribution. Only the positive half of each kernel is stored, for simplicity, since the kernels are symmetric. You can visualize the kernels in Figure 5.

kernels = [
      [1],
      [2, 1],
      [6, 4, 1],
      [20, 15, 6, 1],
      [70, 56, 28, 8, 1],
    ];
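These half-kernels are the upper halves of even rows of Pascal's triangle, i.e. the binomial coefficients C(2k, k) through C(2k, 2k). A quick sketch to verify (the helper names are illustrative):

```javascript
// Generate the half-kernel of size k from binomial coefficients and
// compare with the hard-coded tables above.
const binomial = (n, k) => {
  let r = 1;
  for (let i = 1; i <= k; i++) r = (r * (n - i + 1)) / i;
  return Math.round(r);
};
const halfKernel = (k) =>
  Array.from({ length: k + 1 }, (_, j) => binomial(2 * k, k + j));
```

For example, halfKernel(4) gives [70, 56, 28, 8, 1], matching the largest kernel above.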
kernelVisu = kernels.flatMap( (kernel, i) => 
      kernel.flatMap( (e, j) => (j == 0)? Array(e).fill( {i, j } ) : Array(e).fill( [{i, j }, {i, j: -j} ]) ) )
      .flat()
      
Plot.plot({ 
  marks: [
    Plot.barY(kernelVisu, 
      Plot.binX({y: "proportion-facet"}, 
        { x: "j"
        , fx: "i"
        , stroke: "black"
        , fill: "black"}))
      , Plot.axisY({label: "Weight", tickFormat: "%"})
 
  ]
  , grid: true
  , x:
    { tickFormat: (d) => d == 0 ? "0" : Math.floor( +d)
    , label: "Kernel Offset"}
  , fx: 
    { label: "Kernel Size"
    , anchor: "bottom"
  }
  , y: 
    {inset: 5
    , label: "Weight"
    , tickFormat: "%"
  }
})
Figure 5: Kernel Visualization
joyData =  {
  //  Gaussian Blur smoothing 
  let joyData = [];
  for (let year in anomalies) { 
    const kernel = kernels[smoothFactor];
    for (let i = 0; i < length; i++) {
      let result = 0;
      let weight = 0;
      for (let j = 1 - kernel.length; j < kernel.length; j++) {
        if (i + j >= 0 && i + j < length) {
          const absJ = Math.abs(j);
          result += anomalies[year][i + j] * kernel[absJ];
          weight += kernel[absJ];
        }
      }
      joyData.push({ 
        year: year,
        anomaly: indexF_1(i), 
        count: result / weight,
        height: result / weight + (maxYear - year) * yearOffset,
        reference: (maxYear - year) * yearOffset,
      });
    }
  }
  return joyData.filter((d) => d.year != maxYear);
}

joyData
1. One of 5 kernels for a Gaussian blur of size 1 to 5.
2. To obtain the Joy Division effect, the height is the count offset by the year (reversed so that 2024 is at the bottom).
3. 2025 is removed from the data as it is not complete.
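The inner loop above drops kernel weights that fall outside the array and renormalizes by the remaining weight, so the blur behaves sensibly near the boundaries. A standalone 1-D version of the same blur, re-implemented here for clarity:

```javascript
// Standalone 1-D Gaussian blur with the same edge handling as the
// notebook cell: out-of-range weights are dropped and the sum is
// renormalized by the remaining weight.
function blur1d(values, halfKernel) {
  return values.map((_, i) => {
    let result = 0;
    let weight = 0;
    for (let j = 1 - halfKernel.length; j < halfKernel.length; j++) {
      if (i + j >= 0 && i + j < values.length) {
        result += values[i + j] * halfKernel[Math.abs(j)];
        weight += halfKernel[Math.abs(j)];
      }
    }
    return result / weight;
  });
}

// Thanks to the renormalization, blurring a constant signal leaves it
// unchanged; an impulse is spread out as [..., 0.25, 0.5, 0.25, ...].
```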
serialize = {
  const xmlns = "http://www.w3.org/2000/xmlns/";
  const xlinkns = "http://www.w3.org/1999/xlink";
  const svgns = "http://www.w3.org/2000/svg";
  return function serialize(svg) {
    svg = svg.cloneNode(true);
    const fragment = window.location.href + "#";
    const walker = document.createTreeWalker(svg, NodeFilter.SHOW_ELEMENT);
    while (walker.nextNode()) {
      for (const attr of walker.currentNode.attributes) {
        if (attr.value.includes(fragment)) {
          attr.value = attr.value.replace(fragment, "#");
        }
      }
    }
    svg.setAttributeNS(xmlns, "xmlns", svgns);
    svg.setAttributeNS(xmlns, "xmlns:xlink", xlinkns);
    const serializer = new window.XMLSerializer;
    const string = serializer.serializeToString(svg);
    return new Blob([string], {type: "image/svg+xml"});
  };
}

References

C3S. (2018). ERA5 hourly data on single levels from 1940 to present [Dataset]. Copernicus Climate Change Service (C3S) Climate Data Store (CDS). https://doi.org/10.24381/CDS.ADBB2D47
Poynting, M., & Rivault, E. (2024, January 9). 2023 confirmed as world’s hottest year on record. BBC News. https://www.bbc.com/news/science-environment-67861954
ReesCatOphuls. (2024, June 4). Copernicus 1850-1900 Baseline – Daily GMST Anomaly - Paris Agreement Temperature Index. https://parisagreementtemperatureindex.com/copernicus-1850-1900-baseline-daily-gmst/

Citation

BibTeX citation:
@misc{masson2025_UnknownWarming,
  author = {Masson, Dimitri and Girard, Yann},
  title = {Unknown {Warming:} {A} Joy Division Inspired Plot of
    {Temperature} {Anomalies}},
  date = {2025-02-03},
  url = {https://dhmmasson.github.io/projects/DataViz/temperatureAnomalies.html},
  langid = {en}
}
For attribution, please cite this work as:
Masson, D., & Girard, Y. (2025, February 3). Unknown Warming: A joy division inspired plot of Temperature Anomalies. Dhmmasson.github.io. https://dhmmasson.github.io/projects/DataViz/temperatureAnomalies.html
  • © 2024 Dimitri Masson
  • Built with Quarto
  • Source code available on GitHub