Under the Uzès Sun: When Historical Data Reveals Climate Change



Living in Uzès, in the South of France, I am biologically required to endure the same loop of small talk every year: “It’s boiling, isn’t it? Way hotter than 2020,” or the classic, “Back in my day, we actually had four seasons, not just ‘Pre-Oven’ and ‘Deep Fryer.’”

Honestly, I’m tempted to nod along and complain too, but I have the memory of a goldfish and a brain that demands cold, hard facts before joining a rant. Since I can’t remember if last July was “sweaty” or “molten,” I’d love to have some actual data to back up my grumbling.

I work at icCube. It’s basically a professional sin for me to get into a data-driven argument without bringing enterprise-level tooling to a back-of-the-napkin debate.

At the next apéro, when someone starts reminiscing about how “1976 was the real scorcher,” I shouldn’t just be nodding politely while nursing my pastis. I should be whipping out a high-performance, pixel-perfect dashboard that visualizes their nostalgia right into oblivion. If I can’t use multi-dimensional analysis to prove that our sweat glands are working harder than they did in the seventies, then what am I even doing with my life?

While this journey began as a quest to settle a local argument in the South of France, this post goes beyond the climate debate. It serves as a blueprint for a classic data challenge: how to architect a high-performance analytical system that makes sense of decades of historical data, a pattern that applies to any domain requiring historical-versus-current benchmarking.

The Battle Plan

Here is the plan mapping out our tactical strike against vague nostalgia and anecdotal evidence:

  1. Scouting the Intel: Hunting down the raw numbers because “it feels hot” isn’t a metric, and we need the high-octane stuff.
  2. Building the War Room: Architecting a structure robust enough to hold decades of heatwaves without breaking a sweat.
  3. The Analytical Sledgehammer: Deploying the heavy-duty logic required to turn raw data into undeniable, nostalgia-incinerating proof.
  4. The Visual “I Told You So”: Designing the pixel-perfect dashboard to end any apéro argument in three seconds flat.
  5. Post-Victory Lap: Now that we’ve conquered the climate debate, what other domestic myths shall we incinerate with data?

Scouting the Intel

Data is central to our mission. Therefore, we need to secure accurate, high-fidelity historical temperature records from France.

Météo-France, the national meteorological and climatological service, is a public establishment of the French State. It makes the data produced as part of its public service missions available to everyone through the public data portal data.gouv.fr. God bless public data portals. While half the world’s data is locked behind paywalls and registration forms that ask for your blood type, France just… hands it over. Liberté, égalité, température.

The data used in this post is made available under the Open License 2.0.

The Observations

Climatological (daily/hourly) data from all metropolitan and overseas weather stations since their opening, for all available parameters. The data have undergone climatological control: www.

The Weather Stations

Characteristics of the weather stations in operation in metropolitan France and the overseas territories: www.

Early Analysis & Transformations

Like Saint Thomas, I like to look at the actual data myself: get a good first feel for it and run a few sanity checks before drawing any conclusions later on.

To keep things clean, I’ve been extracting raw temperature data from the pile of observations we have. Being an unrepentant Java geek, I’ve built a collection of classes for this mission and tossed them into a GitHub project. Feel free to tear through the code and reuse it as much as you like.

I’m not going to bore you with a dry lecture on the data right now. That would be like serving a lukewarm rosé: absolutely criminal, and possibly illegal in certain Provençal villages.

I’ll be diving into the gritty details when needed.

Building the War Room

If we’re going to settle these terrace debates once and for all, we can’t just turn up with a spreadsheet and a dream. We need an OLAP schema: a structure so robust it makes the local historical stone masonry look flimsy. We’re keeping it lean for this specific fight, but trust me, it’s built to scale when the next “mildest winter ever” argument inevitably breaks out.

Let’s break down the architecture.

The Dimensions

  • Stations: Lets us pinpoint the exact weather station on the map of France, because saying “somewhere in the South” won’t cut it. We need coordinates, names, the works.
  • Time/Calendar: The usual suspects: years, months, days. Boring? Sure. Essential for proving your neighbor’s memory is garbage? Absolutely. We’re tossing in Months and Days of Month to fuel a calendar widget that will let me point at any specific date and say: “See? July 1st, 2025 was an absolute hellscape”. Precision is key when you’re ruining someone’s nostalgic buzz.

The Facts (a.k.a. Measures)

  • Temperatures: The “Holy Trinity” of data points—Average, Maximum, and Minimum. This is the primary input for our “Deep Fryer” versus “Pre-Oven” analysis.

The full schema definition is parked over in the GitHub project with the source code, ready for when you’re feeling particularly vengeful.

The Cube

The final result? A loaded schema containing more than 500 million rows of French temperature data stretching back to 1780. Is it absolute overkill for a casual chat over olives? Of course it is. That’s the point.

It gives us a playground to hack into other metrics later on. But let’s save those for when we really want to make people regret bringing up the weather in the first place.

The Analytical Sledgehammer

Time to build the query that will shut down the next apéro debate in three seconds flat.

To cut through the noise, I’m using the MDX language: a query language specifically designed for this kind of multi-dimensional heavy lifting. To prove that we are indeed living in a “Deep Fryer”, I’m going to compare each day’s temperature against a historical reference period.

If you don’t speak MDX, skip to the pretty picture. The query basically tells the data engine to find the average “normal” for this specific day over 30 years and subtract it from today’s temperature.

First, the reference period (a.k.a. our normal baseline) is defined as a static set using the range operator (here, 1991 – 2020):

with
  static set [Period] as { 
    [Time].[Time].[Year].[1991] : [Time].[Time].[Year].[2020] 
  }

“Why 30 years?” Because that’s what climatologists and the World Meteorological Organization decided counts as “normal” before the planet started experimenting with new thermostat settings. It’s the gold standard for a “climatological normal”: long enough to smooth out the weird years, short enough to still remember what “normal” used to feel like.

The daily average temperature is defined as the average of the day’s maximum and minimum temperatures. I’ve experimented with hourly averages; the results are nearly identical. So let’s stick to this simple and well-accepted definition:

with
  [T_Avg_Daily] as 
    ( [Measures].[Temperature (max.)] + [Measures].[Temperature (min.)] ) / 2
    , FORMAT_STRING=".#"

Now, we need to know what the temperature should be. We calculate the average of those daily temperatures aggregated over our reference period:

with
  [T_Avg_Period] as 
    avg( [Period], [T_Avg_Daily] )
    , FORMAT_STRING=".#"

Finally, we calculate the difference, measuring exactly how much hotter (or colder) it is today compared to those past years. This delta puts a precise number on our collective sweat:

with
  [T_Avg_Diff] as 
    IIF( isEmpty( [T_Avg_Daily] ), null, [T_Avg_Daily] - [T_Avg_Period] )

Putting it all together, here is the MDX query that compares the 2025 daily temperatures in Uzès against the historical record:

with
  static set [Period] as { 
    [Time].[Time].[Year].[1991] : [Time].[Time].[Year].[2020] 
  }

  [T_Avg_Daily] as 
    ( [Measures].[Temperature (max.)] + [Measures].[Temperature (min.)] ) / 2
    , FORMAT_STRING=".#"

  [T_Avg_Period] as 
    avg( [Period], [T_Avg_Daily] )
    , FORMAT_STRING=".#"

  [T_Avg_Diff] as 
    IIF( isEmpty( [T_Avg_Daily] ), null, [T_Avg_Daily] - [T_Avg_Period] )

select
  [Time].[Months].[Months] on 0,
  [Time].[Days of Months].[Days of Months] on 1
  
  from [Observations]

where [T_Avg_Diff]

filterby [Time].[Time].[Year].&[2025-01-01T00:00:00.000]
filterby [Station].[Station].[Name].&[30189001] -- Nîmes Courbessac

The attentive reader will notice I’ve swapped the local Uzès station for the Nîmes-Courbessac station. Why? Because I need that sweet, sweet historical data to fuel my “back in my day” comparisons, and Nîmes simply has a longer memory. It’s right next door, so the temperatures are virtually identical, though, if I’m being honest, Nîmes usually runs a bit hotter.

Image by the author.

In the next section, I’ll show you how to splash some color on these values so you can spot the heatwaves at a glance.

The Visual “I Told You So”

So it’s time to stop staring at raw code and actually build a visual for that MDX result. My plan? Cram the entire year into a single 2D grid, because looking at a scrollable list of 365 dates is a one-way ticket to a migraine.

The setup is simple: months across the horizontal axis, days of the month on the vertical. Each cell holds the temperature delta, that is, the difference in degrees Celsius between 2025 and our reference period. To make it “idiot-proof” for the next time I’m three pastis deep, I’ve applied a heat map: the hotter the day was compared to the past, the redder the cell; the colder, the bluer.
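The coloring itself lives in the dashboard widget, not in the MDX. That said, classic (SSAS-style) MDX defines a BACK_COLOR cell property that some clients honor, so you could sketch the same two-tone idea directly in the query. Here is a minimal, hypothetical variant of [T_Avg_Diff]; the colors are arbitrary picks, and you’d need to check that your engine and widget actually support this property:

with
  [T_Avg_Diff_Colored] as 
    IIF( isEmpty( [T_Avg_Daily] ), null, [T_Avg_Daily] - [T_Avg_Period] )
    -- reddish when above the reference period, bluish when below
    -- (assumption: the engine supports RGB() and the BACK_COLOR property)
    , BACK_COLOR = IIF( [T_Avg_Daily] - [T_Avg_Period] > 0, RGB(230, 80, 60), RGB(90, 130, 230) )

The select would then need to end with CELL PROPERTIES VALUE, FORMATTED_VALUE, BACK_COLOR so the property is actually returned to the client.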

Full disclosure: I am not a “visual guy.” My aesthetic preference usually begins and ends with “does the query return in under 50 milliseconds?” But even with my lack of artistic flair, the data speaks for itself.

Image by the author.

One glance at this grid and it’s painfully clear: 2025 isn’t just “a bit mild.” It’s a sea of angry crimson that proves our reference period belongs to a world that was significantly less “pre-oven.” If this doesn’t shut down the “back in my day” crowd at the next apéro, nothing will.

My Nostalgia Years (1980-2000)

I’m recalibrating the baseline to match the years of my youth. Shifting the reference period to those “glory days” shows my brain wasn’t exaggerating; the data confirms a clear shift from the manageable summers of the past to this new intensity.
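Concretely, the only change to the query is the [Period] set; everything else stays exactly as before:

with
  static set [Period] as { 
    [Time].[Time].[Year].[1980] : [Time].[Time].[Year].[2000] 
  }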

Image by the author.

No wonder the lavender is stressed.

#Days > 35°C

I started getting curious: was it just my imagination, or is the “oven” setting on this planet actually speeding up? I decided on a quick exercise: counting how many days per year the thermometer hits or cruises past the 35°C mark.
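In MDX, that’s just a count over a filtered set of days. Here is a minimal sketch, assuming the [Time].[Time] hierarchy exposes a [Day] level below [Year] (the exact level name may differ in the actual schema):

with
  [NbDays_Above_35] as 
    count(
      filter(
        -- all days of the year currently being evaluated
        descendants( [Time].[Time].currentMember, [Time].[Time].[Day] ),
        [Measures].[Temperature (max.)] >= 35
      )
    )

select
  [Time].[Time].[Year].members on 0

  from [Observations]

where [NbDays_Above_35]

filterby [Station].[Station].[Name].&[30189001] -- Nîmes Courbessac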

Image by the author.

To the surprise of absolutely nobody, the data confirms the “pre-oven” phase is shrinking, and the “deep fryer” era is officially taking over.

2003: When Summer Became a Tragedy

There, in the data, a stark peak that towers above all others. The summer of 2003. Fifteen thousand people didn’t survive those relentless days above 35°C. In France alone. A nation that hadn’t understood how deadly heat could be. The chart doesn’t capture the empty chairs at dinner tables that autumn, the families forever changed, the realization that came too late.

These charts don’t prove global climate change on their own; they simply document local lived reality with rigor.

Post-Victory Lap

And that is how you turn a casual sunset drink into a data-driven interrogation.

We’ve officially unleashed the data and MDX to prove that “it used to be cooler” isn’t just a senior citizen grumbling after one too many Ricards; it’s a verifiable fact. Is bringing a multi-dimensional heatmap to a social gathering the fastest way to lose friends and stop getting invited to apéros? Probably. But is the silence that follows a perfectly executed “I told you so” worth it? Every single time.

Data won’t stop the heat, but it will hopefully stop the bad arguments about it.

The “Mistral Madness” Index

Now that the heat is settled, I’m setting my sights on the legendary Mistral. In every village square from Valence to Marseille, there is a sacred “Rule of 3” that says once the Mistral starts, it must blow for 3, 6, or 9 days. It’s the kind of local numerology that people defend with their lives.

I’m already prepping a new “Wind-Chill” schema to cross-reference hourly gust speeds with this calendar myth. I want to see if the wind actually cares about multiples of three, or if it’s just our brains trying to find patterns in the chaos while our shutters are rattling.


If you’ve enjoyed watching me over-engineer a solution to a casual conversation, follow my descent into analytical madness over on Medium. We’re just getting started.
