
Alaska Fried Chicken: the UK’s curious approach to naming chicken shops.

I went a little bit viral a couple of weeks ago when I tweeted about chicken shops in the UK which are named after American states which aren’t Kentucky. If I’d thought about it, I’d have written this blog up first, created a Tableau Public viz, and had all kinds of other shit ready to plug once I started getting some serious #numbers… but I didn’t. So, to make up for that, this blog will go through that thread in more detail and answer a few questions I received along the way.

It all started when I walked past Tennessee Fried Chicken in Camberwell, pretty close to where I live. It’s clearly a knock-off KFC, and I wanted to know how many other chicken shops had the same name format: [American state] Fried Chicken.

The first thing to do is to get a list of all the restaurants in the UK. I spent a while wondering how to get this data, but then I remembered that my colleague Luke Stoughton once built a Tableau Public dashboard about food hygiene ratings in the UK. All UK chicken shops – hopefully! – are inspected by the Food Standards Agency. So, Luke kindly showed me his Alteryx workflow for scraping the data from the FSA API, and I adjusted it to look for chicken shops.
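
I can't share Luke's Alteryx workflow here, but for anyone wanting to reproduce the scrape: the FSA publishes this data through its public food hygiene ratings API. A minimal Python sketch of the same idea (the endpoint and version header follow the FSA's published API conventions; treat the exact parameter names as assumptions to check against the official docs):

```python
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.ratings.food.gov.uk"  # the FSA's public hygiene ratings API

def build_search_url(name, page_size=5000, page=1):
    """Build an establishment-search URL for a name fragment."""
    params = urllib.parse.urlencode(
        {"name": name, "pageSize": page_size, "pageNumber": page}
    )
    return f"{API_BASE}/Establishments?{params}"

def fetch_establishments(name):
    """Fetch establishments whose registered name contains `name`."""
    req = urllib.request.Request(
        build_search_url(name),
        headers={"x-api-version": "2", "accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["establishments"]
```

From there it's just a matter of filtering the returned names, which is what the rest of this post does.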

My first line of inquiry is pretty stringent: how many chicken shops in the UK are called “X Fried Chicken” where X is an American state which isn’t Kentucky?

Turns out it’s 34. “Tennessee Fried Chicken” – including variants such as Tenessee and Tennesse – is the most popular with 13 chicken shops. The next highest is Kansas with six, which I’m assuming is so the owners can refer to their shops as KFC, although maybe the owner/s just really like tornadoes, wheat, and/or the Wizard of Oz. Then there’s four Californias, a couple of Floridas, and one each of Arizona, Georgia, Michigan, Mississippi, Montana, Ohio, Texas, and Virginia.
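
For the curious, the strict filter is nothing clever – an anchored match of “[state] Fried Chicken” against the 49 non-Kentucky state names. A Python sketch of the idea (my actual workflow was in Alteryx, and spelling variants like “Tenessee” were folded in separately):

```python
import re

# All 50 US state names, minus Kentucky (we want the knock-offs only).
US_STATES = [
    "Alabama", "Alaska", "Arizona", "Arkansas", "California", "Colorado",
    "Connecticut", "Delaware", "Florida", "Georgia", "Hawaii", "Idaho",
    "Illinois", "Indiana", "Iowa", "Kansas", "Louisiana", "Maine",
    "Maryland", "Massachusetts", "Michigan", "Minnesota", "Mississippi",
    "Missouri", "Montana", "Nebraska", "Nevada", "New Hampshire",
    "New Jersey", "New Mexico", "New York", "North Carolina", "North Dakota",
    "Ohio", "Oklahoma", "Oregon", "Pennsylvania", "Rhode Island",
    "South Carolina", "South Dakota", "Tennessee", "Texas", "Utah",
    "Vermont", "Virginia", "Washington", "West Virginia", "Wisconsin",
    "Wyoming",
]

# Anchored: the name must be exactly "<state> Fried Chicken".
STRICT = re.compile(
    r"^(" + "|".join(US_STATES) + r")\s+Fried\s+Chicken$", re.IGNORECASE
)

def is_state_fried_chicken(name):
    """True for names of the exact form '<state> Fried Chicken'."""
    return bool(STRICT.match(name.strip()))
```

The anchoring is what keeps “DC Fried Chicken” and “South Harrow Tennessee Fried Chicken” out of the strict count.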

[image 1: state fried chicken map]

[tangent: I’m aware that a lot of these states aren’t exactly famed for their fried chicken, but as a Brit, all I have to go on for most of them are my stereotypes from American media. But hey, maybe it’s still accurate, and Ohio Fried Chicken tastes of opiates and post-industrial decline, Arizona Fried Chicken comes pre-pulped for the senior clientele who can’t chew so well these days, and Florida Fried Chicken is actually just alligator. Michigan Fried Chicken is, I dunno, fried in car oil rather than vegetable oil, and Alaska Fried Chicken is their sneaky way of dealing with the bald eagle problem up there? I’m running out of crude state stereotypes now, I’m afraid. Out of all these states, I’ve only actually been to California.]

There’s also a “DC Fried Chicken”, which is close but not quite close enough for me, and a “South Harrow Tennessee Fried Chicken”, which I’m not counting either.

Here is where these American State Fried Chicken shops are in the UK:

[image 2: UK map]

Interestingly, this isn’t simply a map of population distribution. The shops cluster around the London and Manchester regions, with almost none in any other major urban centre.

Let’s have a look at the clusters separately. Here’s the chicken shops around the Manchester area:

[image 2.1: greater North West map]

None of them are in the centre of Manchester itself; they’re in the surrounding towns. One town in particular stands out: Oldham. Let’s have a look at the centre of Oldham:

[image 2.2: Oldham only]

Oldham, you’re fantastic. There are six separate “X Fried Chicken” shops in Oldham, and four of them – Georgia, Michigan, Montana, and Virginia – are the only ones by that name in the whole country.

For comparison, here’s the London area:

[image 2.3: greater London area only]

This is where all the Tennessees are, as well as the one Texas and Mississippi.

It looks like there’s a lot more variety in the north of England compared to the south, and sure enough, a split emerges:

[image 3: latitude scatterplot]


Chicken shops in the south of England (and that one Tennessee place in Wales) tend to be named after states in the geographical south of the USA, while chicken shops in the north of England are named after any states they like.

This is where my initial Twitter thread ended, and I woke up the next day to a lot of comments like “Y IS THEIR NO MARYLAND THEIR IS MARYLAND CHICKEN IN LEICESTER”. Well, yeah, but it’s not Maryland Fried Chicken, is it?

So I re-ran the data to look at chicken shops with an American state in the name. This is the point at which it’s hard to tell if there’s any data drop out; the FSA data categorises places to inspect as restaurants, takeaways, etc., but not as specifically as chicken shops. All I’ve got to go on is the name, so I’ve taken all shops with an American state and the word “chicken” in the name. This would exclude (sadly fictional) places like “South Dakota Spicy Wings” and “The Organic Vermont Quail Emporium”, but it’d also include a lot of false positives; for example, you’d think that taking all takeaway places with “wings” in the name would be safe, but when I manually checked a few on Google Street View (because I’m dedicated to my research), about half of them are Chinese and refer to the owner’s surname, not the delicacy available.
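
A Python sketch of this looser filter – a state name anywhere in the shop name, plus the word “chicken” (only a handful of states are listed here for brevity; the real check used all 50 minus Kentucky):

```python
import re

# Illustrative subset; in practice, all 50 state names minus Kentucky.
STATES = ["California", "Kansas", "Maryland", "Nevada", "New Jersey", "Tennessee"]

# Word boundaries stop e.g. "Kansasville Kebabs" matching on "Kansas".
STATE_RE = re.compile(r"\b(" + "|".join(STATES) + r")\b", re.IGNORECASE)

def looks_like_state_chicken_shop(name):
    """True if the name contains both a US state and the word 'chicken'."""
    return bool(STATE_RE.search(name)) and "chicken" in name.lower()
```

Requiring the literal word “chicken” is what keeps out both the Chinese takeaways named Wing and the Dallas/Dixy pretenders, at the cost of missing any hypothetical “South Dakota Spicy Wings”.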

This brings in a few more states – Maryland, New Jersey, and Nevada:

[image 4: state chicken map]

Let’s have another look at the UK’s south vs north split. We’ve got a bit of midlands representation now, with the Maryland Chickens in Leicester and Nottingham, the Nevada Chickens in Nottingham and Derby, and a California Chicken & Pizza near Dudley. The latitude naming split between the south/midlands and the north isn’t quite as obvious anymore:

[image 5: latitude scatterplot with no “fried” restriction]

…but there is still a noticeable difference. This graph shows each chicken shop with an American state and the word “chicken” in the name, ordered by latitude going south to north:

[image 6: north vs midlands and south]

In the south and the midlands, the occasional chicken shop goes it alone – the Texas Fried Chicken in Edmonton; the two apparently unrelated Mississippi places in London (Mississippi Chicken & Pizza in Dagenham, Mississippi Fried Chicken in Islington); the Kansas Chicken & Ribs place in Hornsey, almost certainly a different chain from the six Kansas Fried Chicken shops in and around Manchester; and the California Fried Chicken in Luton, probably independent of the California Fried Chickens on the south coast – but most of them are Tennessee or Maryland chains in the same area. In all, the south and midlands have 17 chicken shops named after 8 American states (excluding Kentucky), a State-to-Chicken-Shop ratio of 0.47.

In the north, however, there’s a proliferation of independent chicken shops – 15 shops named after 9 different states (excluding Kentucky), or a State-to-Chicken-Shop ratio of 0.6. There’s the chain of six Kansas Fried Chicken places and two Florida Fried Chicken places in Manchester and Oldham, but the rest are completely separate. Good job, The North.
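
For clarity, the State-to-Chicken-Shop ratio here is simply the number of distinct states divided by the number of shops in a region – in Python terms:

```python
# Shop and state counts from the maps above.
south_and_midlands = {"states": 8, "shops": 17}
north = {"states": 9, "shops": 15}

def state_to_shop_ratio(region):
    """Distinct states per chicken shop: higher means more naming variety."""
    return region["states"] / region["shops"]
```

So 8/17 ≈ 0.47 for the south and midlands, and 9/15 = 0.6 for the north.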

The broader question is: why does the UK do this? There’s obviously the copycat nature of it; chicken shops want to seem plausible, and sounding like a KFC (and looking like one too, since they’re almost always designed in red/white/blue colours) links it in people’s minds. I think there’s more to it, though. Having a really American-sounding word in the name is probably a bit like how Japanese companies scatter English words everywhere to sound international and dynamic (even if they make no sense), or how Americans often perceive British names and accents as fancier and more authoritative (even if to British ears it’s somebody from Birmingham called Jenkins). We’re doing the same, but… for fried chicken.

Finally, since this data is all from the Food Standards Agency’s hygiene ratings, it’d be a shame not to look at the actual hygiene ratings:

[image 7: hygiene ratings]

It looks like the state-named chicken shops in the north are more hygienic. The chains in the south and midlands – Tennessee, Maryland, California, and especially New Jersey – don’t have great hygiene ratings, and the independent shops there do pretty badly too. In contrast, the chicken shops in the north score highly for cleanliness. In fact, a quick linear regression of hygiene rating onto latitude gives me an R² of 0.74 and a p-value of < 0.0001. Speculations as to why this is on a postcard, please.
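
For anyone wanting to check the method rather than the result, the regression itself is just ordinary least squares of rating on latitude. A plain Python sketch on made-up data (the numbers below are illustrative, not the real FSA figures):

```python
# Illustrative only: made-up (shop latitude, hygiene rating 0-5) pairs.
# The real regression used the FSA ratings for the mapped shops.
latitudes = [51.4, 51.5, 51.6, 52.6, 53.0, 53.5, 53.6, 54.0]
ratings = [2, 1, 3, 3, 4, 5, 4, 5]

def linreg(xs, ys):
    """Ordinary least squares with one predictor; returns slope, intercept, R²."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

slope, intercept, r_squared = linreg(latitudes, ratings)
```

A positive slope on the real data is what “further north means cleaner” looks like in numbers.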

Preëmpting your questions/comments:

“I live in […] and my local shop […] isn’t mentioned!”
Maybe you’re talking about a Dallas Chicken place. That’s not a state. Nor is Dixy Chicken; it just sounds a bit American. If it’s definitely a state, then does it have “chicken” in the name? If not, I won’t have picked it up. I also haven’t picked up shops which have, say, “Vermont Fried Chicken” written on the shop sign if they’re registered in the database as “VFC”. Same if the state is misspelled, either by the shop or by the data collectors. If all that checks out, perhaps the shop is so new that it hasn’t had an inspection yet… or perhaps it’s operating illegally and isn’t registered for a hygiene inspection.

“Did you know about Mr. Chicken, the guy who designs the signs?”
I didn’t, but I do now! He’s brilliant.

“How did you do all this?”
I use Alteryx for data scraping/preparation and Tableau for data visualisation.

“I have an idea for something / I want to talk to you about something, can I get in touch?”
Please do! My Twitter handle is @GwilymLockwood, or you can email me on

“Your analysis is amazing, probably the best thing I’ve ever seen with my eyes. Where can I explore more of your stuff?”
Thanks, that’s so kind! There’s a lot of my infographic work on my Tableau Public site here.


The gender gap in school achievement: exploring UK GCSE data

I was reading this article in the Washington Post a couple of days ago. It’s about data from Florida which shows that girls outperform boys at school, and that the gender gap is bigger at worse schools.

It’s well established that girls outperform boys at school, but seeing it visualised and quantified like that was fascinating, and I wanted to reproduce that data for UK schools. We frequently use American statistics to talk about social issues in the UK, which frustrates me; sometimes we’re close enough for it to generalise, but sometimes it doesn’t and it’s like there’s a gigantic metaphorical ocean between the two societies. We know that British girls outperform British boys, but I wanted to see how similar the situation is.

Luckily, the UK government has one of the best records for open data in the world, and so this information is pretty easily found here and here. The main challenge is actually getting through all the data to find the good bits, as so much of it is available, but I found it in the end. So, I shoved all that into R and messed about with some dataframes. Note that I’m not working with private schools here, just state schools… all 2488 of them which have full data for all metrics reported below. Also, all the data is only fully available for England, not the whole of the UK.

The first thing is to decide how to measure achievement. Here, I’m focusing on GCSEs, the standard qualification which most UK teenagers take at 16 and which marks the end of mandatory education. There are two good metrics for measuring GCSE achievement: the percentage of students who get at least five A*-C grades, and the average capped GCSE point score. The first is simple. Students generally take GCSEs in somewhere between seven and ten different subjects, and the percentage of them who score a grade C or above in at least five GCSEs is one of the main metrics that British people obsess over (for people outside the UK, I’m serious, the national newspapers print this figure for all state schools every August when exam results come out). The second is a little more complicated, and it’s explained here. It attributes a certain number of points per exam grade (58 for an A*, 52 for an A, and so on down in sixes), and counts only a student’s top eight GCSEs. So, if you took 11 GCSEs and scored six A*s, four As, and a B, only the six A*s and two of the As would count: 6 × 58 + 2 × 52 = 452. This is then averaged across the school. Literally nobody outside government departments ever uses this, but it’s actually a pretty good measure; the five A*-C rate is a bit blind to quality, as a student who gets four A*s and four Ds harms the school’s statistics while a student who gets five Cs and three Fs helps them, despite the first student clearly doing better overall.
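
That worked example translates directly into code. A short Python sketch of the capped points calculation (the grade-to-points mapping below extends the “58, 52, down in sixes” rule all the way to G; check the official guidance linked above for the exact table):

```python
# Points per GCSE grade: 58 for an A*, then down in sixes.
GRADE_POINTS = {"A*": 58, "A": 52, "B": 46, "C": 40,
                "D": 34, "E": 28, "F": 22, "G": 16}

def capped_points(grades):
    """Sum of a student's best eight GCSE grades, in points."""
    points = sorted((GRADE_POINTS[g] for g in grades), reverse=True)
    return sum(points[:8])
```

On this measure the four-A*s-and-four-Ds student (368 points) does comfortably better than the five-Cs-and-three-Fs student (266 points), which is exactly the quality-over-quantity distinction the five A*-C rate misses.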

The next thing is to decode the wording of the original article: “the gender gap is bigger at worse schools”. There are several ways of talking about what makes a school good or bad, so I’ll focus on three different metrics:

  1. The rating given to each school by the assessment organisation Ofsted. Each school is inspected every couple of years, and gets given an overall grade: outstanding, good, requires improvement, or inadequate. This is a useful, state-sanctioned measure of how good a school is.
  2. The average GCSE achievement data per school. Presumably, better schools get better results. This is a useful measure of how good a school is in terms of what many parents say they care about.
  3. The average wealth of the student body at the school. Let’s face it, when a lot of middle-class British people say “we were lucky enough that our son got into a good local school”, what they actually mean is “we’re so glad there’s no poor people there”. We can measure the average wealth of the student body by looking at the percentage of students who are eligible for free school meals. The higher the percentage, the poorer the student intake.

Firstly, let’s look at the gender gap in GCSE achievement by Ofsted data. This is categorical, so we can have some nice straightforward histograms. Boys are in light blue, girls are in dark pink. Sure, it’s gendered, but it’s an effective and intuitive colour scheme.

[image: histogram of five A*-C rate for each sex per Ofsted rating]

As you’d expect, the outstanding schools get better results than the good schools, and so on and so on. But, it seems that girls outperform boys across the board, regardless of how good the school is (I did an ANOVA on this; the gender gap effect is slightly less for outstanding schools, but it’s a negligible difference. The gender gap at outstanding schools is 7.5 percentage points versus about 8.5-9.5 percentage points for the other three assessments).

[image: histogram of capped GCSE points score for each sex per Ofsted rating]

…and this is mirrored in the capped GCSE points average. Again there’s a tiny bit less of a gender gap in the outstanding schools compared to the rest, but girls do better than boys everywhere.

Right, so much for Ofsted. Let’s look at overall school GCSE achievement. This is continuous, so it’s going in a scatter plot. Plotting every single school’s boys’ and girls’ results was really messy, so this averages across schools at each percentage point on the x-axis (i.e. what you see at 50% is the average boys’ five A*-C rate and the average girls’ five A*-C rate across all schools which got a 50% overall five A*-C rate). Likewise in the second plot with the capped GCSE average points scores, where each score on the x-axis is rounded to a whole number and averaged with others of the same number. Rest assured that the lines of best fit are essentially identical in the larger, messier plots. I did do plots with standard errors, and at first I thought I’d forgotten to include them… then I looked closely, and realised that the standard errors were so small that they were barely distinguishable from the lines themselves.
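
The binning these plots use is just “round the x value, then average the y values that share it” – something like this Python sketch (my analysis was in R; this is only to show the idea):

```python
from collections import defaultdict

def average_by_x(points):
    """points: (x, y) pairs; returns {rounded x: mean of all y at that x}."""
    groups = defaultdict(list)
    for x, y in points:
        groups[round(x)].append(y)  # bin schools by rounded x value
    return {x: sum(ys) / len(ys) for x, ys in sorted(groups.items())}
```

For the five A*-C plots the x value is already a whole percentage point, so the rounding only really matters for the capped points scores.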

[image: scatterplot of GCSE results for each sex across school GCSE results]

[image: scatterplot of capped GCSE results for each sex across rounded school capped GCSE results]

This one tells a clear story, and is very, very similar to Figure 1 in the Washington Post article which shows the standardised maths and reading assessment plot. However, there are two main differences:

  1. If anything, the very worst schools seem to have less of a gender gap, especially in the five A*-C rate plot… although this is probably down to a lack of data at that end (one of the few times I’d say a lack of data is a good thing).
  2. It basically doesn’t matter how good or bad the school is, the difference between boys and girls is consistent across all levels of achievement. The only place where boys and girls are almost equal is right at the top, where there’s a ceiling effect; assuming that each school is 50% boys, 50% girls, there can’t be a big difference between the two if a school is getting 99% five A*-Cs overall.

And now for the free school meals data, or the middle-class poverty aversion question. I’m going to bombard you with graphs here. First, just to show you, here are the messy ones where all rates for all schools are plotted:


…but like I said, it’s messy and hard to focus on; it’s like somebody spilt muisjes (Dutch aniseed sprinkles) all over the screen.

So, here’s the same plots but with all schools averaged together at each data point. This isn’t even at each percentage point, it’s to the nearest 0.1 of a percentage point, because there’s that much data.

[image: scatterplot of GCSE results for each sex across free school meal eligibility rate]

[image: scatterplot of average capped GCSE results for each sex across free school meal eligibility rate]

This also tells a very clear story. The schools with richer students get better results. I also found out the Pope’s religion, and something about bears and woods. But, again, there are the same two main points:

  1. There seems to be less of a gender difference in achievement at worse (well, poorer) schools, but this is probably because there aren’t that many seriously deprived areas in the data. Not to say we don’t have deprivation in the UK – we definitely do, and it’s growing – but there are very few schools where over half the students qualify for free school meals (which probably says more about our ridiculously strict benefits threshold than about the state of poverty).
  2. The performance and achievement gap remains even at the very best (well, richest) schools.

There’s also race data available, but I feel like that’s a topic for another blog at another time. This one is already long enough!

The point is this: while the Washington Post article was fascinating, it doesn’t fully generalise to British society. In the UK, the gender gap for school achievement barely gets bigger at “worse” schools, regardless of how you measure what a bad school is… which is a good thing, I guess? In fact, the gender gap for school achievement seems to be entrenched across education achievement and wealth.

Are girls outperforming boys, or are boys lagging behind? Is it both? I’m not an education specialist, I’m just a guy with RStudio, so I’m reluctant to speculate… but I will anyway.

I think what I’ve ruled out here is any obvious overriding education level or socio-economic effects of the gender achievement gap. It could be that girls are simply more intelligent than boys, although such a simplistic solution seems unlikely. It could be a social peer pressure effect, in that it is more acceptable to be feminine and work hard at school than it is to be masculine and work hard at school (although that wouldn’t explain the reports that this gender difference is present at very, very early ages). It could be that teaching is a female-dominated profession; female teachers may knowingly or unknowingly choose course materials preferred by girls over materials preferred by boys, female teachers may knowingly or unknowingly favour, reward, and encourage problem-solving strategies preferred by girls over strategies preferred by boys, etc. etc., and that this may get entrenched over time. It could be that a culture which encourages and promotes girls’ education, given their denial of access to it until relatively recently, accidentally creates a culture where boys feel undervalued and demotivated. It could be that girls collaborate with each other on homework and exam revision more than boys do, which has been shown to effectively improve learning. It could be that exams favour a stereotypical female attention to detail over a stereotypical male “good enough” approach. It could be that more boys than girls simply don’t give a shit about their handwriting enough to make their answers legible. It could be that girls hit puberty a bit earlier than boys and are therefore out of adolescence a bit earlier than boys, meaning that girls are on average more mature when they take their GCSEs (but again, not if there’s an early years difference too).

It’s probably all of the above, and more, and it’s complicated. And it’s a problem.