
How well do you know your airport codes?

By: Virgin Atlantic

November 22, 2019

LAX sign at Los Angeles International Airport © Wikimedia Commons


Join us in our quest for ultimate geek status and discover everything there is to know about airport codes

Think you know your DAD from your LAD or your BOB from your SOB, or even your ARS from your LBW? These three-letter conundrums can be something of a mystery, so if you’re cramming for an upcoming Christmas pub quiz, our guide to the world of airport codes will stand you in good stead.

The practice of geocoding originated in the 1930s as a way of streamlining location identification for pilots. Up until the 1940s, airports had mostly been assigning themselves a two-letter code, but with the rapid expansion of aviation around the globe, a new, standardised process became necessary to avoid confusion and potential duplication.

Today the codes are assigned and governed by the International Air Transport Association (IATA), which publishes a complete list twice a year from its headquarters in Montreal. The three-letter geocodes are chosen first and foremost to be unique. Where possible they’re based on the airport name or the destination itself, or another pertinent detail if the most obvious code cannot be used for a particular reason. This has led to some baffling letter combinations over the years, but behind every seemingly random choice there’s usually a logical explanation.

Below, we look at the background of some of the codes, including a handful of destinations on our own route map. But before we begin, let’s start with a quick quiz. Think your knowledge of obscure airport codes is on a par with the experts here at Virgin Atlantic? Here’s your chance to shine.*

Welcome to SFO © Håkan Dahlström / Flickr Creative Commons

Quickfire quiz

  1. SUX
  2. GPS
  3. GLO
  4. CAN
  5. YUM
  6. ORK
  7. NAN
  8. MAO
  9. LBJ
  10. GUM

*answers at the end of this post.

HKG, or Hong Kong International Airport, at sunset © Eddie Yip / Flickr Creative Commons

Airport code oddities

~ Frequent flyers will be familiar with the most well known (and self-explanatory) codes on our route map – JFK, LAX, BOS, SEA, MIA for starters – but what about some of the less obvious ones? When you’re flying to Washington DC, you might reasonably ask why your baggage tag says IAD. That’s because the code was originally DIA for Dulles International Airport, but when written by hand it would often get mistaken for DCA – the code for the city’s other major airport, Reagan National. To avoid confusion, the letters in DIA were switched to IAD. Orlando’s airport code, MCO, is also perplexing – until you learn the airport was previously a military base named after a colonel called Michael Norman Wright McCoy. And EWR airport in New Jersey uses three letters from its destination city of Newark, but not the first letter, because domestic US codes beginning with ‘N’ are reserved by the United States Navy – see also MSY (New Orleans) and BNA (Nashville).

~ Have you ever noticed how nearly all Canadian airport codes begin with the letter Y? There’s YVR for Vancouver, YYC for Calgary, YOW for Ottawa and so on. But why Y? It’s complicated. Back in the day, Canada identified its weather/radio beacons with two-letter codes, and when IATA began requiring three-letter codes, the letter Y was added to the beginning if the radio beacon was located on airport grounds (the Y stood for ‘yes’) and W if it was not (the W stood for ‘without’). Today almost all airport codes in Canada start with a Y, with a handful of minor exceptions including tiny Bearskin Lake Airport (XBE) in Ontario and Shamattawa Airport (ZTM) in Manitoba.

~ Just to muddy the waters, airports also have a four-letter International Civil Aviation Organization (ICAO) code, which is mainly used by air traffic control and internally within airlines for flight-planning purposes. ICAO codes are entirely separate from IATA codes, and not generally seen or used by the public. They’re structured in a different way – usually regionally, offering geographical context – with the first letter representing a region of the world, the second a country, and the remaining two used to identify specific airports. Take London Heathrow for example, commonly abbreviated to its IATA code of LHR. Its ICAO code is EGLL, which breaks down as E for Northern Europe, G for the United Kingdom, and LL as an identifier for Heathrow Airport.

So unlike IATA codes, if you know the regional designations of ICAO codes you can work out the country in which a given airport is located. You’d know that EGCC and EGHH were in the United Kingdom, for example, but you might not know they related to Manchester and Bournemouth airports respectively. But don’t spend too much time swotting up on these, as there are numerous exceptions to this rule!
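For the code-minded, that prefix logic can be sketched in a few lines of Python. This is purely illustrative – the prefix table below is a tiny, hand-picked subset (the real ICAO allocation runs to hundreds of entries, some one letter long, some two, with plenty of exceptions):

```python
# A minimal sketch of guessing a country from an ICAO code's prefix.
# The table here is an illustrative subset, not the full allocation.
ICAO_COUNTRY_PREFIXES = {
    "EG": "United Kingdom",            # E = Northern Europe, G = UK
    "EI": "Ireland",
    "LF": "France",
    "K": "United States (contiguous)", # one-letter prefix
    "C": "Canada",                     # one-letter prefix
}

def icao_country(code: str) -> str:
    """Return the country suggested by an ICAO code's prefix."""
    code = code.upper()
    # Try the two-letter prefix first, then fall back to one letter.
    for length in (2, 1):
        country = ICAO_COUNTRY_PREFIXES.get(code[:length])
        if country:
            return country
    return "unknown region"

print(icao_country("EGLL"))  # London Heathrow -> United Kingdom
print(icao_country("EGCC"))  # Manchester -> United Kingdom
print(icao_country("KJFK"))  # New York JFK -> United States (contiguous)
```

Note the fallback order: two-letter prefixes are checked before one-letter ones, since K and C claim a whole first letter while most European countries are identified by two.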

Talking of swotting, how did we get on in the quiz above? For anyone who hasn’t already peeked, the answers are below:

1) SUX is Sioux Gateway Airport in Sioux City, Iowa. Despite spending many years bemoaning the ‘embarrassing’ code, locals have now taken it to their hearts with “Fly SUX” merchandise available in gift shops across the city.

2) GPS has nothing to do with satellite radio-navigation systems in this instance – it’s the airport code for Seymour Airport on the island of Baltra, one of the Galápagos Islands of Ecuador.

3) The code GLO might evoke memories of dodgy raves and neon glo-sticks (are we showing our age?) but this sedate little airport couldn’t be further from all that. It’s Gloucestershire Airport near Cheltenham in the UK, which mostly caters to private jets and helicopters.

4) CAN stands for Guangzhou in China, and is a classic example of a city that’s retained its previous name (Canton) in its airport code. See also: BOM (Mumbai, formerly Bombay), PEK (Beijing, formerly Peking), RGN (Yangon, formerly Rangoon), SGN (Ho Chi Minh City, formerly Saigon) and LED (St. Petersburg, formerly Leningrad).

5) YUM busts the myth that only Canadian airport codes start with a Y. In this case, it stands for Yuma International Airport in Arizona, a joint military-civilian use airport which also houses a medical repatriation service.

6) ORK: If your first guess was the island of Orkney, you need to sharpen your game (that would be KOI for Kirkwall Airport). This one’s fairly obvious if you think about it: It’s Cork Airport in the Republic of Ireland.

7) NAN is an interesting one. It’s the code for Nadi International Airport in Fiji, and is based on the phonetic pronunciation of Nadi (“Nandi”) rather than the spelling.

8) MAO: Two clues – it’s not an airport named after Chairman Mao, and neither is it the eponymous code of Mao Airport in Chad. MAO stands for Manaus in Brazil, and gets its letters from the alternative spelling of Manáos.

9) LBJ has nothing whatsoever to do with the 36th president of the United States, and everything to do with Komodo Airport on the Indonesian island of Flores, which is located near the town of Labuan Bajo.

10) GUM is the code for Antonio B. Won Pat International Airport in Tamuning and Barrigada. None the wiser? The airport serves the United States territory of Guam in Micronesia. We’re pretty sure you all got that one.

How did you do? Count us impressed if you got more than eight out of 10!

Now you’re an expert, you can book a flight to one of our exciting global destinations simply by its code. We fly direct to LAX, SFO, SEA, LAS, ATL, MIA, MCO, IAD, EWR, JFK and BOS in the United States; BGI, TAB, HAV, MBJ, ANU, GND and UVF in the Caribbean; BOM, DEL, PVG and HKG in Asia; LOS and JNB in Africa; TLV in Israel; and GRU in Brazil (flights begin on 29 March 2020). Check out our best fares and last-minute deals today and snap up a bargain flight.

Categories: Our Places