ISO Language Codes (639-1 and 639-2) and IETF Language Types

Files  Size   Format   Created/Updated  License                               Source
5      507kB  csv zip  2 months ago     Public Domain Dedication and License  Library of Congress, Unicode

Data Files

language-codes  


Field information

Field Name  Order  Type (Format)  Description
alpha2      1      string         2 letter alpha-2 code
English     2      string         English name of language

language-codes-3b2  


Field information

Field Name  Order  Type (Format)  Description
alpha3-b    1      string         3 letter alpha-3 bibliographic code
alpha2      2      string         2 letter alpha-2 code
English     3      string         English name of language

language-codes-full  


Field information

Field Name  Order  Type (Format)  Description
alpha3-b    1      string         3 letter alpha-3 bibliographic code
alpha3-t    2      string         3 letter alpha-3 terminologic code (when given)
alpha2      3      string         2 letter alpha-2 code (when given)
English     4      string         English name of language
French      5      string         French name of language

ietf-language-tags  


Field information

Field Name  Order  Type (Format)  Description
lang        1      string         IANA/Unicode language tag extension
langType    2      string         ISO 639-1 alpha-2 language code
territory   3      string         ISO 3166-1 alpha-2 country code
revGenDate  4      string         revision date (ISO date format)
defs        5      integer        number of definitions
dftLang     6      boolean        indicates the default language, per Unicode CLDR
file        7      string         file name of the locale descriptor

language-codes_zip  


Read me

Comprehensive language code information, consisting of ISO 639-1, ISO 639-2 and IETF language types.

Data

Data is taken from the Library of Congress as the ISO 639-2 Registration Authority, and from the Unicode Common Locale Data Repository.

data/language-codes.csv

This file contains the 184 languages with ISO 639-1 (alpha 2 / two letter) codes and their English names.
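As a quick illustration of the file's two-column layout, a code-to-name lookup can be sketched in Python. The sample rows below are illustrative stand-ins for the real file (which carries all 184 languages); the column names come from the field table above.

```python
import csv
import io

# A few sample rows in the shape of data/language-codes.csv
# (columns: alpha2, English), per the field table above.
SAMPLE = """alpha2,English
de,German
en,English
fr,French
"""

def load_names(csv_text):
    """Map each ISO 639-1 alpha-2 code to its English language name."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["alpha2"]: row["English"] for row in reader}

names = load_names(SAMPLE)
print(names["de"])  # German
```

In practice you would pass the contents of the downloaded data/language-codes.csv instead of the inline sample.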

data/language-codes-3b2.csv

This file contains the 184 languages with both ISO 639-2 (alpha 3 / three letter) bibliographic codes and ISO 639-1 codes, and their English names.

data/language-codes-full.csv

This file is more exhaustive.

It contains all languages with ISO 639-2 (alpha 3 / three letter) codes, the respective ISO 639-1 codes (if present), as well as the English and French name of each language.

There are two versions of the three letter codes: bibliographic and terminologic. Each language has a bibliographic code but only a few languages have terminologic codes. Terminologic codes are chosen to be similar to the corresponding ISO 639-1 two letter codes.

Example from Wikipedia:

[…] the German language (Part 1: de) has two codes in Part 2: ger (B code) and deu (T code), whereas there is only one code in Part 2, eng, for the English language.

There are four special codes: mul, und, mis, zxx; and a reserved range qaa-qtz.
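The bibliographic/terminologic distinction can be sketched in Python. The sample rows below mirror the German/English example above; the fallback to the B code reflects that most languages have no separate T code.

```python
import csv
import io

# Sample rows shaped like data/language-codes-full.csv
# (columns: alpha3-b, alpha3-t, alpha2, English, French), per the field table above.
SAMPLE = """alpha3-b,alpha3-t,alpha2,English,French
ger,deu,de,German,allemand
eng,,en,English,anglais
"""

def terminologic(csv_text):
    """Map each bibliographic (B) code to its terminologic (T) code,
    falling back to the B code when no separate T code exists."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["alpha3-b"]: row["alpha3-t"] or row["alpha3-b"] for row in reader}

codes = terminologic(SAMPLE)
print(codes["ger"])  # deu  (German has distinct B and T codes)
print(codes["eng"])  # eng  (English has only one code)
```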

data/ietf-language-tags.csv

This file lists all IETF language tags of the official resource http://www.iana.org/assignments/language-tag-extensions-registry, combined with the locale data found in the /main folder of http://www.unicode.org/Public/cldr/latest/core.zip (project cldr.unicode.org).
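A tag such as de-DE combines the lang and territory fields this file records. A minimal sketch of splitting such a simple tag follows; note that real BCP 47 tags can also carry scripts and variants (e.g. zh-Hant-TW), which this deliberately ignores.

```python
def split_tag(tag):
    """Split a simple IETF tag of the form lang[-TERRITORY] into its
    language subtag and (if present) its two-letter territory subtag.
    This is NOT a full BCP 47 parser; scripts and variants are skipped."""
    parts = tag.split("-")
    lang = parts[0]
    # the territory subtag is the first two-letter uppercase piece, if any
    territory = next((p for p in parts[1:] if len(p) == 2 and p.isupper()), None)
    return lang, territory

print(split_tag("de-DE"))  # ('de', 'DE')
print(split_tag("en"))     # ('en', None)
```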

Preparation

This package includes a bash script to fetch the current language code information and adjust the formatting. The file ietf-language-tags.csv is generated by ietf-lanGen.php.

License

This material is licensed by its maintainers under the Public Domain Dedication and License (PDDL).

Nevertheless, it should be noted that this material is ultimately sourced from the Library of Congress as a Registration Authority for ISO and their licensing policies are somewhat unclear. As this is a short, simple database of facts, there is a strong argument that no rights can subsist in this collection.

However, if you intend to use this data in a public or commercial product, please check the original sources for any specific restrictions.

Import into your tool

To use this Data Package in R, follow the instructions below:

install.packages("devtools")
library(devtools)
install_github("hadley/readr")
install_github("ropenscilabs/jsonvalidate")
install_github("ropenscilabs/datapkg")

#Load client
library(datapkg)

#Get Data Package
datapackage <- datapkg_read("https://pkgstore.datahub.io/core/language-codes/latest")

#Package info
print(datapackage)

#Open actual data in RStudio Viewer
View(datapackage$data$"language-codes")
View(datapackage$data$"language-codes-3b2")
View(datapackage$data$"language-codes-full")
View(datapackage$data$"ietf-language-tags")
View(datapackage$data$"language-codes_zip")

Tested with Python 3.5.2

To generate Pandas data frames from JSON Table Schema descriptors, install the jsontableschema-pandas plugin. To load resources from a data package as Pandas data frames, use the datapackage.push_datapackage function; the resulting storage object acts as a container for the data frames.

To work with Data Packages in Pandas, first install the required packages:

$ pip install datapackage
$ pip install jsontableschema-pandas

To get the Data Package, run the following code:

import datapackage

data_url = "https://pkgstore.datahub.io/core/language-codes/latest/datapackage.json"

# to load Data Package into storage
storage = datapackage.push_datapackage(data_url, 'pandas')

# to see datasets in this package
storage.buckets

# you can access datasets inside storage, e.g. the first one:
storage[storage.buckets[0]]

To work with Data Packages in Python, install the datapackage package:

$ pip install datapackage

To load the Data Package into your Python environment, run the following code:

import datapackage

dp = datapackage.DataPackage('https://pkgstore.datahub.io/core/language-codes/latest/datapackage.json')

# see metadata
print(dp.descriptor)

# get the list of csv resource names
csv_list = [resource.descriptor['name'] for resource in dp.resources]
print(csv_list) # ["resource name", ...]

# access a csv file by index (starting at 0)
print(dp.resources[0].data)

To use this dataset in JavaScript, follow the instructions below:

Install the data.js module using npm:

  $ npm install data.js

Once the package is installed, use the code snippet below:

  const {Dataset} = require('data.js')

  const path = 'https://pkgstore.datahub.io/core/language-codes/latest/datapackage.json'

  // Dataset.load and file.stream return Promises, so await them in an async context
  ;(async () => {
    const dataset = await Dataset.load(path)

    // get a data file in this dataset
    const file = dataset.resources[0]
    const data = await file.stream()
  })()

To work with Data Packages in SQL, install the following packages:

$ pip install datapackage
$ pip install jsontableschema-sql
$ pip install sqlalchemy

To import the Data Package into your SQLite database, run the following code:

import datapackage
from sqlalchemy import create_engine

data_url = 'https://pkgstore.datahub.io/core/language-codes/latest/datapackage.json'
engine = create_engine('sqlite:///:memory:')

# to load Data Package into storage
storage = datapackage.push_datapackage(data_url, 'sql', engine=engine)

# to see datasets in this package
storage.buckets

# to execute a sql command (assuming the resource lives in the "data" folder, is named "data", and its file is data.csv)
storage._Storage__connection.execute('select * from data__data___data limit 1;').fetchall()

# description of the table columns
storage.describe('data__data___data')