European Union Emissions Trading System (EU ETS) data from EUTL

core

Files: 2
Size: 47MB
Format: csv, zip
Created/Updated: 2 months ago
License: ODC-PDDL-1.0
Source: EU ETS data
Data about the EU Emissions Trading System (ETS). The EU ETS is one of the main measures introduced by the EU to achieve cost-efficient reductions of greenhouse gas emissions and reach its targets under the Kyoto Protocol and other commitments. The data mainly comes from the EU Transaction Log (EUTL).

Data Files

eu-ets [csv], 5MB (also available as eu-ets [json], 5MB)
eu-emissions-trading-system_zip [zip], 841kB: compressed versions of the dataset, including normalized CSV and JSON data alongside the original data and datapackage.json

eu-ets  

This is a preview version. There might be more data in the original version.

Field information

Field Name                 Order  Type (Format)  Description
country_code               1      string         International country code (ISO 3166-1 alpha-2 code elements)
country                    2      string         Country name
main activity sector name  3      string         Main activity label
ETS information            4      string         ETS information
year                       5      string         Annual data, mainly in YYYY format, but may also include strings, e.g. "Total 1st trading period (05-07)"
value                      6      number         Measure value
unit                       7      string         Unit of the measure value (tonnes of CO2-equivalent)
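Because the year field mixes annual YYYY values with period totals, it needs filtering before any aggregation. A minimal pandas sketch of working with the fields above (the rows here are illustrative, not real EUTL figures):

```python
import pandas as pd

# Toy rows following the schema above; values are made up for illustration.
df = pd.DataFrame({
    "country_code": ["AT", "AT", "AT"],
    "country": ["Austria", "Austria", "Austria"],
    "year": ["2005", "2006", "Total 1st trading period (05-07)"],
    "value": [100.0, 110.0, 330.0],
    "unit": ["tonne of CO2-equ."] * 3,
})

# Keep only the true annual rows (four-digit years), dropping period totals,
# so that summing does not double-count.
annual = df[df["year"].str.fullmatch(r"\d{4}")].copy()
annual["year"] = annual["year"].astype(int)

# Values are in tonnes of CO2-equivalent; e.g. total per country.
per_country = annual.groupby("country")["value"].sum()
```

Without the filter, the period-total row would be added on top of the annual rows it already summarizes.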

eu-emissions-trading-system_zip  

This is a preview version. There might be more data in the original version.

Read me

Data about the EU emission trading system (ETS). The EU emission trading system (ETS) is one of the main measures introduced by the EU to achieve cost-efficient reductions of greenhouse gas emissions and reach its targets under the Kyoto Protocol and other commitments. The data mainly comes from the EU Transaction Log (EUTL).

Data

Aggregated data on greenhouse gas emissions and allowances.

Geographic coverage

Austria, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Liechtenstein, Lithuania, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, Sweden, United Kingdom

Temporal coverage

2005-2014

Sources

Data Preparation

Requirements

Python 2, together with the standard-library modules urllib and zipfile, is required to process the data.

Processing

Run the following script from this directory to download and process the data:

make

Resources

The raw data are output to ./tmp. The processed data are output to ./data.
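The Makefile drives the pipeline end to end. As a rough Python 3 illustration of the download-and-extract step it performs, using only urllib and zipfile (the function name and the URL argument are assumptions; the actual source URL is defined in the Makefile and not reproduced here):

```python
import io
import urllib.request
import zipfile
from pathlib import Path

def download_and_extract(url, raw_dir="tmp", out_dir="data"):
    """Fetch a zip archive and unpack it.

    Mirrors the layout described above: the raw download is kept
    under ./tmp and the extracted files land in ./data.
    """
    Path(raw_dir).mkdir(parents=True, exist_ok=True)
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    payload = urllib.request.urlopen(url).read()
    # keep the raw archive, mirroring the ./tmp convention
    (Path(raw_dir) / "source.zip").write_bytes(payload)
    with zipfile.ZipFile(io.BytesIO(payload)) as zf:
        zf.extractall(out_dir)
```

A sketch only; the real script also normalizes the extracted CSVs before writing them to ./data.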

License

Data

Data are sourced from the European Environment Agency and no copyright restrictions are applied. More specifically:

EEA aspires to promote the sharing of environmental data. In agreeing to share, data providers need to have assurance that their data are properly handled, disseminated and acknowledged following similar principles and rules across countries and stakeholders.

Additional work

All the additional work done to build this Data Package is made available under the Public Domain Dedication and License v1.0 whose full text can be found at: http://www.opendatacommons.org/licenses/pddl/1.0/

Citations

  1. EEA standard re-use policy: unless otherwise indicated, re-use of content on the EEA website for commercial or non-commercial purposes is permitted free of charge, provided that the source is acknowledged (http://www.eea.europa.eu/legal/copyright). Copyright holder: Directorate-General for Climate Action (DG-CLIMA).

Import into your tool

To use this Data Package in R, follow the instructions below:

install.packages("devtools")
library(devtools)
install_github("hadley/readr")
install_github("ropenscilabs/jsonvalidate")
install_github("ropenscilabs/datapkg")

#Load client
library(datapkg)

#Get Data Package
datapackage <- datapkg_read("https://pkgstore.datahub.io/core/eu-emissions-trading-system/latest")

#Package info
print(datapackage)

#Open actual data in RStudio Viewer
View(datapackage$data$"eu-ets")
View(datapackage$data$"eu-emissions-trading-system_zip")

The Python instructions below were tested with Python 3.5.2.

To generate Pandas data frames from JSON Table Schema descriptors, install the jsontableschema-pandas plugin. To load resources from a data package as Pandas data frames, use the datapackage.push_datapackage function; the resulting storage object works as a container for the Pandas data frames.

In order to work with Data Packages in Pandas you need to install our packages:

$ pip install datapackage
$ pip install jsontableschema-pandas

To get the Data Package, run the following code:

import datapackage

data_url = "https://pkgstore.datahub.io/core/eu-emissions-trading-system/latest/datapackage.json"

# to load Data Package into storage
storage = datapackage.push_datapackage(data_url, 'pandas')

# to see datasets in this package
storage.buckets

# you can access datasets inside storage, e.g. the first one:
storage[storage.buckets[0]]

In order to work with Data Packages in Python you need to install our packages:

$ pip install datapackage

To get the Data Package into your Python environment, run the following code:

import datapackage

dp = datapackage.DataPackage('https://pkgstore.datahub.io/core/eu-emissions-trading-system/latest/datapackage.json')

# see metadata
print(dp.descriptor)

# get list of csv resource names
csvList = [resource.descriptor['name'] for resource in dp.resources]
print(csvList) # ["resource name", ...]

# access a csv resource by index (starting at 0)
print(dp.resources[0].data)

To use this dataset in JavaScript, follow the instructions below:

Install data.js module using npm:

  $ npm install data.js

Once the package is installed, use code snippet below:

  const {Dataset} = require('data.js')

  const path = 'https://pkgstore.datahub.io/core/eu-emissions-trading-system/latest/datapackage.json'

  // Dataset.load returns a Promise
  Dataset.load(path).then(dataset => {
    // get a data file in this dataset
    const file = dataset.resources[0]
    const data = file.stream()
  })

In order to work with Data Packages in SQL you need to install our packages:

$ pip install datapackage
$ pip install jsontableschema-sql
$ pip install sqlalchemy

To import the Data Package into an SQLite database, run the following code:

import datapackage
from sqlalchemy import create_engine

data_url = 'https://pkgstore.datahub.io/core/eu-emissions-trading-system/latest/datapackage.json'
engine = create_engine('sqlite:///:memory:')

# to load Data Package into storage
storage = datapackage.push_datapackage(data_url, 'sql', engine=engine)

# to see datasets in this package
storage.buckets

# to execute sql command (assuming data is in "data" folder, name of resource is data and file name is data.csv)
storage._Storage__connection.execute('select * from data__data___data limit 1;').fetchall()

# description of the table columns
storage.describe('data__data___data')