This is an old revision of this dataset, as edited at Aug 25, 2010, 14:32. It may differ significantly from the current revision.



From the front page:

"DBpedia is a community effort to extract structured information from Wikipedia and to make this information available on the Web. DBpedia allows you to ask sophisticated queries against Wikipedia and to link other datasets on the Web to Wikipedia data."


Data exposed: a data set of structured information extracted from Wikipedia, covering about 3.4 million concepts described by 1 billion triples, including abstracts in 92 different languages.

Size of dump and data set: 1 billion triples
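The "sophisticated queries" mentioned above are SPARQL queries, which DBpedia serves through its public endpoint at http://dbpedia.org/sparql. A minimal sketch of building such a request, using only the Python standard library (the query text, the `dbo:` prefix, and the Berlin resource are illustrative assumptions, not part of this dataset description):

```python
from urllib.parse import urlencode

# Public DBpedia SPARQL endpoint (assumed; see dbpedia.org for details).
ENDPOINT = "http://dbpedia.org/sparql"

# Illustrative query: fetch the English abstract of the Berlin resource.
# The dbo: prefix and property name are assumptions about the ontology.
QUERY = """
PREFIX dbo: <http://dbpedia.org/ontology/>
SELECT ?abstract WHERE {
  <http://dbpedia.org/resource/Berlin> dbo:abstract ?abstract .
  FILTER (lang(?abstract) = "en")
}
"""

def sparql_url(query: str, fmt: str = "application/sparql-results+json") -> str:
    """Build a GET URL that submits `query` to the endpoint,
    asking for results in the given MIME format."""
    return ENDPOINT + "?" + urlencode({"query": query, "format": fmt})

url = sparql_url(QUERY)
```

The resulting URL can be fetched with any HTTP client; the endpoint returns the bindings in the requested format (JSON here).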

Data and Resources

Additional Info

Field Value
Author DBpedia Team
Version 3.5.1
links:2000-us-census-rdf 12529
links:dbtune-musicbrainz 22981
links:flickr-wrappr 2298849
links:freebase 6943755
links:fu-berlin-dailymed 43
links:fu-berlin-dblp 196
links:fu-berlin-diseasome 1943
links:fu-berlin-drugbank 729
links:fu-berlin-gutenberg 2510
links:fu-berlin-sider 751
links:geonames 86547
links:linkedgeodata 53024
links:nytimes-linked-open-data 10359
links:opencyc 20362
links:rdf-book-mashup 9078
links:revyu 6
links:tcmgenedit_dataset 904
links:wikicompany 8348
links:wordnet 467101
links:world-factbook-fu-berlin 233
links:yago 6535093
triples 1000000000