Wikidata to SurrealDB

A tool for converting Wikidata dumps into a SurrealDB database, from either a bz2 or JSON file.

The SurrealDB database is ~2.6GB uncompressed (~0.5GB compressed), while the bz2 dump is ~80GB, the gzip dump is ~130GB, and the uncompressed JSON is over 1TB.

Building the database on a 7600K takes ~55 hours using ThreadedSingle; a CPU with more cores should be faster.

Getting The Data

https://www.wikidata.org/wiki/Wikidata:Data_access

From a bz2 file (~80GB)

Dump: Docs

Download - latest-all.json.bz2

From a JSON file

Linked Data Interface: Docs

https://www.wikidata.org/wiki/Special:EntityData/Q60746544.json
https://www.wikidata.org/wiki/Special:EntityData/P527.json
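
For a quick test without the full dump, a single entity can be fetched from the Linked Data Interface and saved as the Entity.json file shown in the layout below. A minimal sketch, assuming the reqwest crate with the blocking feature (not a dependency of this tool):

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Fetch one entity from the Linked Data Interface and save it next to the dump.
    // reqwest (blocking feature) is an assumption; it is not part of Wikidata to SurrealDB.
    let url = "https://www.wikidata.org/wiki/Special:EntityData/Q60746544.json";
    let body = reqwest::blocking::get(url)?.text()?;
    std::fs::write("data/Entity.json", &body)?;
    println!("wrote {} bytes to data/Entity.json", body.len());
    Ok(())
}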

Install

Copy docker-compose.yml

Create a data folder next to docker-compose.yml and .env, place the data inside, and set the data type in .env:

├── data
│   ├── Entity.json
│   ├── latest-all.json.bz2
│   └── surrealdb
├── docker-compose.yml
└── .env

docker compose up --pull always -d

View Progress

docker attach wikidata-to-surrealdb

Example .env

DB_USER=root
DB_PASSWORD=root
WIKIDATA_LANG=en
WIKIDATA_FILE_FORMAT=bz2
WIKIDATA_FILE_NAME=data/latest-all.json.bz2
# If not running Wikidata to SurrealDB via Docker, use 0.0.0.0:8000
WIKIDATA_DB_PORT=surrealdb:8000
# true=overwrite existing data, false=skip if already exists
OVERWRITE_DB=false
CREATE_MODE=ThreadedSingle

The CREATE_MODE env variable must match a variant of the CreateMode enum:

pub enum CreateMode {
    Single,
    ThreadedSingle,
    ThreadedBulk, // Buggy
}
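
The actual parsing lives in src; purely as an illustration, the string could be mapped onto the enum like this (the FromStr impl and helper below are a sketch, not the repository's code):

use std::str::FromStr;

impl FromStr for CreateMode {
    type Err = String;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "Single" => Ok(CreateMode::Single),
            "ThreadedSingle" => Ok(CreateMode::ThreadedSingle),
            "ThreadedBulk" => Ok(CreateMode::ThreadedBulk),
            other => Err(format!("unknown CREATE_MODE: {other}")),
        }
    }
}

// Read CREATE_MODE at startup; any other value aborts with the error above.
fn create_mode_from_env() -> CreateMode {
    std::env::var("CREATE_MODE")
        .expect("CREATE_MODE must be set")
        .parse()
        .expect("CREATE_MODE must be Single, ThreadedSingle, or ThreadedBulk")
}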

Dev Install

How to Query

namespace = wikidata
database = wikidata

See Useful queries.md
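
Queries can also be run from Rust with the surrealdb crate. A minimal sketch, assuming the surrealdb and tokio crates, SurrealDB reachable on localhost:8000, and the credentials from the example .env above (the query itself is only an example):

use surrealdb::engine::remote::ws::Ws;
use surrealdb::opt::auth::Root;
use surrealdb::Surreal;

#[tokio::main]
async fn main() -> surrealdb::Result<()> {
    // Connect to the instance started by docker-compose.yml (port published on localhost).
    let db = Surreal::new::<Ws>("127.0.0.1:8000").await?;
    db.signin(Root { username: "root", password: "root" }).await?;
    db.use_ns("wikidata").use_db("wikidata").await?;

    // Example query: labels of a handful of entities.
    let mut response = db.query("SELECT label FROM Entity LIMIT 5").await?;
    let labels: Vec<String> = response.take((0, "label"))?;
    println!("{labels:?}");
    Ok(())
}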

Table Schema

SurrealDB Thing

pub struct Thing {
    pub table: String,
    pub id: Id, // i64
}

Tables: Entity, Property, Lexeme

pub struct EntityMini {
    pub id: Option<Thing>,
    pub label: String,
    // Claims Table
    pub claims: Thing,
    pub description: String,
}
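
A hedged sketch of reading one of these rows with the Rust SDK, reusing a connection like the one in the query example above and assuming EntityMini derives serde::Deserialize:

async fn first_entity(
    db: &surrealdb::Surreal<surrealdb::engine::remote::ws::Client>,
) -> surrealdb::Result<Option<EntityMini>> {
    let mut response = db.query("SELECT * FROM Entity LIMIT 1").await?;
    // `claims` comes back as a Thing pointing into the Claims table; resolve it with a
    // second query, or inline in SurrealQL (SELECT * FROM Entity LIMIT 1 FETCH claims).
    let entity: Option<EntityMini> = response.take(0)?;
    Ok(entity)
}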

Table: Claims

pub struct Claim {
    pub id: Thing,
    pub value: ClaimData,
}

ClaimData

pub enum ClaimData {
    // Entity, Property, Lexeme Tables
    Thing(Thing), 
    ClaimValueData(ClaimValueData),
}

Docs for ClaimValueData
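
When walking claims, the two cases are usually separated with a match. An illustrative sketch, assuming ClaimValueData implements Debug (not the repository's code):

fn describe(value: &ClaimData) -> String {
    match value {
        // A record link into the Entity, Property, or Lexeme tables.
        ClaimData::Thing(thing) => format!("link into table {}", thing.table),
        // An inline value (string, quantity, time, coordinate, ...).
        ClaimData::ClaimValueData(data) => format!("inline value: {data:?}"),
    }
}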

Similar Projects

License

All code in this repository is dual-licensed under either LICENSE-MIT or LICENSE-Apache, at your option. This means you can select the license you prefer. Why dual license.