FastRun for new items #522
I'm using WBI to build out a couple of WB instances on wikibase.cloud. I love the simplicity and elegance of the WBI design and approach over pywikibot, but I'm struggling a bit with pushing large numbers of items into an instance. I've worked up some parallel methods using Dask, but those run into issues with too many logins coming from different machines, depending on how I throttle things. I've also looked into RaiseWikibase, which takes a whole different route for building items that I haven't worked out yet against a wikibase.cloud instance, but it does look like I could use the existing WBI process for building item structures and then send their JSON to RaiseWikibase.

All the examples I've seen using the fastrun functionality in WBI (e.g., https://github.com/LeMyst/wd-population/blob/master/departements_fastrun.py) operate on existing items, where the base_filter makes sense. Does someone have an example of writing new items with wbi_fastrun? I will always have a combination of an "instance of" classification and likely some other standard property across a batch of new items, but I'm not clear on how this would work. Alternatively, any examples using a combination of WBI and RaiseWikibase would also be appreciated. Thank you.
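For reference, here's a stripped-down sketch of how I'm building item JSON with WBI right now (the endpoint, property/item IDs, and values are placeholders for my instance); my thinking was that the dict returned by get_json() is what I'd hand off to RaiseWikibase or some other bulk loader instead of calling item.write():

```python
from wikibaseintegrator import WikibaseIntegrator
from wikibaseintegrator.datatypes import ExternalID, Item
from wikibaseintegrator.wbi_config import config as wbi_config

# Placeholder endpoint for my wikibase.cloud instance
wbi_config['MEDIAWIKI_API_URL'] = 'https://example.wikibase.cloud/w/api.php'
wbi_config['USER_AGENT'] = 'MyImportBot/0.1 (user@example.org)'

wbi = WikibaseIntegrator()

item = wbi.item.new()
item.labels.set('en', 'A new record')
item.descriptions.set('en', 'One of many similar items in this batch')
item.claims.add([
    Item(prop_nr='P31', value='Q42'),            # "instance of" classification (placeholder IDs)
    ExternalID(prop_nr='P7', value='ABC-0001'),  # shared identifier property (placeholder)
])

# Entity JSON that could be handed to a bulk loader instead of item.write()
entity_json = item.get_json()
```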
Hello @skybristol,
Sorry for the late answer.
wbi_fastrun is made to quickly check whether an item carrying certain properties already exists in the Wikibase instance. It uses the SPARQL endpoint for its queries.
Where wbi_fastrun can help you is in deciding whether you need to create, update, or maybe delete an item in the instance.
But for the actual create/update/delete, you still go through the Wikibase API, with the limitations that come with it.
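A minimal sketch of what that could look like for a batch of new items (the endpoint URLs, property/item IDs, credentials, and record fields are placeholders, so adjust them to your instance):

```python
from wikibaseintegrator import WikibaseIntegrator, wbi_fastrun, wbi_login
from wikibaseintegrator.datatypes import BaseDataType, ExternalID, Item
from wikibaseintegrator.wbi_config import config as wbi_config

# Placeholder endpoints for a wikibase.cloud instance
wbi_config['MEDIAWIKI_API_URL'] = 'https://example.wikibase.cloud/w/api.php'
wbi_config['SPARQL_ENDPOINT_URL'] = 'https://example.wikibase.cloud/query/sparql'
wbi_config['WIKIBASE_URL'] = 'https://example.wikibase.cloud'
wbi_config['USER_AGENT'] = 'MyImportBot/0.1 (user@example.org)'

# Every item in the batch shares "instance of" (P31 -> Q42, placeholders)
# and an external identifier property (P7, placeholder)
base_filter = [
    Item(prop_nr='P31', value='Q42'),
    ExternalID(prop_nr='P7'),
]
frc = wbi_fastrun.FastRunContainer(base_filter=base_filter, base_data_type=BaseDataType)

login = wbi_login.Login(user='BotUser@import', password='bot_password')
wbi = WikibaseIntegrator(login=login)

records = [
    {'identifier': 'ABC-0001', 'label': 'First new thing'},
    {'identifier': 'ABC-0002', 'label': 'Second new thing'},
]

for record in records:
    claims = [
        Item(prop_nr='P31', value='Q42'),
        ExternalID(prop_nr='P7', value=record['identifier']),
    ]
    # write_required() checks, via the SPARQL endpoint, whether an item already
    # matching these claims exists; True means a create (or update) is needed.
    if frc.write_required(data=claims):
        item = wbi.item.new()
        item.labels.set('en', record['label'])
        item.claims.add(claims)
        item.write(summary='Batch import of new items')
```

As far as I know, the speed-up comes from the FastRunContainer caching the data behind the base_filter, so you avoid one API read per record; the writes themselves still go through the normal API and its rate limits.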
Maybe you can use a maintenance script to quickly import the data into your Wikibase/MediaWiki instance, and then export everything to the SPARQL server afterwards.