wikidata | A PHP client for working with Wikidata API | REST library
kandi X-RAY | wikidata Summary
Wikidata provides an API for searching and retrieving data from wikidata.org.
Top functions reviewed by kandi - BETA
- Searches for entities by property ID.
- Executes a SPARQL query.
- Retrieves an entity.
- Gets entities.
- Extracts data from the head.
- Searches entities.
- Parses the data.
- Gets the array representation.
- Parses the properties.
wikidata Key Features
wikidata Examples and Code Snippets
$entity = $wikidata->get($entityId, $lang);
// Get all data about Steve Jobs
$entity = $wikidata->get('Q19837');
/*
Entity {
  id: "Q19837"
  lang: "en"
  label: "Steve Jobs"
  wiki_url: "https://en.wikipedia.org/wiki/Steve_Jobs"
  ...
}
*/
$results = $wikidata->searchBy($propId, $entityId, $lang, $limit);
// List of people born in the city of Pomona, US
$results = $wikidata->searchBy('P19', 'Q486868');
/*
Collection {
  #items: array:10 [
    0 => SearchResult {
      ...
    }
  ]
}
*/
$results = $wikidata->search($query, $lang, $limit);
$results = $wikidata->search('car', 'fr', 5);
/*
Collection {
  #items: array:5 [
    0 => SearchResult {
      id: "Q1759802"
      lang: "fr"
      label: "autocar"
      ...
    }
  ]
}
*/
Community Discussions
Trending Discussions on wikidata
QUESTION
I would like to retrieve properties of a Wikidata entry (e.g. I want to retrieve the date of birth (P569) of Donald Trump (Q22686)). I tried to use wbgetentities as the action but failed to retrieve more than the description of the Wikidata entry. Is it possible to retrieve the properties with wbgetentities?
...ANSWER
Answered 2021-Jun-15 at 12:57
What you're looking for is props=claims:
https://www.wikidata.org/w/api.php?action=wbgetentities&props=claims&ids=Q66505&format=json
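To illustrate what props=claims returns, here is a small Python sketch (an illustration, not part of the library) that digs the first P569 statement out of a response shaped like the wbgetentities JSON; the sample values and the helper name are hypothetical.

```python
# Minimal sketch: pulling one claim (P569, date of birth) out of a
# wbgetentities response. The nesting below mirrors the JSON that
# action=wbgetentities&props=claims returns; the values are sample
# data, not a live response.
sample_response = {
    "entities": {
        "Q22686": {
            "claims": {
                "P569": [
                    {"mainsnak": {"datavalue": {"value": {"time": "+1946-06-14T00:00:00Z"}}}}
                ]
            }
        }
    }
}

def first_claim_value(response, entity_id, prop_id):
    """Return the datavalue of the first statement for prop_id, or None."""
    claims = response["entities"][entity_id]["claims"]
    statements = claims.get(prop_id, [])
    if not statements:
        return None
    return statements[0]["mainsnak"]["datavalue"]["value"]

birth = first_claim_value(sample_response, "Q22686", "P569")
print(birth["time"])  # +1946-06-14T00:00:00Z
```

In a real script you would fetch the URL above first; the traversal of entities → claims → mainsnak → datavalue is the same either way.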
QUESTION
I have a REST API of classical actors that I want to visualize in Postman. The image URL of an actor is not in the API, so I will need to create a mashup from a combination of the core API and another API.
1. Prerequisites
The core API/endpoint is at http://henke.atwebpages.com/postman/actors/actors.json:
...ANSWER
Answered 2021-Jun-04 at 16:27
The message Set up the visualizer for this request is typical when the call to pm.visualizer.set() has been forgotten. But I did not forget it. So what is wrong?
As already touched upon, the problem is that Postman does not natively support promises. 1
What does that mean? Well, apparently it means that a function such as pm.visualizer.set() cannot be called from within the callback of a Promise. It has to be called from within the callback of pm.sendRequest(). Note that by the construction of the fetch() function the corresponding Promise is actually outside of the pm.sendRequest() callback!
In other words, you need to replace all occurrences of fetch() with pm.sendRequest(). You also need to implement your own version of Promise.all, since it relies upon promises, something you don't have in a native Postman script. Fortunately, such an implementation was posted in an answer the day before yesterday.
After making those changes, here is the code for the Tests section, starting with the initializations: 2
QUESTION
I am new on this side, the question-asking side, so please tell me if you need any additional information.
I have a dataset with 2900 entries, consisting mostly of Dutch and Flemish poets. I want to add information to this dataframe by querying Wikidata: gender, nationality, date of birth, date of death. Now how many poets can two small countries have? Not all of them are to be found on Wikidata (I'm going to take care of that later), and for the ones that are, the info is sometimes very scarce.
I have used the following query:
...ANSWER
Answered 2021-Jun-03 at 07:24
The intuition of using OPTIONAL is correct. You have to add it for every single piece of information that you want to consider optional (i.e. not necessary). Furthermore, to avoid false positives, I think you should also use rdfs:label instead of a generic ?label (which can refer to any property).
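The pattern the answer describes can be sketched as a query template; the property IDs (P21 sex or gender, P27 country of citizenship, P569 date of birth, P570 date of death) are real Wikidata properties, but the query and the helper function are an illustration, not the asker's actual query.

```python
# Sketch of the recommended pattern: rdfs:label to match the poet's
# name, and one OPTIONAL block per piece of information that may be
# missing, so a poet with no recorded death date still appears.
def poet_query(name, lang="nl"):
    return f"""
SELECT ?poet ?gender ?nationality ?birth ?death WHERE {{
  ?poet rdfs:label "{name}"@{lang} .
  OPTIONAL {{ ?poet wdt:P21  ?gender . }}
  OPTIONAL {{ ?poet wdt:P27  ?nationality . }}
  OPTIONAL {{ ?poet wdt:P569 ?birth . }}
  OPTIONAL {{ ?poet wdt:P570 ?death . }}
}}
"""

query = poet_query("Herman Gorter")
print(query)
```

Each OPTIONAL block is independent: a missing value in one leaves the other bindings intact instead of dropping the whole row.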
QUESTION
I'm trying to parse this .txt file in R: https://ftp.expasy.org/databases/cellosaurus/cellosaurus.txt
It's essentially a single column data frame of some ~2 million rows, with each entity being described by multiple rows and bookended by rows containing the string "//".
Ideally, I could capture each entity, made up of multiple rows, as a list element by splitting at "//", but I'm not sure of the most efficient way to go about this.
Any help is much appreciated.
EDIT:
Here's a snippet of what I'm working with:
...ANSWER
Answered 2021-Jun-02 at 11:06
Here is one solution using data.table.
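The answer uses R's data.table; as a language-neutral illustration of the same split-at-"//" idea, a Python sketch might look like this (the sample text is made up, not real Cellosaurus data).

```python
# Records are runs of lines terminated by a line starting with "//".
# Walk the file once, flushing the current record at each terminator.
sample = """ID   CellLine-1
AC   CVCL_0001
//
ID   CellLine-2
AC   CVCL_0002
//
"""

def split_records(text):
    records, current = [], []
    for line in text.splitlines():
        if line.startswith("//"):
            if current:
                records.append(current)
            current = []
        else:
            current.append(line)
    return records

records = split_records(sample)
print(len(records))  # 2
```

For the real ~2-million-line file the same single pass works; reading line by line instead of loading the whole string keeps memory flat.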
QUESTION
I am trying to recreate this list:
https://en.wikipedia.org/wiki/List_of_states_and_territories_of_the_United_States_by_GDP
with a Wikidata SPARQL query.
I can find states by population with this query
Additionally, the fields:
- population (P1082)
- GDP (P2131)
- And some extra ones, like unemployment (P1198)
are covered by the wikiproject economics, though only at the country level.
That said, seeing the "List of states and territories of the United States by GDP" article makes me think at least P2131 may be available at the state level.
I have tried the following query.
...ANSWER
Answered 2021-May-18 at 13:14
Because of a Wikidata internal convention, I had to upload the GDP data in the items about the States' economies, which are linked through property P8744. E.g., for the State of Maine you'll find the data in economy of Maine.
This is the correct query for obtaining what you want (test):
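The linked query itself is not reproduced above; a query along the lines the answer describes might look like the following sketch (Q35657 = U.S. state, P8744 = economy of topic, and P2131 = nominal GDP are real identifiers, but this is an illustration, not the answerer's exact query).

```python
# Hop from each state to its "economy of ..." item via P8744, then
# read the nominal GDP (P2131) stored on that economy item.
query = """
SELECT ?stateLabel ?gdp WHERE {
  ?state wdt:P31 wd:Q35657 .    # instance of U.S. state
  ?state wdt:P8744 ?economy .   # its "economy of ..." item
  ?economy wdt:P2131 ?gdp .     # nominal GDP lives on the economy item
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
ORDER BY DESC(?gdp)
"""
```

The key point matching the answer: the GDP triple hangs off ?economy, not off ?state itself.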
QUESTION
I have the following query:
...ANSWER
Answered 2021-May-07 at 09:46
This query seems to work quite well for me. Edited answer:
QUESTION
I am running an Apache Jena Fuseki server as the SPARQL endpoint, and I can connect to it when using the application normally. Everything works and I get the output from the resulting query.
But when I try to run my test with Spring Boot, JUnit 5 (I assume) and MockMVC, it always gets stuck on the following part:
...ANSWER
Answered 2021-May-16 at 11:27
The answer I found was that the heap size was constantly overflowing. Adding the line:
QUESTION
I create directed graphs like the following from wikidata with the help of networkx and nxv. The result is an svg file which might be embedded in some html page.
Now I want that every node and every edge is "clickable", such that a user can add their comments to specific elements of the graph. I think this could be done with a modal dialog popping up. This dialog should know from which element it was triggered and it should send the content of the textarea to some url via a post request.
What would be the best way to achieve this?
...ANSWER
Answered 2021-Feb-23 at 13:29
As far as I know, nxv generates a g element with class "node" for each node, all nested inside a graph g. So basically you could loop over all the g elements inside the main group and attach a click event listener to each one. (Actually, depending on the desired behavior, you might want to attach the event listener to the shape inside the g, as done below. For the inside of the shape to be clickable, it has to be filled.)
On click, it would update a form to do several things: update its style to show it as a modal (when submitted, the form should go back to hiding), and update a hidden input with the text content of the clicked g.
Basically it would be something like that:
QUESTION
I want to login to Wikidata using their API: https://www.wikidata.org/w/api.php
I had prepared a few requests and tried them against the test instance of Wikidata: https://test.wikidata.org/w/api.php. Everything worked fine, and I changed the call to target the real Wikidata instead. But now the action clientlogin won't work, even though the settings are exactly the same as for the test instance. I have looked for documentation, but none seems to describe any differences between the test and the real instance.
I'm using Postman for making the POST requests. I have the parameters:
...ANSWER
Answered 2021-May-07 at 15:06
While writing this question, I realized that the error was that when removing the subdomain test from the URL, I was supposed to replace it with www for the real Wikidata... 🤦🏻♀️ But now it works and hopefully, someone else can make use of this answer.
QUESTION
I'm looking to recreate this list of cities in Texas by population using wikidata.
I see I can do states by population with this query:
...ANSWER
Answered 2021-May-05 at 15:55
The issue is that Houston and San Antonio's locations are listed as Harris and Bexar county respectively, and the counties are located in Texas. If you try this query it should work:
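A common way to express that fix is a transitive property path on P131 ("located in the administrative territorial entity"), so a city located in a county located in Texas still matches. The sketch below assumes Q1439 = Texas, Q515 = city, and P1082 = population; it is an illustration, not the answerer's exact query.

```python
# wdt:P131+ follows "located in" over one or more hops, so Houston
# (located in Harris County, located in Texas) is no longer skipped.
query = """
SELECT ?cityLabel ?population WHERE {
  ?city wdt:P31/wdt:P279* wd:Q515 .  # instance of (a subclass of) city
  ?city wdt:P131+ wd:Q1439 .         # located, possibly indirectly, in Texas
  ?city wdt:P1082 ?population .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
ORDER BY DESC(?population)
"""
```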
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install wikidata