Error Accessing BiblioEntry data API

Running the quick start Python script, I am not able to fetch BiblioEntry data. At line 29 I am getting an error from the API. Does anybody know how to filter by journal type?

Exception Traceback (most recent call last)
      5 {
      6     "spec" : {
----> 7         "filter" : "hasFullText == true"
      8     }
      9 }

~/ in fetch(typename, body, get_all, remove_meta)
     53 else:
---> 54     response_json = read_data_json(typename, 'fetch', body)
     55 df = pd.json_normalize(response_json['objs'])

~/ in read_data_json(typename, api, body)
     22 # if request failed, show exception
     23 if response.status_code != 200:
---> 24     raise Exception(response.json()["message"])
     26 return response.json()

Exception: wrapped RuntimeException: Field 'hasFullText' does not exist in expression: 'meta.tenantTagId == 4 && (hasFullText == true)' for type 'BiblioEntry'.

Getting the same error. It seems that hasFullText is a nullable field and nullable fields in general cause this kind of error.

But then I’ve also tried looping through all BiblioEntry objects via fetch, and none of the results came back with a hasFullText column. Maybe something has changed since the document was last updated?
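For anyone checking the same thing, here is a minimal sketch of how to look for the column in the fetch results. The objects below are placeholders, not real API output; response_json stands in for the value returned by the quick-start's read_data_json helper.

```python
# Sketch: check whether 'hasFullText' appears in any fetched BiblioEntry object.
# response_json stands in for what read_data_json(typename, 'fetch', body)
# returns; the objects here are made-up placeholders.
response_json = {"objs": [{"paperId": "001wbz6e", "title": "..."}]}

# Collect every key that appears across the returned objects.
columns = set()
for obj in response_json["objs"]:
    columns.update(obj.keys())

print("hasFullText" in columns)  # False for these placeholder objects
```

The quick-start script flattens the same list with pd.json_normalize, so a missing key here also means a missing column in the resulting DataFrame.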

@mvgarcia and @joy13975, thanks for bringing this to our attention. This bug has been resolved and the hasFullText column is functioning as expected. Please note that only a subset of articles from the CORD-19 dataset have full text currently available.


Thanks for the confirmation.

With further testing I’m seeing that some papers marked with hasFullText == true actually do not have full text available from the biblioentry/getarticlemetadata endpoint.

For instance, BiblioEntry paper id "001wbz6e" should have full text, but getarticlemetadata returns nothing.

Hi @joy13975, thanks for noting this. It looks like some articles were incorrectly marked with "hasFullText == true", but all articles accessed via fetch with the filter "(hasFullText == true) && exists(abstractText)" have full text available via getArticleMetadata. We’ll look into why this is the case, but please use this filter in the meantime.
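In request-body form, the suggested workaround looks something like the sketch below. Only the filter string comes from this thread; the body shape mirrors the quick-start snippet in the traceback above, and passing it to the quick-start's fetch helper is an assumption based on that script.

```python
import json

# Workaround filter suggested in this thread: only request entries that both
# claim full text and have an abstract. The "spec"/"filter" body shape matches
# the quick-start snippet; fetch("BiblioEntry", body) is assumed to accept it.
body = {
    "spec": {
        "filter": "(hasFullText == true) && exists(abstractText)"
    }
}

# The quick-start's read_data_json would POST this body to the fetch API;
# here we just show the JSON payload it would send.
payload = json.dumps(body)
```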