taxonomy_mapping() does not find taxonomy column names when using latest docker image #30
Thanks for reporting this error! Starting with the latest scrattch-mapping docker (bicore/scrattch_mapping:latest), I wasn't able to recreate this issue from our test cases, so you've found a fun edge case. To be complete, I also tried loading […]. I'll need some additional info to figure out what's going on. A few questions:
FYI @scseeman
That's interesting that you can find the column names when you load with […]. In the latest version:
In 0.16 the opposite is true:
I was able to consistently retrieve anndata$uns$clusterInfo as a […]. Somehow the version of the anndata R library was downgraded in the latest scrattch-mapping docker. I suspect this is the culprit: bicore/scrattch_mapping:latest -- anndata_0.7.5.3
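To make the version comparison concrete, here is a sketch of how to check what a given image ships, assuming you open an R session inside the container (the exact entrypoint of the bicore images is an assumption; package names match the thread):

```r
# Inside an R session in the container, report the installed version of the
# anndata R wrapper (this is what differs between the 0.16 and latest images):
packageVersion("anndata")

# sessionInfo() additionally lists the R version itself and all attached
# packages, which is how the R downgrade between images was spotted:
sessionInfo()
```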
@meghanaturner @UCDNJJ a couple of weeks ago I was having issues with […]
Indeed, 0.16 has anndata="0.7.5.6" and latest has anndata="0.7.5.3". It looks like R was also downgraded. 0.16 sessionInfo():
latest sessionInfo():
Hi @meghanaturner, when you have time, can you check whether this issue is resolved when using this docker image:
Forewarning: quite a few changes exist in this new update, so if you hit an error, let us know.
Hi @UCDNJJ, this docker image seems to have fixed the original issue I reported where the column names weren't found.
@UCDNJJ However, the same error is thrown for a dgCMatrix. The workaround is to only use taxonomy and spatial anndata objects where X is a dense matrix. As an alternative to read_h5ad, I tried using […],
but despite the documentation for the taxonomyDir argument suggesting that it supports direct h5ad files that aren't part of a shiny taxonomy folder, it errors out with: […]. I saw that you split off scrattch-taxonomy, including loadTaxonomy(), from scrattch-mapping into its own repo. Should I raise this issue over there?
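The dense-matrix workaround described above can be sketched with the Matrix package (a minimal illustration, not the thread's actual data; with the anndata R wrapper the equivalent step would presumably be something like `ad$X <- as.matrix(ad$X)`):

```r
library(Matrix)

# Build a small column-compressed sparse matrix (class dgCMatrix), the
# matrix type that triggers the error described in the thread:
X_sparse <- sparseMatrix(i = c(1, 2, 2), j = c(1, 1, 3), x = c(5, 1, 7),
                         dims = c(2, 3))

# Densify it so the anndata object carries a plain base-R matrix in X:
X_dense <- as.matrix(X_sparse)
stopifnot(is.matrix(X_dense))
```

Densifying trades memory for compatibility, which is why it only makes sense as a stopgap for moderately sized taxonomies.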
Interesting, we definitely don't want to be using dense matrices all the time! Let's leave this issue here for now. We need to do a better job with documentation, but you should always use […]. I took a quick look and […]. Also, can you see if you can run the mapping tutorial without error?
In attempting to follow the build_taxonomy tutorial, I am unable to load the counts matrix from the taxonomy I'm using into R. I am not familiar with R, so I'm not sure what R's anndata package expects to find in an ad.X stored as a CSR sparse matrix. And the tutorial does not provide any suggestions for how to read in counts matrices from other h5ad files (it just does […]).
Error: […]
So I also tried running through your code, both in a separate R environment and within the scrattch.mapping docker. Both produced the same error. Could this error be arising due to the dataset size, or from some change in the .h5ad file that happened a few weeks ago? I can use the same approach you shared with a dataset of ~340k cells and ~22k genes: /allen/programs/celltypes/workgroups/rnaseqanalysis/shiny/10x_seq/NHP_BG_AIT_115/NHP_BG_AIT115_complete.h5ad. R successfully reads in the anndata$X as a dgR sparse matrix.
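For reference on the classes mentioned here (a sketch, assuming only the Matrix package): "dgR" is Matrix's row-compressed class dgRMatrix, the R analogue of a CSR-encoded anndata X, while dgCMatrix is the column-compressed (CSC-like) default. Once a matrix has loaded at all, converting between the two forms is straightforward:

```r
library(Matrix)

# sparseMatrix() returns the column-compressed form by default (dgCMatrix):
m_csc <- sparseMatrix(i = c(1, 2, 3), j = c(2, 1, 3), x = c(1, 2, 3))

# Coerce to the row-compressed form (dgRMatrix), the class R reports when
# it successfully reads a CSR-encoded anndata$X:
m_csr <- as(m_csc, "RsparseMatrix")
class(m_csr)  # "dgRMatrix"
```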
Using the latest scrattch-mapping docker image release leads to an error on line 22 of R/taxonomy_mapping() because colnames(AIT.anndata$uns$clusterInfo) returns NULL. This issue can be fixed by switching back to the 0.16 version of the docker image. Using 0.16 and the exact same taxonomy h5ad file (//allen/programs/celltypes/workgroups/rnaseqanalysis/mFISH/meghanturner/brain3_mapping/taxonomies/AIT17.0.logCPM.sampled100_MERSCOPE_BRAIN3_GENES_dense_for_mapping.h5ad), colnames(AIT.anndata$uns$clusterInfo) returns the expected list of column names and the code runs as it should.
@berl @egelfan2
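The failure mode can be shown in miniature (a sketch with made-up cluster labels, not the real clusterInfo): colnames() returns NULL whenever uns$clusterInfo deserializes as anything other than a data.frame or a matrix with dimnames, which is consistent with the anndata version difference found later in the thread.

```r
# A data.frame round-trips its column names:
clusterInfo_df <- data.frame(cluster_id = c(1, 2),
                             cluster_label = c("L2/3 IT", "Pvalb"))
colnames(clusterInfo_df)    # "cluster_id" "cluster_label"

# But if the same data comes back as a plain list (as a different anndata
# version can deserialize uns entries), colnames() is NULL and any code
# indexing by column name breaks:
clusterInfo_list <- as.list(clusterInfo_df)
colnames(clusterInfo_list)  # NULL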