As of QuPath 0.6.0-rc2, it is possible to use InstanSeg to segment cells directly from QuPath.
Because InstanSeg runs on PyTorch, a fair amount of configuration is needed before it works, especially with a GPU.
First, create a dedicated mamba environment and install PyTorch into it:

```shell
mamba create -n qupath-pytorch
mamba activate qupath-pytorch
conda install pytorch==2.4.0 torchvision==0.19.0 torchaudio==2.4.0 pytorch-cuda=12.4 -c pytorch -c nvidia
```
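Before wiring this into QuPath, it is worth checking that the environment actually sees the GPU. A quick sanity check, assuming the `qupath-pytorch` environment is activated:

```shell
# Run inside the activated qupath-pytorch environment.
# Prints the installed torch version and whether CUDA is visible.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```

If this prints `False` for CUDA, fix the environment first; no amount of QuPath configuration will help.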
Then, in QuPath, open **Extensions > Deep Java Library > Create launch script** and set the following:
| Parameter | Value |
|---|---|
| Conda environment | `D:\conda\conda-envs\qupath-pytorch` |
| PyTorch version | `2.4.0` |
| PyTorch path | `D:\conda\conda-envs\qupath-pytorch\lib\site-packages\torch\lib` |
| QuPath executable | `"C:\QuPath-v0.6.0-rc2\QuPath-0.6.0-rc2 (console).exe"` |
In the generated script, add `set PYTORCH_FLAVOR=cpu`; counterintuitive as it sounds, the GPU does not work without it. Also add `D:\conda\conda-envs\qupath-pytorch\lib\site-packages\torch\lib` and `D:\conda\conda-envs\qupath-pytorch\Library\bin` to the `PATH`.
In the end it should look like this (the `-`/`+` lines mark the changes relative to the generated script):

```diff
 set PYTORCH_VERSION=2.4.0
 set PYTORCH_LIBRARY_PATH=D:\conda\conda-envs\qupath-pytorch\lib\site-packages\torch\lib
-set PATH=D:\conda\conda-envs\qupath-pytorch;D:\conda\conda-envs\qupath-pytorch\bin;D:\conda\conda-envs\qupath-pytorch\lib;D:\conda\conda-envs\qupath-pytorch;D:\conda\conda-envs\qupath-pytorch\bin;D:\conda\conda-envs\qupath-pytorch\lib;%PATH%
+set PATH=D:\conda\conda-envs\qupath-pytorch\lib\site-packages\torch\lib;D:\conda\conda-envs\qupath-pytorch\Library\bin;D:\conda\conda-envs\qupath-pytorch;D:\conda\conda-envs\qupath-pytorch\bin;D:\conda\conda-envs\qupath-pytorch\lib;%PATH%
+set PYTORCH_FLAVOR=cpu
 "C:\QuPath-v0.6.0-rc2\QuPath-0.6.0-rc2 (console).exe"
```
Once QuPath is launched through this script, InstanSeg can be run from a Groovy script:

```groovy
import qupath.lib.images.servers.ColorTransforms

qupath.ext.instanseg.core.InstanSeg.builder()
    .modelPath("/path/to/some/model")   // local InstanSeg model directory
    .device("mps")                      // "mps" is Apple silicon; use "cuda" or "cpu" as appropriate
    .nThreads(4)
    .tileDims(512)                      // tile size in pixels
    .interTilePadding(32)               // overlap between adjacent tiles
    .inputChannels([ColorTransforms.createChannelExtractor("Red"),
                    ColorTransforms.createChannelExtractor("Green"),
                    ColorTransforms.createChannelExtractor("Blue")])
    .outputChannels()                   // empty = keep all model outputs
    .makeMeasurements(true)
    .randomColors(false)
    .build()
    .detectObjects()
```