
Maximum amount of pairs and batch processing #235

Closed
frederikvand opened this issue May 22, 2020 · 25 comments

Comments

@frederikvand

frederikvand commented May 22, 2020

Dear Julia and Circuitscape associates,

When using a pairs file, with or without short circuits, the pairs (and the parallel arguments) are ignored. When the pairs file is included, Circuitscape starts solving all possible pairs (5601952476) based on 105848 habitat patches (or centroids). An example subset to reproduce the problem can be found below.

The pairs file is formatted as:
mode include
4 2
7 2
6 3
1 5
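
For reference, the pairs file is wired into the run via the ini options below; the key names are as I understand them from the Circuitscape documentation, and the paths are placeholders:

use_included_pairs = True
included_pairs_file = pairs.txt
point_file = centroids.asc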

Not running in parallel
Also, when I use the short circuit file combined with the centroid point file, the parallel processing doesn't seem to work:

[ Info: 2020-05-26 12:07:38 : Logs will recorded to file: log_file
[ Info: 2020-05-26 12:07:38 : Precision used: Double
[ Info: 2020-05-26 12:07:43 : Reading maps
[ Info: 2020-05-26 12:07:49 : Resistance/Conductance map has 10893750 nodes
[ Info: 2020-05-26 12:08:01 : Total number of pair solves = 10743930
[ Info: 2020-05-26 12:08:01 : Solving pair 1 of 10743930
[ Info: 2020-05-26 12:08:12 : Solver used: CHOLMOD
[ Info: 2020-05-26 12:08:12 : Graph has 10893705 nodes, 2 focal points and 1 connected components
[ Info: 2020-05-26 12:09:04 : Time taken to construct cholesky factor = 50.789335301
[ Info: 2020-05-26 12:09:06 : Time taken to construct local nodemap = 1.766986406 seconds
[ Info: 2020-05-26 12:09:06 : Solving points 1 to 1
[ Info: 2020-05-26 12:09:10 : Solving pair 2 of 10743930

Output overwrites itself
While it is solving sequentially it also overwrites its own output, so only the resistances between one pair stay visible.

Output after 5 min:
0.0 5.0 14.0
5.0 0.0 0.0011597292571570218
14.0 0.0011597292571570218 0.0

Output after 10 min:
0.0 5.0 17.0
5.0 0.0 0.0011460580515374435
17.0 0.0011460580515374435 0.0

So, currently it solves only pairwise comparisons of all pairs related to habitat patch 5 and overwrites its own output.

Extra info:
I successfully calibrated this resistance map with a genetic optimisation algorithm (400 scenarios and 400 pairs), but now I want to measure resistances between all habitat patches in Europe. I made a simplified problem (downsampled to 400 m resolution) with 127093750 nodes and 105848 habitat patches. If possible I would like to calculate my final problem on 2033500000 nodes and approx. 436945 habitat patches for 16 scenarios.

I want to use polygons (asc) as short circuits with a centroid point file (asc) and an additional pairs file (txt). I tested the script with and without asc/tif, CG+AMG/CHOLMOD, single/parallel, with and without short circuits, and 32/64-bit indexing. Normally I write my patches to an asc grid as int2s, but currently there are more pairs than this integer type can handle. Therefore the asc grid was created using the int4s integer type (https://www.rdocumentation.org/packages/raster/versions/3.1-5/topics/dataType) and NODATA values were forced to -9999. The final habitat file has 105848 habitat patches with a unique ID and the pairs file has 79829 unique pairs. Am I missing something that could cause this behavior?

All asc rasters line up with the following dimensions: 10375, 8750, 90781250 (nrow, ncol, ncell) and extent: 2500000, 6000000, 1350000, 5500000 (xmin, xmax, ymin, ymax).

Thank you for your insights!

@frederikvand frederikvand changed the title Large amount of habitat patches (polygons) with a pairs file. Unable to read .tif May 24, 2020
@frederikvand frederikvand changed the title Unable to read .tif ERROR: LoadError: Base.InvalidCharError{Char}('\x87') May 24, 2020
@frederikvand frederikvand changed the title ERROR: LoadError: Base.InvalidCharError{Char}('\x87') Large pairs file: ERROR: LoadError: MethodError: no method matching Circuitscape.RasData(::Array{Float64,2} May 25, 2020
@frederikvand frederikvand changed the title Large pairs file: ERROR: LoadError: MethodError: no method matching Circuitscape.RasData(::Array{Float64,2} Pairs file ignored while using short circuits May 25, 2020
@frederikvand
Author

frederikvand commented May 25, 2020

The behavior remains the same when the short circuits are not included. Could it be related to the number of habitat patches/points?

@frederikvand
Author

frederikvand commented May 26, 2020

@ranjanan This might be related to #189?

@frederikvand frederikvand changed the title Pairs file ignored while using short circuits Pairs file ignored and output overwrites itself May 26, 2020
@vlandau
Member

vlandau commented Jun 2, 2020

Parallel processing issue likely related to #165

@vlandau
Member

vlandau commented Jun 2, 2020

I would strongly recommend looking into Omniscape for this problem. You might consider reading up on it here and checking out the docs for the Omniscape.jl Julia package here. You will likely still need a supercomputer for a problem of your size.

@ViralBShah
Member

As I pointed out in email - this is a huge amount of compute on extremely large grids. I think looking at Omniscape is certainly a good idea.

@frederikvand
Author

Thank you for the recommendations! The issues are not related to grid size, as the same issues persist on small subsamples (as in the linked partial_raster.zip). However, I am testing Omniscape on the supercomputer cluster at the moment!

@vlandau
Member

vlandau commented Jun 2, 2020

Are you seeing a specific pair that is being written that should not be? Note that Circuitscape states the total number of pairs regardless of "include" status, and AFAIK it skips any pairs that are not in the include file, so it might say solving pair 3 of 5000000, but that does not mean it will solve all 5000000 pairs. Is that right @ranjanan?

@vlandau
Member

vlandau commented Jun 2, 2020

You also have parallelize = false in your .ini, so parallel processing will be disabled.

@frederikvand
Author

If I remember correctly, I tested this with only 5 points and it was not restricted to the given points only.
I will test this again to be sure! And of course I tested it with parallelize = true. It was just an example ini file to show that the issues are the same with CG + AMG and without parallelization.

@vlandau
Member

vlandau commented Jun 2, 2020

What OS, Julia version, and Circuitscape version are you using?

@frederikvand
Author

frederikvand commented Jun 2, 2020

Linux, Julia Version 1.4.0 and Circuitscape 5.5.5 (however, in the .out file it states version = 5.0.0). I update Circuitscape each time in my script on the supercomputer with:
using Pkg
Pkg.add("Circuitscape")
using Circuitscape

I tested again and, as stated before, even with only 5 pairs in the pairs file it keeps computing pairs, overwriting output, and not working in parallel.

mode include
64096 63142
64248 63142
63759 63142
64294 63142
64231 63142

[ Info: 2020-06-02 20:08:46 : Logs will recorded to file: log_file
[ Info: 2020-06-02 20:08:47 : Precision used: Double
[ Info: 2020-06-02 20:08:51 : Reading maps
[ Info: 2020-06-02 20:08:57 : Resistance/Conductance map has 10893750 nodes
[ Info: 2020-06-02 20:09:06 : Total number of pair solves = 10743930
[ Info: 2020-06-02 20:09:06 : Solving pair 1 of 10743930
[ Info: 2020-06-02 20:09:16 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:09:16 : Graph has 10893705 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:10:07 : Time taken to construct cholesky factor = 49.584559565
[ Info: 2020-06-02 20:10:09 : Time taken to construct local nodemap = 1.637285116 seconds
[ Info: 2020-06-02 20:10:09 : Solving points 1 to 1
[ Info: 2020-06-02 20:10:14 : Solving pair 2 of 10743930
[ Info: 2020-06-02 20:10:24 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:10:24 : Graph has 10893736 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:11:16 : Time taken to construct cholesky factor = 50.408244624
[ Info: 2020-06-02 20:11:18 : Time taken to construct local nodemap = 1.941791341 seconds
[ Info: 2020-06-02 20:11:18 : Solving points 1 to 1
[ Info: 2020-06-02 20:11:23 : Solving pair 3 of 10743930
[ Info: 2020-06-02 20:11:33 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:11:33 : Graph has 10893726 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:12:19 : Time taken to construct cholesky factor = 45.088503931
[ Info: 2020-06-02 20:12:21 : Time taken to construct local nodemap = 1.606452451 seconds
[ Info: 2020-06-02 20:12:21 : Solving points 1 to 1
[ Info: 2020-06-02 20:12:24 : Solving pair 4 of 10743930
[ Info: 2020-06-02 20:12:34 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:12:34 : Graph has 10893543 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:13:25 : Time taken to construct cholesky factor = 49.765484707
[ Info: 2020-06-02 20:13:27 : Time taken to construct local nodemap = 1.665921798 seconds
[ Info: 2020-06-02 20:13:27 : Solving points 1 to 1
[ Info: 2020-06-02 20:13:32 : Solving pair 5 of 10743930
[ Info: 2020-06-02 20:13:42 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:13:42 : Graph has 10893736 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:14:32 : Time taken to construct cholesky factor = 49.763152753
[ Info: 2020-06-02 20:14:34 : Time taken to construct local nodemap = 1.814617205 seconds
[ Info: 2020-06-02 20:14:34 : Solving points 1 to 1
[ Info: 2020-06-02 20:14:39 : Solving pair 6 of 10743930
[ Info: 2020-06-02 20:14:50 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:14:50 : Graph has 10893674 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:15:38 : Time taken to construct cholesky factor = 47.74451354
[ Info: 2020-06-02 20:15:40 : Time taken to construct local nodemap = 1.834993739 seconds
[ Info: 2020-06-02 20:15:40 : Solving points 1 to 1
[ Info: 2020-06-02 20:15:45 : Solving pair 7 of 10743930
[ Info: 2020-06-02 20:15:55 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:15:55 : Graph has 10893733 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:16:46 : Time taken to construct cholesky factor = 49.898639691
[ Info: 2020-06-02 20:16:48 : Time taken to construct local nodemap = 1.758777591 seconds
[ Info: 2020-06-02 20:16:48 : Solving points 1 to 1
[ Info: 2020-06-02 20:16:53 : Solving pair 8 of 10743930
[ Info: 2020-06-02 20:17:04 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:17:04 : Graph has 10893721 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:17:59 : Time taken to construct cholesky factor = 54.144434142
[ Info: 2020-06-02 20:18:01 : Time taken to construct local nodemap = 1.804120708 seconds
[ Info: 2020-06-02 20:18:01 : Solving points 1 to 1
[ Info: 2020-06-02 20:18:07 : Solving pair 9 of 10743930
[ Info: 2020-06-02 20:18:17 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:18:17 : Graph has 10893736 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:19:08 : Time taken to construct cholesky factor = 50.100582998
[ Info: 2020-06-02 20:19:10 : Time taken to construct local nodemap = 1.81247131 seconds
[ Info: 2020-06-02 20:19:10 : Solving points 1 to 1
[ Info: 2020-06-02 20:19:16 : Solving pair 10 of 10743930
[ Info: 2020-06-02 20:19:27 : Solver used: CHOLMOD
[ Info: 2020-06-02 20:19:27 : Graph has 10893736 nodes, 2 focal points and 1 connected components
[ Info: 2020-06-02 20:20:15 : Time taken to construct cholesky factor = 47.460205474
[ Info: 2020-06-02 20:20:17 : Time taken to construct local nodemap = 1.604487113 seconds

@frederikvand
Author

Found a small mistake in the test case. I have to try some things again to be sure of the issue.

@vlandau
Member

vlandau commented Jun 2, 2020

Okay good to know. Some additional info below:

To update the package you will need to run Pkg.update("Circuitscape") if an old version of Circuitscape is already installed.

It looks like you still have multiple pixels with the same value in your pairs file, so that will prevent parallel processing from working, see #232 (comment). I'm not sure if this also affects whether or not you can use included pairs. Will need @ranjanan to confirm.

As for the overwriting output issue, I'm not sure what is going on there. To clarify for the thread, essentially output files like *_resistances_3columns.out are being overwritten instead of appended to.

@frederikvand
Author

@vlandau Thank you for all the assistance. I got a more or less working example at 1000 m resolution and with a subset of pairs. However, there are still a lot of things puzzling me.

  1. Parallel processing takes longer than sequential processing (25% longer).
  2. If I use parallel processing on the 64 GB node I get ERROR: LoadError: ArgumentError: dense matrix construction failed for unknown reasons. Please submit a bug report. On 264 GB nodes this is solved, but it is only using 1.7 cores and 800 MB of memory after 50 hours of processing. The ini file here is set to 35 parallel processes and is still running.
  3. 100 points take approx. 400 sec to process, but 1000 points take hours and hours. Isn't there a way to chunk calculations? It loses so much time and resources constructing matrices of all possible pair combinations even when they are not used.
  4. I am not sure how the parallelisation works in the background, but it seems to me it would be a lot more efficient to just chunk 100 points into 35 parallel sessions rather than what it is doing now. According to my benchmarking this would solve 10000 points in 20 min, compared to hundreds of hours now, and would only use approx. 3.5 GB of RAM.

Any guidance would be helpful because I have a lot more scenarios to test!

@frederikvand
Author

frederikvand commented Jun 7, 2020

Ok, so I made this work by doing the following:

  1. Make short circuits beforehand by altering the resistance of all patches to 1, and do not include polygons in the ini file.
  2. Make sure no points touch each other.
  3. Make sure there are no more centroids than a 32-bit integer can handle.
  4. Don't exceed 15 million pixels.
  5. Chunk the points into groups of at most 300 and make separate ini files for each chunk (a rough sketch of this step is below).
  6. Parallelize manually outside of Julia.

Happy it worked out, but it seems some things could improve in the handling of large pair datasets, which is a necessity for analysing which patches will be reachable in SDM modelling of future climates.
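
For step 5, a minimal Julia sketch of the chunking I used is below. The template.ini name, the PAIRS_FILE/OUTPUT_FILE placeholders and the output file names are only illustrative; the placeholders stand in for the included_pairs_file and output_file entries in my ini.

# Split a large "mode include" pairs file into chunks of at most `chunk_size`
# pairs and write one ini file per chunk, based on a template ini that
# contains the literal placeholders PAIRS_FILE and OUTPUT_FILE.
function write_chunks(pairs_path, template_path; chunk_size = 300)
    lines = readlines(pairs_path)
    header, pairs = lines[1], lines[2:end]   # keep the "mode include" header line
    template = read(template_path, String)
    ini_files = String[]
    for (i, chunk) in enumerate(Iterators.partition(pairs, chunk_size))
        pairs_file = "pairs_chunk_$(i).txt"
        write(pairs_file, join([header; chunk], "\n"))
        ini = replace(template, "PAIRS_FILE" => pairs_file)
        ini = replace(ini, "OUTPUT_FILE" => "output_chunk_$(i)")
        ini_file = "chunk_$(i).ini"
        write(ini_file, ini)
        push!(ini_files, ini_file)
    end
    return ini_files   # each of these is submitted as its own job outside Julia
end

Each chunk_<i>.ini is then run as a separate cluster job, which is how I parallelized outside of Julia.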

@vlandau
Member

vlandau commented Jun 8, 2020

The next step for Circuitscape's parallel processing (which is currently being worked on) is to use multithreading instead of distributed parallel processing, which could help with the overhead problems you're experiencing. But because this is in the works, we don't want to put too much effort into the current parallel processing framework, since it is going to be changed soon. I think the idea of processing in batches is a good one, and I'm glad that worked out.

You should be able to fairly easily do single solves on grids much larger than 15 million pixels, but if you have tens of thousands of pairs to solve, yes, it would take a lot longer.

@frederikvand
Author

frederikvand commented Jun 8, 2020

Glad to hear about the multi-threading! Also looking forward to the .tif functionality.

A simple statement in the ini file to determine the batch size of solves would fix a lot of the other problems I was experiencing. It would also be good to know whether Circuitscape can handle int64 centroids, because due to the interacting problems I am currently unsure.

You can close the issue if you wish!

Thank you!

With kind regards,
Frederik

@vlandau
Member

vlandau commented Jun 8, 2020

There is a use_64bit_indexing argument (defaults to true) available on master (as is .tif functionality).

@frederikvand
Author

frederikvand commented Jun 8, 2020

I was using "use_64_bit_indexing", so I am not sure why I get the following error when using more than 400 pairs (the same error occurs using CG + AMG), i.e. when the number of possible combinations exceeds 32767 (int32) or when more than 32767 distinct values are present in the raster.

ERROR: ArgumentError: dense matrix construction failed for unknown reasons. Please submit a bug report.
Stacktrace:
[1] SuiteSparse.CHOLMOD.Dense{Float64}(::Ptr{SuiteSparse.CHOLMOD.C_Dense{Float64}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:222
[2] Dense at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:238 [inlined]
[3] solve(::Int32, ::SuiteSparse.CHOLMOD.Factor{Float64}, ::SuiteSparse.CHOLMOD.Dense{Float64}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:762
[4] \ at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:1701 [inlined]
[5] (::SuiteSparse.CHOLMOD.Factor{Float64}, ::Array{Float64,2}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:1710
[6] _cholmod_solver_path(::Circuitscape.GraphData{Float64,Int64}, ::Circuitscape.RasterFlags, ::Dict{String,String}, ::Bool, ::Int64) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/core.jl:409
[7] single_ground_all_pairs(::Circuitscape.GraphData{Float64,Int64}, ::Circuitscape.RasterFlags, ::Dict{String,String}, ::Bool) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/core.jl:59
[8] single_ground_all_pairs at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/core.jl:53 [inlined]
[9] _pt_file_no_polygons_path(::Circuitscape.RasData{Float64,Int64}, ::Circuitscape.RasterFlags, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/raster/pairwise.jl:61
[10] raster_pairwise(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/raster/pairwise.jl:29
[11] _compute(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/run.jl:42
[12] macro expansion at ./util.jl:234 [inlined]

When I use tif I get the following error:

ERROR: Base.InvalidCharError{Char}('\xac')
Stacktrace:
[1] invalid_char(::Char) at ./char.jl:85
[2] UInt32(::Char) at ./char.jl:130
[3] convert at ./char.jl:180 [inlined]
[4] cconvert at ./essentials.jl:390 [inlined]
[5] lowercase(::Char) at ./strings/unicode.jl:245
[6] map(::typeof(lowercase), ::String) at ./strings/basic.jl:574
[7] lowercase at ./strings/unicode.jl:531 [inlined]
[8] _guess_file_type(::String, ::IOStream) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/io.jl:153
[9] _ascii_grid_read_header(::String, ::IOStream) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/io.jl:113
[10] _ascii_grid_reader(::Type{T} where T, ::String) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/io.jl:90
[11] read_cellmap(::String, ::Bool, ::Type{Float64}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/io.jl:68
[12] load_raster_data(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/io.jl:410
[13] raster_pairwise(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/raster/pairwise.jl:18
[14] _compute(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/run.jl:42
[15] macro expansion at ./util.jl:234 [inlined]
[16] compute(::String) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/3Rn8u/src/run.jl:31
[17] top-level scope at REPL[5]:1

@vlandau
Member

vlandau commented Jun 8, 2020

The first error seems to be triggered by the SuiteSparse package (cc @ranjanan).

The second error suggests that you're not using the latest master. Running Pkg.status("Circuitscape") will let you know if that's the case. Did you use Pkg.add(PackageSpec(name="Circuitscape", rev="master")) to add the master version?
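
Putting that together with the earlier Pkg.update note, the full sequence would be something along these lines:

using Pkg
Pkg.status("Circuitscape")                                   # shows the installed version / branch
Pkg.add(PackageSpec(name = "Circuitscape", rev = "master"))  # switch to the master branch
using Circuitscape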

@frederikvand
Author

Dear @vlandau,

You are right. Installing from master solved the tif problem!
The only problem that now remains is the maximum number of pairs, and that I lose a lot of processing time by chunking, because it has to sequentially reread the map and construct graphs and Cholesky factors for each batch. Furthermore, I haven't found a solution to run my batches in parallel on the supercomputer node. Is there a way to run Julia scripts in parallel on the nodes without interfering with Circuitscape's internal processing?

@frederikvand frederikvand changed the title Pairs file ignored and output overwrites itself Maximum amount of pairs and batch processing Jun 16, 2020
@ViralBShah
Member

ViralBShah commented Jun 16, 2020

Some of the discussion in #236 should help. You should then be able to create a bunch of ini files and do something like #236 to run different pairs on each node. Running on a distributed cluster is not an "out of the box" feature at the moment.
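
On a single node, a minimal sketch of driving several ini files from Julia itself would look something like the following (the chunk_*.ini file names are hypothetical, and each ini should keep parallelize = false so the workers don't compete with Circuitscape's own parallelism):

using Distributed
addprocs(4)                      # one worker per concurrent Circuitscape run
@everywhere using Circuitscape

# hypothetical ini files, one per batch of pairs
ini_files = ["chunk_1.ini", "chunk_2.ini", "chunk_3.ini", "chunk_4.ini"]
pmap(compute, ini_files)         # each worker calls compute(ini) on its own batch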

@frederikvand
Author

frederikvand commented Jun 16, 2020

Dear @ViralBShah

Thank you for the information! I will try to make it work like that.
However, I seem to be getting a dense matrix construction error again for just 50-100 pairs, without parallel processes. Do you have an idea what could cause this? The error seems to be quite random, as I had it before on 400 pairs, then it went away, and now it has resurfaced without changing anything in the ini file. I also get the error with CG + AMG. The only thing I can think of is that there are some point values larger than 32767 in my point file for this scenario. Is there no way to fix this bug? P.S. I am using 64bit_indexing.

Thank you for all the help!

[ Info: 2020-06-16 19:54:46 : Logs will recorded to file: log_file
[ Info: 2020-06-16 19:54:47 : Precision used: Double
[ Info: 2020-06-16 19:54:51 : Reading maps
[ Info: 2020-06-16 19:55:32 : Resistance/Conductance map has 90781250 nodes
[ Info: 2020-06-16 19:56:56 : Solver used: CHOLMOD
[ Info: 2020-06-16 19:56:56 : Graph has 90781250 nodes, 151 focal points and 1 connected components
[ Info: 2020-06-16 19:57:04 : Total number of pair solves = 11325
[ Info: 2020-06-16 20:06:01 : Time taken to construct cholesky factor = 520.0718625
[ Info: 2020-06-16 20:06:06 : Time taken to construct local nodemap = 5.278284888 seconds
[ Info: 2020-06-16 20:18:11 : Solving points 1 to 100

ERROR: LoadError: ArgumentError: dense matrix construction failed for unknown reasons. Please submit a bug report.
Stacktrace:
[1] SuiteSparse.CHOLMOD.Dense{Float64}(::Ptr{SuiteSparse.CHOLMOD.C_Dense{Float64}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:222
[2] Dense at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:238 [inlined]
[3] allocate_dense(::Int64, ::Int64, ::Int64, ::Type{Float64}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:414
[4] SuiteSparse.CHOLMOD.Dense{Float64}(::Array{Float64,2}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:810
[5] (::SuiteSparse.CHOLMOD.Factor{Float64}, ::Array{Float64,2}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:1710
[6] _cholmod_solver_path(::Circuitscape.GraphData{Float64,Int64}, ::Circuitscape.RasterFlags, ::Dict{String,String}, ::Bool, ::Int64) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/core.jl:414
[7] single_ground_all_pairs(::Circuitscape.GraphData{Float64,Int64}, ::Circuitscape.RasterFlags, ::Dict{String,String}, ::Bool) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/core.jl:59
[8] single_ground_all_pairs at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/core.jl:53 [inlined]
[9] _pt_file_no_polygons_path(::Circuitscape.RasData{Float64,Int64}, ::Circuitscape.RasterFlags, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/raster/pairwise.jl:61
[10] raster_pairwise(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/raster/pairwise.jl:29
[11] _compute(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/run.jl:42
[12] macro expansion at ./util.jl:234 [inlined]
[13] compute(::String) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/run.jl:31
[14] top-level scope at /data/leuven/330/vsc33060/input/Julia_scripts/patches_eu/03_try_400/01_current_100.jl:5
[15] include(::Module, ::String) at ./Base.jl:377
[16] exec_options(::Base.JLOptions) at ./client.jl:288
[17] _start() at ./client.jl:484
in expression starting at /data/leuven/330/vsc33060/input/Julia_scripts/patches_eu/03_try_400/01_current_100.jl:5

@frederikvand
Author

I am going to open a new ticket with just this error, because this bug is at the root of all my problems.

@ViralBShah
Member

It would be best if you could provide the smallest example files that reproduce the bug.
