Bug: p4est + :zeromean + block assembly #64

Closed
JordiManyer opened this issue Mar 27, 2024 · 1 comment · Fixed by gridap/GridapDistributed.jl#147
Labels: bug (Something isn't working)

Comments

@JordiManyer (Member) commented on Mar 27, 2024

It is a convoluted one...

I've found a bug that seems to be caused by the combination of p4est models AND :zeromean constraints AND block assembly. If we run the following code, the last weak form produces the error below.

If we swap in any one of the commented-out alternatives, i.e., if

  • we run in serial (or MPI with a single processor)
  • OR we run with a distributed cartesian model
  • OR we run without the :zeromean space
  • OR we run without BlockAssembly

the error message goes away and everything runs fine. So the failure is triggered only by the combination of all four. My guess is that something is going on with the ghost dof ids of the individual FE spaces we create.

using Gridap
using Gridap.ReferenceFEs, Gridap.Algebra, Gridap.Geometry, Gridap.FESpaces
using Gridap.CellData, Gridap.MultiField
using PartitionedArrays
using GridapDistributed
using GridapP4est

np = 4
parts = with_mpi() do distribute 
  distribute(LinearIndices((np,)))
end

nc = (8,8)
domain = (0,1,0,1)
cmodel = CartesianDiscreteModel(domain,nc)

num_refs_coarse = 0
model = OctreeDistributedDiscreteModel(parts,cmodel,num_refs_coarse)
#model = CartesianDiscreteModel(parts,(2,2),domain,nc) # ALL TESTS RUN OK

order = 2
reffe_u = ReferenceFE(lagrangian,VectorValue{2,Float64},order)
reffe_p = ReferenceFE(lagrangian,Float64,order-1,space=:P)

V = TestFESpace(model,reffe_u)
Q = TestFESpace(model,reffe_p;conformity=:L2,constraint=:zeromean)
#Q = TestFESpace(model,reffe_p;conformity=:L2) # ALL TESTS RUN OK

mfs = Gridap.MultiField.BlockMultiFieldStyle()
#mfs = Gridap.MultiField.ConsecutiveMultiFieldStyle() # ALL TESTS RUN OK
X = MultiFieldFESpace([V,Q];style=mfs)
Y = MultiFieldFESpace([Q,Q];style=mfs)

qdegree = 4
Ω = Triangulation(model)
dΩ = Measure(Ω,qdegree)

m(p,q) = ∫(p*q)dΩ
M = assemble_matrix(m,Q,Q) # OK

n(u,q) = ∫((∇⋅u)*q)dΩ
N = assemble_matrix(n,V,Q) # OK

l((p1,p2),(q1,q2)) = ∫(p1*q1 + p2*q2 + p1*q2)dΩ
L = assemble_matrix(l,Y,Y) # OK

b((u,p),(v,q)) = ∫(∇(v)⊙∇(u))dΩ + m(p,q)
B = assemble_matrix(b,X,X) # OK

a((u,p),(v,q)) = ∫(∇(v)⊙∇(u))dΩ + m(p,q) - ∫((∇⋅v)*p)dΩ - ∫((∇⋅u)*q)dΩ
A = assemble_matrix(a,X,X) # FAILS
ERROR: LoadError: BoundsError: attempt to access 71-element Vector{Int32} at index [0]
Stacktrace:
  [1] getindex
    @ ./essentials.jl:13 [inlined]
  [2] getindex
    @ ./abstractarray.jl:1299 [inlined]
  [3] #130
    @ ~/.julia/packages/GridapDistributed/DaF72/src/Algebra.jl:966 [inlined]
  [4] iterate
    @ ./generator.jl:47 [inlined]
  [5] collect_to!(dest::Vector{Int32}, itr::Base.Generator{Vector{Int64}, GridapDistributed.var"#130#132"{Vector{Int32}, PartitionedArrays.VectorFromDict{Int64, Int32}}}, offs::Int64, st::Int64)
    @ Base ./array.jl:840
  [6] collect_to_with_first!
    @ ./array.jl:818 [inlined]
  [7] _collect(c::Vector{Int64}, itr::Base.Generator{Vector{Int64}, GridapDistributed.var"#130#132"{Vector{Int32}, PartitionedArrays.VectorFromDict{Int64, Int32}}}, #unused#::Base.EltypeUnknown, isz::Base.HasShape{1})
    @ Base ./array.jl:812
  [8] collect_similar
    @ ./array.jl:711 [inlined]
  [9] map
    @ ./abstractarray.jl:3263 [inlined]
 [10] #129
    @ ~/.julia/packages/GridapDistributed/DaF72/src/Algebra.jl:966 [inlined]
 [11] map(::GridapDistributed.var"#129#131", ::MPIArray{Vector{Int64}, 1}, ::MPIArray{LocalIndices, 1})
    @ PartitionedArrays ~/.julia/packages/PartitionedArrays/py6uo/src/mpi_array.jl:229
 [12] #get_gid_owners#128
    @ ~/.julia/packages/GridapDistributed/DaF72/src/Algebra.jl:963 [inlined]
 [13] get_gid_owners(I::MPIArray{Vector{Int64}, 1}, ids::PRange{MPIArray{LocalIndices, 1}})
    @ GridapDistributed ~/.julia/packages/GridapDistributed/DaF72/src/Algebra.jl:962
 [14] (::GridapDistributed.var"#161#164"{Symbol, Bool, Matrix{MPIArray{Vector{Int32}, 1}}, Matrix{MPIArray{Vector{Int64}, 1}}})(id::Int64, prange::PRange{MPIArray{LocalIndices, 1}})
    @ GridapDistributed ~/.julia/packages/GridapDistributed/DaF72/src/Algebra.jl:1138
 [15] #4
    @ ./generator.jl:36 [inlined]
 [16] iterate
    @ ./generator.jl:47 [inlined]
 [17] collect_to!(dest::Vector{Tuple{MPIArray{Vector{Int64}, 1}, MPIArray{Vector{Int32}, 1}}}, itr::Base.Generator{Base.Iterators.Zip{Tuple{LinearIndices{1, Tuple{Base.OneTo{Int64}}}, Vector{PRange{MPIArray{LocalIndices, 1}}}}}, Base.var"#4#5"{GridapDistributed.var"#161#164"{Symbol, Bool, Matrix{MPIArray{Vector{Int32}, 1}}, Matrix{MPIArray{Vector{Int64}, 1}}}}}, offs::Int64, st::Tuple{Int64, Int64})
    @ Base ./array.jl:840
 [18] collect_to_with_first!(dest::Vector{Tuple{MPIArray{Vector{Int64}, 1}, MPIArray{Vector{Int32}, 1}}}, v1::Tuple{MPIArray{Vector{Int64}, 1}, MPIArray{Vector{Int32}, 1}}, itr::Base.Generator{Base.Iterators.Zip{Tuple{LinearIndices{1, Tuple{Base.OneTo{Int64}}}, Vector{PRange{MPIArray{LocalIndices, 1}}}}}, Base.var"#4#5"{GridapDistributed.var"#161#164"{Symbol, Bool, Matrix{MPIArray{Vector{Int32}, 1}}, Matrix{MPIArray{Vector{Int64}, 1}}}}}, st::Tuple{Int64, Int64})
    @ Base ./array.jl:818
 [19] collect(itr::Base.Generator{Base.Iterators.Zip{Tuple{LinearIndices{1, Tuple{Base.OneTo{Int64}}}, Vector{PRange{MPIArray{LocalIndices, 1}}}}}, Base.var"#4#5"{GridapDistributed.var"#161#164"{Symbol, Bool, Matrix{MPIArray{Vector{Int32}, 1}}, Matrix{MPIArray{Vector{Int64}, 1}}}}})
    @ Base ./array.jl:792
 [20] map
    @ ./abstractarray.jl:3385 [inlined]
 [21] _setup_prange(dofs_gids_prange::Vector{PRange{MPIArray{LocalIndices, 1}}}, gids::Matrix{MPIArray{Vector{Int64}, 1}}; ax::Symbol, ghost::Bool, owners::Matrix{MPIArray{Vector{Int32}, 1}})
    @ GridapDistributed ~/.julia/packages/GridapDistributed/DaF72/src/Algebra.jl:1132
 [22] _setup_prange
    @ ~/.julia/packages/GridapDistributed/DaF72/src/Algebra.jl:1126 [inlined]
 [23] _sa_create_from_nz_with_callback(callback::GridapDistributed.var"#f#111", async_callback::GridapDistributed.var"#f#111", a::Gridap.Fields.MatrixBlock{GridapDistributed.DistributedAllocationCOO{SubAssembledRows, MPIArray{Gridap.Algebra.AllocationCOO{SparseArrays.SparseMatrixCSC{Float64, Int64}, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, Vector{Int64}, Vector{Float64}}, 1}, PRange{MPIArray{LocalIndices, 1}}, PRange{MPIArray{LocalIndices, 1}}}}, b::Nothing)
    @ GridapDistributed ~/.julia/packages/GridapDistributed/DaF72/src/Algebra.jl:603
 [24] create_from_nz(a::Gridap.Fields.MatrixBlock{GridapDistributed.DistributedAllocationCOO{SubAssembledRows, MPIArray{Gridap.Algebra.AllocationCOO{SparseArrays.SparseMatrixCSC{Float64, Int64}, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, Vector{Int64}, Vector{Float64}}, 1}, PRange{MPIArray{LocalIndices, 1}}, PRange{MPIArray{LocalIndices, 1}}}})
    @ GridapDistributed ~/.julia/packages/GridapDistributed/DaF72/src/Algebra.jl:569
 [25] assemble_matrix(a::Gridap.MultiField.BlockSparseMatrixAssembler{2, 2, (1, 1), (1, 2), GridapDistributed.DistributedSparseMatrixAssembler{SubAssembledRows, MPIArray{GenericSparseMatrixAssembler, 1}, GridapDistributed.PSparseMatrixBuilderCOO{SparseArrays.SparseMatrixCSC{Float64, Int64}, SubAssembledRows}, GridapDistributed.PVectorBuilder{BlockArrays.BlockVector{Float64, Vector{Vector{Float64}}, Tuple{BlockArrays.BlockedUnitRange{Vector{Int64}}}}, SubAssembledRows}, PRange{MPIArray{LocalIndices, 1}}, PRange{MPIArray{LocalIndices, 1}}}}, matdata::MPIArray{Tuple{Vector{Any}, Vector{Any}, Vector{Any}}, 1})
    @ Gridap.FESpaces ~/.julia/packages/Gridap/j2cSX/src/FESpaces/SparseMatrixAssemblers.jl:72
JordiManyer added the bug label on Mar 27, 2024
@JordiManyer (Member, Author) commented:

This is caused by a bug in distributed block-assembly. It has nothing to do with p4est itself; rather, p4est's dof ordering was triggering the bug in GridapDistributed. Fixed by gridap/GridapDistributed.jl#147
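
Until that fix is available, a possible workaround (suggested by the "ALL TESTS RUN OK" variants in the snippet above; a sketch, not a verified recommendation) is to avoid the block layout and assemble with the consecutive multi-field style instead:

# Workaround sketch: skip BlockMultiFieldStyle so the distributed block-assembly path is never hit
mfs = Gridap.MultiField.ConsecutiveMultiFieldStyle()
X = MultiFieldFESpace([V,Q];style=mfs)
A = assemble_matrix(a,X,X) # expected to run fine without block assembly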
