Peano
CollocatedDLinearDiscretisationWithPointJacobi.py
import peano4
import jinja2
import dastgen2
import os

from .Solver import Solver
from .actionsets.CollocatedSolver.InitDofs import InitDofsCollocated
from .actionsets.CollocatedSolver.UpdateSolution import UpdateSolution
from .actionsets.CollocatedSolver.ResetAndUpdateResidual import ResetAndUpdateResidual
from .actionsets.PlotVertexDataInPeanoBlockFormat import PlotVertexDataInPeanoBlockFormat

from abc import abstractmethod
from peano4.solversteps.ActionSet import ActionSet


class CollocatedDLinearDiscretisationWithPointJacobi( Solver ):
    """!

    Matrix-free Collocated Solver for d-linear shape functions.

    Mathematical and algorithmic descriptions are to follow later. For now,
    we describe how to use this solver and a little about how it works under
    the hood.

    Each vertex stores a certain number of values, given by the parameter
    unknowns_per_vertex. If this parameter is greater than one, we are
    solving multiple different equations simultaneously; equivalently, we
    can think of our PDE as being vector-valued.

    The user needs to provide an assembly matrix and a mass matrix. If
    our vector-valued PDE has \f$ K \f$ components, then each of these
    matrices should be square, with \f$ K \cdot 2^D \f$ rows.

    In particular, they should be block-diagonal. Let
    \f$ A_i \f$ be square with \f$ 2^D \f$ rows. Then the mass/assembly
    matrices should be structured as follows:

    \f$ diag( A_1, A_2, \dots, A_K ) \f$.

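As an illustration (a standalone NumPy sketch, not part of the solver; the block contents are hypothetical), the block-diagonal layout can be assembled like this:

```python
import numpy as np

def assemble_block_diagonal(blocks):
    # Assemble diag(A_1, ..., A_K) from a list of square blocks.
    n = sum(b.shape[0] for b in blocks)
    out = np.zeros((n, n))
    row = 0
    for b in blocks:
        m = b.shape[0]
        out[row:row + m, row:row + m] = b
        row += m
    return out

# K = 2 components in D = 2 dimensions, so each block has 2^D = 4 rows
# and the assembled matrix has K * 2^D = 8 rows.
A1 = np.eye(4)
A2 = 2.0 * np.eye(4)
A = assemble_block_diagonal([A1, A2])
```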
    During the actual solver step, we run through each equation one
    after another, fetching the appropriate mass and assembly matrices
    for the equation in hand.

    At present, only one equation is supported; it has not yet been decided
    how to plot the output of multiple equations.

    # Data held on the vertices

    ## Solution (formerly named Value)

    This array holds the actual solution, which we later plot. It is updated
    when we touch a vertex for the last time.

    ## Rhs

    This array holds the right-hand side of the equation. We assign this in
    experiment-specific implementation files.

    ## Diagonal

    This array holds the sum of the diagonals of the local assembly matrices.
    It is set to 0 at the start of every solver step.

    ## Residual

    This array is used to store \f$ Mb - Ax \f$, i.e. the right-hand side
    multiplied by the mass matrix, minus the solution multiplied by the
    assembly matrix.
    # Solver Steps

    ## Initialisation

    After the grid is created, we run through the mesh and assign types to
    the vertices and cells.

    Vertices:
    - "Coarse" if the vertex is due to be refined
    - "Interior" if it is not on the boundary
    - "Boundary" if it is on the boundary

    Cells:
    - "Coarse" if the cell is due to be refined
    - "Interior" otherwise

    ## Solve

    There is only one action set in this solver. We do something non-trivial
    during touchCellFirstTime, touchVertexFirstTime and touchVertexLastTime.

    ### touchVertexFirstTime

    We check that the vertex is not a coarse grid vertex, then we set the
    residuals and the diagonals to 0.

    ### touchCellFirstTime

    All \f$ 2^D \f$ vertices of the cell are in scope here. We create three
    vectors, each of length \f$ 2^D \f$, to hold one value per vertex. For
    each vertex in the cell (regardless of whether it is on the boundary), we:
    - update the residual to be \f$ Mb - Ax \f$
    - increment the diagonal

    ### touchVertexLastTime

    We update the solution value (on all vertices) to be equal to the
    solution at the previous sweep, plus
    \f$ \omega \times r / d \f$,
    where \f$ r \f$ and \f$ d \f$ stand for the residual and diagonal
    respectively.

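Taken together, these steps amount to a damped point-Jacobi sweep. As a sketch (standalone NumPy on a single linear system, not the actual Peano mesh traversal; the matrices are hypothetical):

```python
import numpy as np

def jacobi_sweep(A, M, b, u, omega=0.6):
    # touchCellFirstTime: accumulate the residual r = Mb - Ax and the diagonal d.
    r = M @ b - A @ u
    d = np.diag(A)
    # touchVertexLastTime: u <- u + omega * r / d
    return u + omega * r / d

# Hypothetical 2x2 system for illustration.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
M = np.eye(2)
b = np.array([1.0, 2.0])
u = np.zeros(2)
for _ in range(50):
    u = jacobi_sweep(A, M, b, u)
# u now solves A u = M b to high accuracy
```

The default `omega=0.6` here mirrors the solver's `smoother_relaxation` default.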
    # Enforcement of boundary conditions


    # Todos

    We currently overcount the residual on vertices that are shared across
    boundaries. This is not an issue at present, since we only check for the
    max, which is unaffected by this bug. But it is something to fix at a
    later date.

    """

    def __init__(self,
                 name,
                 unknowns_per_vertex,
                 dimensions,
                 min_h,
                 max_h,
                 local_assembly_matrix,
                 local_assembly_matrix_scaling,
                 mass_matrix,
                 mass_matrix_scaling,
                 solver_tolerance,
                 smoother_relaxation = 0.6,
                 custom_init_actionset = None,
                 custom_plot_actionset = None,  # a list of tuples: (variable name, getter)
                 custom_init_vertex = "",
                 custom_boundary_conditions = "",
                 enable_plotting = True,  # sometimes we want to disable standard plotting
                 plot_predicate = "true"  # enable custom plotting behaviour
                 ):
        """!

        Collocated low-order (d-linear) Finite Elements

        """
        super( CollocatedDLinearDiscretisationWithPointJacobi, self ).__init__( name,
                                                                                min_h,
                                                                                max_h,
                                                                                )

        matrix_dims = (2**dimensions) * unknowns_per_vertex

        # already initialised this in superclass!
        self._vertex_data.data.add_attribute( peano4.dastgen2.Peano4DoubleArray( "u", str(unknowns_per_vertex) ) )
        self._vertex_data.data.add_attribute( peano4.dastgen2.Peano4DoubleArray( "rhs", str(unknowns_per_vertex) ) )
        self._vertex_data.data.add_attribute( peano4.dastgen2.Peano4DoubleArray( "diag", str(unknowns_per_vertex) ) )
        self._vertex_data.data.add_attribute( peano4.dastgen2.Peano4DoubleArray( "residual", str(unknowns_per_vertex) ) )
        self._vertex_data.peano4_mpi_and_storage_aspect.merge_implementation = self.get_vertex_merge_method()

        self._unknowns_per_vertex_node = unknowns_per_vertex

        self._local_assembly_matrix = local_assembly_matrix
        self._local_assembly_matrix_scaling = local_assembly_matrix_scaling

        self._mass_matrix = mass_matrix
        self._mass_matrix_scaling = mass_matrix_scaling
        self._solver_tolerance = solver_tolerance

        self._smoother_relaxation = smoother_relaxation

        self._custom_init_actionset = custom_init_actionset
        self._custom_plot_actionset = custom_plot_actionset

        self._custom_init_vertex = custom_init_vertex
        self._custom_boundary_conditions = custom_boundary_conditions

        self._enable_plotting = enable_plotting
        self._plot_predicate = plot_predicate

    def add_to_Peano4_datamodel( self, datamodel, verbose ):
        super( CollocatedDLinearDiscretisationWithPointJacobi, self ).add_to_Peano4_datamodel(datamodel,verbose)

    def add_use_statements(self, observer):
        super( CollocatedDLinearDiscretisationWithPointJacobi, self ).add_use_statements(observer)

    def add_to_plot(self, observer):
        """!

        Tell the project's observer how to plot the data

        Nothing fancy here. We add plotters from Peano's toolbox to visualise
        solution and right-hand side.

        """
        if self._enable_plotting:  # true by default
            observer.add_action_set( PlotVertexDataInPeanoBlockFormat(self, "u", "getU", plot_predicate=self._plot_predicate) )
            observer.add_action_set( PlotVertexDataInPeanoBlockFormat(self, "rhs", "getRhs", plot_predicate=self._plot_predicate) )

        # guard against the default of None before iterating
        if self._custom_plot_actionset is not None:
            for name, getter in self._custom_plot_actionset:
                observer.add_action_set( PlotVertexDataInPeanoBlockFormat(self, name, getter, plot_predicate=self._plot_predicate) )

        pass

    def add_to_create_grid(self, observer):
        observer.add_action_set(peano4.toolbox.CreateRegularGrid(self.max_h))
        pass

    def add_to_init_solution(self, observer):
        """!

        Solution initialisation. We can either use the standard
        initialisation procedure or pass in another one when we
        instantiate this solver class.

        """
        if self._custom_init_actionset is None:
            observer.add_action_set( InitDofsCollocated(self) )
        else:
            observer.add_action_set( self._custom_init_actionset(self) )
        pass

    def add_to_solve(self, observer):
        descend_invocation_order = observer.highest_descend_invocation_order() + self._basic_descend_order + 1
        observer.add_action_set( UpdateSolution( self, descend_invocation_order+0, False) )
        observer.add_action_set( ResetAndUpdateResidual(self, descend_invocation_order+1, False) )
        pass

    def add_implementation_files_to_project(self, namespace, output, subdirectory="", abstract_overwrite=True):
        """

        The solver creates two classes: an abstract base class and its
        implementation. The abstract base class will be overwritten if
        there is one in the directory. I pipe all the Python constants
        into this base class, so they are available in the C++ code.

        """
        templatefile_prefix = os.path.dirname( os.path.realpath(__file__) ) + "/templates/"
        templatefile_prefix += self.__class__.__name__

        if subdirectory:
            subdirectory += "/"

        dictionary = {
            "SOLVER_INCLUDES": "",
            "MIN_H": self.min_h,
            "MAX_H": self.max_h,
            "LOCAL_ASSEMBLY_MATRIX": self._local_assembly_matrix,
            "LOCAL_ASSEMBLY_MATRIX_SCALING": self._local_assembly_matrix_scaling,
            "MASS_MATRIX": self._mass_matrix,
            "MASS_MATRIX_SCALING": self._mass_matrix_scaling,
            "VERTEX_CARDINALITY": self._unknowns_per_vertex_node,
            "SOLVER_NAME": self.typename(),
            "SOLVER_TOLERANCE": self._solver_tolerance,
            "SMOOTHER_RELAXATION": self._smoother_relaxation,
            "CUSTOM_INIT_VERTEX": self._custom_init_vertex,
            "CUSTOM_BOUNDARY_CONDITIONS": self._custom_boundary_conditions,
        }

        generated_abstract_header_file = peano4.output.Jinja2TemplatedHeaderImplementationFilePair(
            templatefile_prefix + ".Abstract.template.h",
            templatefile_prefix + ".Abstract.template.cpp",
            "Abstract" + self.typename(),
            namespace,
            subdirectory + ".",
            dictionary,
            abstract_overwrite,
            True)

        generated_solver_files = peano4.output.Jinja2TemplatedHeaderImplementationFilePair(
            templatefile_prefix + ".template.h",
            templatefile_prefix + ".template.cpp",
            self.typename(),
            namespace,
            subdirectory + ".",
            dictionary,
            False,
            True)

        output.add( generated_abstract_header_file )
        output.add( generated_solver_files )
        output.makefile.add_h_file( subdirectory + "Abstract" + self._name + ".h", generated=True )
        output.makefile.add_h_file( subdirectory + self._name + ".h", generated=True )
        output.makefile.add_cpp_file( subdirectory + "Abstract" + self._name + ".cpp", generated=True )
        output.makefile.add_cpp_file( subdirectory + self._name + ".cpp", generated=True )

    def get_vertex_merge_method(self):
        d = {}
        d["SOLVER_NAME"] = self.typename()

        output = """
if( !marker.willBeRefined() ) {

  // naive += with the diag
  setDiag( getDiag() + neighbour.getDiag() );

  // ditto for the residual
  setResidual( getResidual() + neighbour.getResidual() );
}
"""

        return output