Peano 4
peano4::grid::TraversalVTKPlotter Class Reference

Observer which pipes the automaton transitions into a VTK file. More...

#include <TraversalVTKPlotter.h>

Inheritance diagram for peano4::grid::TraversalVTKPlotter:
Collaboration diagram for peano4::grid::TraversalVTKPlotter:

Public Member Functions

 TraversalVTKPlotter (const std::string &filename, int treeId=-1)
 You have to invoke startNewSnapshot() after you have created this observer in the main code if you want a pvd file.
 
virtual ~TraversalVTKPlotter ()
 
virtual void beginTraversal (const tarch::la::Vector< Dimensions, double > &x, const tarch::la::Vector< Dimensions, double > &h) override
 Begin the traversal.
 
virtual void endTraversal (const tarch::la::Vector< Dimensions, double > &x, const tarch::la::Vector< Dimensions, double > &h) override
 
virtual void loadCell (const GridTraversalEvent &event) override
 
virtual void storeCell (const GridTraversalEvent &event) override
 
virtual void enterCell (const GridTraversalEvent &event) override
 Event is invoked per cell.
 
virtual void leaveCell (const GridTraversalEvent &event) override
 
virtual TraversalObserver * clone (int spacetreeId) override
 
virtual std::vector< GridControlEvent > getGridControlEvents () const override
 Obviously empty for this particular observer.
 
- Public Member Functions inherited from peano4::grid::TraversalObserver
virtual ~TraversalObserver ()
 
virtual void exchangeAllVerticalDataExchangeStacks (int masterId)
 Send local data from top level of local mesh to master and receive its top-down information in return.
 
virtual void exchangeAllHorizontalDataExchangeStacks (bool symmetricDataCardinality)
 Exchange all the data along the domain boundaries.
 
virtual void exchangeAllPeriodicBoundaryDataStacks ()
 Exchange all periodic boundary data.
 
virtual void streamDataFromSplittingTreeToNewTree (int newWorker)
 Stream data from current tree on which this routine is called to the new worker.
 
virtual void streamDataFromJoiningTreeToMasterTree (int masterId)
 
virtual void finishAllOutstandingSendsAndReceives ()
 Wrap up all sends and receives, i.e.
 
virtual void sendVertex (int position, int toStack, SendReceiveContext context, const GridTraversalEvent &event)
 
virtual void sendFace (int position, int toStack, SendReceiveContext context, const GridTraversalEvent &event)
 
virtual void sendCell (int toStack, SendReceiveContext context, const GridTraversalEvent &event)
 
virtual void receiveAndMergeVertex (int position, int fromStack, SendReceiveContext context, const GridTraversalEvent &event)
 
virtual void receiveAndMergeFace (int position, int fromStack, SendReceiveContext context, const GridTraversalEvent &event)
 
virtual void receiveAndMergeCell (int fromStack, SendReceiveContext context, const GridTraversalEvent &event)
 
virtual void deleteAllStacks ()
 

Protected Member Functions

void plotCell (const GridTraversalEvent &event)
 Does the actual plotting, i.e.
 

Protected Attributes

const std::string _filename
 
const int _spacetreeId
 

Static Protected Attributes

static tarch::logging::Log _log
 

Private Attributes

tarch::plotter::griddata::unstructured::vtk::VTUTextFileWriter * _writer
 
tarch::plotter::griddata::unstructured::UnstructuredGridWriter::VertexWriter * _vertexWriter
 
tarch::plotter::griddata::unstructured::UnstructuredGridWriter::CellWriter * _cellWriter
 
tarch::plotter::griddata::unstructured::UnstructuredGridWriter::CellDataWriter * _spacetreeIdWriter
 
tarch::plotter::griddata::unstructured::UnstructuredGridWriter::CellDataWriter * _coreWriter
 

Static Private Attributes

static tarch::mpi::BooleanSemaphore _sempahore
 

Additional Inherited Members

- Public Types inherited from peano4::grid::TraversalObserver
enum class  SendReceiveContext {
  BoundaryExchange , MultiscaleExchange , ForkDomain , JoinDomain ,
  PeriodicBoundaryDataSwap
}
 There are three different scenarios when we merge data: More...
 
- Static Public Attributes inherited from peano4::grid::TraversalObserver
static constexpr int NoRebalancing = -1
 
static constexpr int NoData = -1
 Can this grid entity hold data.
 
static constexpr int CreateOrDestroyPersistentGridEntity = -2
 Implies that the data will then be local or had been local.
 
static constexpr int CreateOrDestroyHangingGridEntity = -3
 Implies that the data will then be local or had been local.
 

Detailed Description

Observer which pipes the automaton transitions into a VTK file.

While we use the up-to-date VTK format, the observer plots the whole thing as a discontinuous unstructured mesh. It is not particularly sophisticated.

The plotter can write whole time series. For this, you have to invoke startNewSnapshot() prior to each plot. startNewSnapshot() also ensures that parallel plots in an MPI environment work.

Parallel plotting

Each tree dumps its own vtk file. That is, each thread and each rank may in principle write its file in parallel to all the others. VTK/VTU lets us define a metafile (pvtu) which collates the various dumps. As we create one observer per thread through clone(), every thread on every rank has its own instance and pipes its data into its own file. getFilename() ensures that no file is overwritten: it combines the tree number with the static counter _counter, which is incremented through endTraversalOnRank(). I expect the user to call that routine once after each traversal.

Known bugs

As the MPI domain decomposition creates fake observers for the master of a local rank when that rank is created, we end up with multiple entries for forking ranks in the meta file.

Definition at line 48 of file TraversalVTKPlotter.h.

Constructor & Destructor Documentation

◆ TraversalVTKPlotter()

peano4::grid::TraversalVTKPlotter::TraversalVTKPlotter ( const std::string & filename,
int treeId = -1 )

You have to invoke startNewSnapshot() after you have created this observer in the main code if you want a pvd file.

If this observer is run on the global master,

Definition at line 22 of file TraversalVTKPlotter.cpp.

◆ ~TraversalVTKPlotter()

peano4::grid::TraversalVTKPlotter::~TraversalVTKPlotter ( )
virtual

Definition at line 32 of file TraversalVTKPlotter.cpp.

Member Function Documentation

◆ beginTraversal()

void peano4::grid::TraversalVTKPlotter::beginTraversal ( const tarch::la::Vector< Dimensions, double > & x,
const tarch::la::Vector< Dimensions, double > & h )
overridevirtual

Begin the traversal.

This routine is called per spacetree instance, i.e. per subtree (thread) per rank. Within the usual implementation, everything will reside on the call stack anyway. If the routine is called on tree 0, this operation has to establish the master data of the global root tree, i.e. ensure that the data of level -1 is technically present for the subsequent enterCell event, even though this data is ill-defined.

Parameters
x	Root cell coordinates
h	Root cell size

Implements peano4::grid::TraversalObserver.

Definition at line 34 of file TraversalVTKPlotter.cpp.

References tarch::mpi::Rank::getInstance(), tarch::mpi::Rank::getRank(), and tarch::plotter::PVDTimeSeriesWriter::NoIndexFile.


◆ clone()

peano4::grid::TraversalObserver * peano4::grid::TraversalVTKPlotter::clone ( int spacetreeId)
overridevirtual
I use clone() to create one observer object per traversal thread. So between different spacetrees of one spacetree set, there can be no race condition. Yet, clone() itself could be called in parallel.

Global per-sweep actions

If you want to implement an operation once per sweep in a parallel environment, then you can exploit the fact that the spacetree set also creates an observer for the global master thread, i.e. tree 0. So if you add a statement like

if (peano4::parallel::Node::isGlobalMaster(spacetreeId)) { ... }

then you can be sure that the branch body is executed only once globally per grid sweep.

The counterpart of the clone operation is the destructor.

Implements peano4::grid::TraversalObserver.

Definition at line 142 of file TraversalVTKPlotter.cpp.

◆ endTraversal()

void peano4::grid::TraversalVTKPlotter::endTraversal ( const tarch::la::Vector< Dimensions, double > & x,
const tarch::la::Vector< Dimensions, double > & h )
overridevirtual
See also
beginTraversal()

Implements peano4::grid::TraversalObserver.

Definition at line 63 of file TraversalVTKPlotter.cpp.

References assertion.

◆ enterCell()

void peano4::grid::TraversalVTKPlotter::enterCell ( const GridTraversalEvent & event)
overridevirtual

Event is invoked per cell.

It is however not called for the root cell, i.e. for the cell with level 0 that does not have a parent.

Implements peano4::grid::TraversalObserver.

Definition at line 100 of file TraversalVTKPlotter.cpp.

◆ getGridControlEvents()

std::vector< peano4::grid::GridControlEvent > peano4::grid::TraversalVTKPlotter::getGridControlEvents ( ) const
overridevirtual

Obviously empty for this particular observer.

Implements peano4::grid::TraversalObserver.

Definition at line 149 of file TraversalVTKPlotter.cpp.

◆ leaveCell()

void peano4::grid::TraversalVTKPlotter::leaveCell ( const GridTraversalEvent & event)
overridevirtual

Implements peano4::grid::TraversalObserver.

Definition at line 138 of file TraversalVTKPlotter.cpp.

◆ loadCell()

void peano4::grid::TraversalVTKPlotter::loadCell ( const GridTraversalEvent & event)
overridevirtual

Implements peano4::grid::TraversalObserver.

Definition at line 92 of file TraversalVTKPlotter.cpp.

◆ plotCell()

void peano4::grid::TraversalVTKPlotter::plotCell ( const GridTraversalEvent & event)
protected

Does the actual plotting, i.e.

all checks and decision making have already been done beforehand.

Definition at line 109 of file TraversalVTKPlotter.cpp.

References assertion, dfor2, enddforx, peano4::grid::GridTraversalEvent::getH(), tarch::multicore::Core::getInstance(), peano4::grid::GridTraversalEvent::getX(), k, logError, tarch::la::multiplyComponents(), and TwoPowerD.


◆ storeCell()

void peano4::grid::TraversalVTKPlotter::storeCell ( const GridTraversalEvent & event)
overridevirtual

Implements peano4::grid::TraversalObserver.

Definition at line 96 of file TraversalVTKPlotter.cpp.

Field Documentation

◆ _cellWriter

tarch::plotter::griddata::unstructured::UnstructuredGridWriter::CellWriter* peano4::grid::TraversalVTKPlotter::_cellWriter
private

Definition at line 65 of file TraversalVTKPlotter.h.

◆ _coreWriter

tarch::plotter::griddata::unstructured::UnstructuredGridWriter::CellDataWriter* peano4::grid::TraversalVTKPlotter::_coreWriter
private

Definition at line 67 of file TraversalVTKPlotter.h.

◆ _filename

◆ _log

tarch::logging::Log peano4::grid::TraversalVTKPlotter::_log
staticprotected

Definition at line 50 of file TraversalVTKPlotter.h.

◆ _sempahore

tarch::mpi::BooleanSemaphore peano4::grid::TraversalVTKPlotter::_sempahore
staticprivate

Definition at line 69 of file TraversalVTKPlotter.h.

◆ _spacetreeId

const int peano4::grid::TraversalVTKPlotter::_spacetreeId
protected

Definition at line 53 of file TraversalVTKPlotter.h.

◆ _spacetreeIdWriter

tarch::plotter::griddata::unstructured::UnstructuredGridWriter::CellDataWriter* peano4::grid::TraversalVTKPlotter::_spacetreeIdWriter
private

Definition at line 66 of file TraversalVTKPlotter.h.

◆ _vertexWriter

tarch::plotter::griddata::unstructured::UnstructuredGridWriter::VertexWriter* peano4::grid::TraversalVTKPlotter::_vertexWriter
private

Definition at line 64 of file TraversalVTKPlotter.h.

◆ _writer

tarch::plotter::griddata::unstructured::vtk::VTUTextFileWriter* peano4::grid::TraversalVTKPlotter::_writer
private

Definition at line 63 of file TraversalVTKPlotter.h.


The documentation for this class was generated from the following files: TraversalVTKPlotter.h, TraversalVTKPlotter.cpp