Singularity Structure Simplification of Hexahedral Mesh via Weighted Ranking

Gang Xu a,∗, Ran Ling a, Yongjie Jessica Zhang c, Zhoufang Xiao a, Zhongping Ji a, Timon Rabczuk d

a School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China
b Key Laboratory of Complex Systems Modeling and Simulation, Ministry of Education, Hangzhou 310018, China
c Department of Mechanical Engineering, Carnegie Mellon University, USA
d Institute of Structural Mechanics, Bauhaus-Universität Weimar, Germany
Abstract
In this paper, we propose an improved singularity structure simplification method for hexahedral (hex) meshes using a weighted ranking approach. In previous work, the selection of to-be-collapsed base complex sheets/chords is based only on their thickness, which introduces a few closed loops and causes an early termination of simplification and a slow convergence rate. In this paper, a new weighted ranking function is proposed by combining the valence prediction function of the local singularity structure, the shape quality metric of elements and the width of base complex sheets/chords. Adaptive refinement and local optimization are also introduced to improve the uniformity and aspect ratio of mesh elements. Compared to thickness ranking methods, our weighted ranking approach can yield a simpler singularity structure with fewer base-complex components, while achieving a comparable Hausdorff distance ratio and better mesh quality. Comparisons on a hex-mesh dataset are performed to demonstrate the effectiveness of the proposed method.
Keywords: hex-mesh, singularity structure simplification, weighted ranking, uniformity, base complex
1. Introduction
In recent years, the application of hexahedral (hex) meshes in finite element and isogeometric analysis has become increasingly widespread because of their good numerical performance, small storage requirements, and natural suitability for constructing tensor-product splines. However, hex-mesh generation is not yet mature, and it cannot be guaranteed that a good-quality initial mesh can be generated in all cases. For complex shapes and structural models, octree-based mesh generation methods were proposed [1, 2]. These methods are efficient and robust, and they can ensure a topologically valid and well-formed meshing result. However, they generate a large number of cells and too many singularities. In some scenarios, we do not need a dense mesh and complicated interior structures. Meshes with simple structure and fewer singularities are more conducive to accelerating computation and convergence speed [3]. Therefore, it is very important to propose an effective singularity structure simplification method for hex-meshes.

Some research work has contributed to this topic in the past 10 years. In [4], an adaptive hex-mesh localization method was proposed. Topological operations such as collapsing and pillowing are used to process local regions, and localized coarsening is achieved while maintaining the topological connectivity and shape of the input mesh, which provides a basic idea of hex-mesh coarsening. In [5], the mesh structure is simplified according to the reparameterization requirements, and singularity is effectively reduced while maintaining the number of mesh elements. Template matching is used to split patches and eliminate the leading blocks. However, its implementation is very limited and not robust: it cannot simplify self-interleaved and closed loops, resulting in poor results on input meshes obtained from octree-based methods. In [6], a robust hex-mesh structure simplification method was proposed. It is possible that a feasible solution with a simpler and coarser structure exists, but the algorithm might fail to find it. In particular, the ranking method for the selection of to-be-collapsed base complex sheets/chords is based only on thickness, and it cannot guarantee to remove most of the singular structures. It will also introduce a few closed loops and terminate the simplification process prematurely. For an initial hex-mesh with many singular vertices, a proper priority ranking algorithm is needed to guide the simplification of the singularity structure. Moreover, a local parameterization is also needed to improve the mesh quality and repair the topology structure after simplification.

∗ Corresponding author. Email: [email protected].
Preprint submitted to XXX, January 4, 2019. arXiv [cs.CG].

In this paper, we propose an improved singularity structure simplification method for hex-meshes. The main contributions can be summarized as follows:

• A new weighted ranking approach for singularity structure simplification is proposed by combining the valence prediction function of the local singularity structure, the shape quality metric of elements and the width of base complex sheets/chords.

• A local optimization for SLIM [7] is proposed to improve the uniformity of hex-elements while maintaining the element quality.

• An adaptive sheet refinement method is proposed to preserve surface features while maintaining a similar number of hex-elements.

Based on these improvements, the proposed weighted ranking method can achieve a smaller number of singularities with a comparable Hausdorff distance ratio, effectively remove kinks in the hex-mesh, and yield better mesh quality compared to the thickness ranking method [6].

The remainder of the paper is structured as follows. A review of related hex-mesh generation and mesh simplification is presented in Section 2. Basic concepts and a framework overview are described in Section 3.
Section 4 presents the sheet and chord collapsing operations on the base-complex. The proposed weighted ranking approach is described in Section 5. Adaptive sheet refinement is presented in Section 6. In Section 7, the experimental results are illustrated. Finally, the paper is concluded and future work is outlined in Section 8.
2. Related Work

Hexahedral mesh generation. Hex meshing has been widely studied for decades. However, an automatic method that can generate high-quality hex-meshes for any complex geometry is still unavailable because of the strong topological constraints [8], i.e., the dual chord and the dual sheet. Unlike in tetrahedral meshes, any local change in the mesh propagates to the whole mesh through dual chords or dual sheets [8], which makes hex-mesh generation a very challenging task. Some methods were devised for specific types of geometries. For example, the mapping method is preferable for mappable geometries, while the sweeping method [9] is often used for swept volumes. Combined with domain partition, these can be applied to complex geometries [9, 10]. Based on the idea of paving, several geometric and topological approaches have been proposed for all-hex meshing. Plastering [11] and H-Morph [12] generate layers of hex elements in geometric ways, whereas the whisker weaving method [13, 14] uses the spatial twist continuum and generates the topological dual of a hex-mesh. Unconstrained plastering [15] extends plastering: different from other paving methods, it propagates the original geometry boundary, instead of a pre-meshed boundary, into the interior domain, and hex elements are generated when three propagating fronts intersect. The octree-based approach [16] is very robust and can be executed in a highly automatic way; however, it yields poor-quality elements near the boundary, and the final mesh heavily relies on the orientation of the coordinate system. The polycube-based meshing approach uses a low-distortion mapping between the input model and a polycube, and computes the corresponding volumetric mappings. Deformation methods were introduced for polycube construction [17, 18, 19, 20], and frame fields were proposed to guide the polycube construction [21, 22]. In [23], Nieser et al.
compute a global parameterization of the volume on the basis of a frame field to construct hex-meshes. Theoretical conditions on singularities and the gradient frame field are derived to avoid a degenerate parameterization, and badly placed singularities can lead to distortion. Based on a spherical harmonics representation, Huang et al. [24] generated a boundary-aligned smooth frame field by minimizing an energy function. Though impressive results were obtained with the frame-field based approaches, further efforts are still needed for practical use.
Mesh simplification. Mesh simplification generally reduces the number of elements while preserving the appearance of the original mesh by performing local coarsening operations. Triangular elements can be coarsened by combining the edge flipping operation with a local MSL form of the minimum energy function. This approach was also applied to hierarchical mesh generation with step-by-step simplification. For quadrilateral and hex-mesh simplification, similar local operations were also proposed [25, 26]. Sheets and chords are extracted from the inherent dual structure, and local operations are applied to simplify these objects [5, 6]. Recent progress in structure simplification has achieved great success in polycube simplification [27] and hex-mesh optimization [28]. In [27], the singularity misalignment problem was solved directly in the polycube space, and a corner optimization strategy was introduced to produce coarser block-structured surface and volumetric meshes; moreover, the induced meshes are well suited for spline fitting. Topology control operations in hex-mesh simplification can also be applied to adjust low-quality mesh elements. In [28], an adjustment strategy for repairing inverted elements was proposed by combining basic mesh editing operations with frame field optimization. Based on the singularity structure of the mesh, a base-complex block structure is extracted in [6]; a simplification operation is then performed to collapse base complex sheets and chords while redistributing the distortion based on a volumetric parametrization. However, the selection of the base complex sheets/chords to be collapsed is based only on their thickness, which introduces a few closed loops, causes an early termination of simplification and a slow convergence rate. In this paper, a new weighted ranking function is proposed by combining the valence prediction function of the local singularity structure, the shape quality metric of elements and the width of base complex sheets/chords.
3. Basic concepts and framework overview
The proposed hex-mesh simplification can effectively reduce the singularity structure while maintaining a specified number of elements. In this section, we briefly introduce the definitions of the singularity structure, the base-complex, and two types of structure called the base-complex sheet and the base-complex chord.

Base-complex. The valence of a vertex, edge or face is defined as the number of its neighboring hex elements. A vertex is said to be regular if its valence is four on the boundary or eight in the interior. Similarly, an edge is regular when its valence is two on the boundary or four in the interior. A series of connected irregular edges with the same valence composes a singular edge, and its two ending vertices are called singular vertices, except in the case of closed singular edges. The singularity structure is composed of these singular edges and singular vertices. According to the above definitions, we can extract the singularity structure of a hex-mesh. Each singular edge with a valence of n can be extended to n segmented surfaces, and a valid manifold hex-mesh can be divided into cube-like components by these segmented surfaces (refer to [5] for more details). A segmented structure called the base-complex can be extracted in this way. The base-complex of the hex-mesh H is denoted as B = (B_V, B_E, B_F, B_C), where B_C is the set of cube-like components (composed of hex elements), B_V is the set of the 8 corners of each cube-like component, B_E is the set of base-complex edges (each a series of connected edges between two base-complex vertices), and B_F contains the base-complex faces of each component.

The base-complex sheet and base-complex chord can be extracted based on the base-complex structure. Each of these components aligns with its adjacent components with C0 continuity, and its singularities are located at its eight corners and along its three groups of four topologically parallel base-complex edges.
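As a concrete illustration of the valence definitions above, consider the following sketch. The helper names are hypothetical, and it assumes hexahedra are given as 8-tuples of vertex ids in the conventional bottom-face/top-face corner ordering; it counts the hexahedra sharing each edge and tests an edge for singularity:

```python
from collections import defaultdict

# The 12 edges of a hexahedron, for corners ordered as
# bottom face 0-1-2-3 and top face 4-5-6-7 (an assumed convention).
HEX_EDGES = [(0, 1), (1, 2), (2, 3), (3, 0),
             (4, 5), (5, 6), (6, 7), (7, 4),
             (0, 4), (1, 5), (2, 6), (3, 7)]

def edge_valences(hexes):
    """Valence of an edge = number of neighboring hex elements."""
    valence = defaultdict(int)
    for cell in hexes:
        for a, b in HEX_EDGES:
            edge = tuple(sorted((cell[a], cell[b])))
            valence[edge] += 1
    return valence

def is_singular(edge, valence, on_boundary):
    """A boundary edge is regular with valence 2, an interior edge with valence 4."""
    regular = 2 if on_boundary(edge) else 4
    return valence[edge] != regular
```

For two hexahedra stacked on top of each other, the four shared-face edges get valence 2 while all the others get valence 1, so on the boundary of that tiny mesh only the valence-1 edges are singular.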
Removing components can effectively simplify the singularity structure by collapsing base-complex sheets and chords. A base-complex sheet S consists of three parts: the left surface F_L (or the right surface F_R) contains all base-complex vertices, edges and faces in the boundary of the left (or right) part, and the middle volume E_M contains the base-complex edges whose two end nodes lie on F_L and F_R respectively. Topology elements in F_L and F_R form element groups. A base-complex chord has a similar definition, in which the two sides follow the main diagonal direction. Fig. 1 shows the structure of the base-complex sheet and the base-complex chord.

Framework overview. As shown in Algorithm 1, we propose an improved singularity structure simplification method for hex-meshes that maintains the shape boundary and the target number of elements.
Figure 1: (a) The base-complex sheet (green elements) consists of the left surface F_L, the right surface F_R and the middle volume E_M, with the edge pair (yellow edges) and the vertex pair (red dots) shown in (b). (c) The green elements form a base-complex chord, where F_L and F_R in (d) can be determined from the main diagonal direction.

Algorithm 1
Framework of singularity structure simplification
Input:
A hex-mesh M; the target number of mesh elements N_c; the target reduction ratio of components N_s; the current number of elements n_s.
Output:
A hex mesh with a simplified base-complex, m_out.

Step 1. Extract the base-complex structure B = (B_V, B_E, B_F, B_C) from M, with a secondary detection pass until no irregular component is found.
Step 2. Extract all base-complex sheets and chords that satisfy the filtering criteria, and push them into two priority queues S_sheet and S_chord separately; the queue lengths are k_s and k_c.
Step 3. Find the top-ranked base-complex sheet and ⌊k_c/k_s⌋ (capped at 3) base-complex chords to remove; when n_s < N_s, go to Step 5.
Step 4. Remove the sheet/chord using the local parametrization, with local regularization smoothing in the local step. If a valid parameterization mapping is not found or the quality metric is below the threshold, try the next sheet/chord until a successful operation is performed; otherwise, go to Step 5. An adaptive refinement is performed when the Hausdorff distance ratio exceeds the user-specified threshold r_h.
Step 5. If the specified threshold N_s is not satisfied, go back to Step 1; when the number of elements is smaller than N_c, perform adaptive refinement.
Step 6. After finishing the simplification process, perform a global optimization operation and return m_out.

Comparisons with experimental data show that the collapsing order of base-complex sheets and chords has a significant effect on the final simplification results. Hence, we propose an optimized weighted ranking approach for component removal based on an analysis of edge valences. All base-complex sheets/chords are ranked by the valence error obtained from minimizing an objective function of the singularity structure. With the proposed method, the singularity structure complexity of a hex-mesh decreases rapidly. Furthermore, a few closed loops and entangled sheets can be effectively eliminated, leading to a high simplification rate. In addition, two extra ranking terms are adopted to maintain the element quality and the shape boundary. During simplification, sheet refinement is performed to keep the number of elements close to the target number N_c. We propose an adaptive sheet refinement method based on the point-sampled Hausdorff distance on the surface, which can improve the hex-element uniformity and reduce the error between the input and output hex-mesh geometry. To locally improve the uniformity and aspect ratio, we also propose a local regularization optimization in the parametrization for sheet/chord collapsing.
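The candidate selection of the framework (top-ranked sheet plus a capped number of chords) can be sketched as follows. The function and score tuples are hypothetical; we assume each queue entry is a (ranking energy, id) pair with lower energy meaning higher collapsing priority:

```python
import heapq

def pick_candidates(sheet_scores, chord_scores):
    """Select the top-ranked sheet plus floor(k_c / k_s) chords,
    capping the chord count at 3, mirroring Step 3 of Algorithm 1."""
    k_s, k_c = len(sheet_scores), len(chord_scores)
    n_chords = min(3, k_c // max(k_s, 1))
    best_sheets = heapq.nsmallest(1, sheet_scores)
    best_chords = heapq.nsmallest(n_chords, chord_scores)
    return ([sid for _, sid in best_sheets],
            [cid for _, cid in best_chords])
```

Keeping the candidates in priority queues means a failed collapse (Step 4) can simply fall through to the next-ranked entry.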
4. Coarsening operators on hex-meshes
In this section, we introduce two local coarsening operations on hex-meshes: the base-complex sheet collapsing operation and the base-complex chord collapsing operation, two generalized concepts for reducing the singularity structure complexity of hex-meshes. The base-complex sheet collapsing operation mainly changes singularities globally and has a bigger impact on the boundary shape. The base-complex chord collapsing operation is used to optimize the local singularity structure, especially for removing edge pairs with a valence of 3 ∼ 5. These two operations may introduce non-manifold and doublet configurations, as shown in Fig. 2. Moreover, the collapsing operations may lead to higher local complexity, which should be prevented. Hence, several filtering criteria are proposed to avoid these problematic cases.
Figure 2: Left: the base-complex sheet collapsing operation and its 2D degenerate cases; Right: the base-complex chord collapsing operation and its 2D degenerate cases. The red components may change the edge valence.

4.1. Base-complex sheet collapsing operation
A base-complex sheet collapsing operation similar to [6] is adopted here. Both sides of a sheet can be found from its components; we then remove the middle part of the base-complex sheet and preserve the side of F_L or F_R. Finally, parametrization is employed to relocate the vertices within the β-ring neighborhood region (β is set to 4 as in [6]). Before sheet collapsing, several filtering criteria are used to decide whether a sheet should be put into the priority queue.

Valence prediction. Edge pairs in F_L and F_R are collapsed into single edges, and the corresponding edge valences may change. Generally, the valence of an inner edge must be greater than two; otherwise, the adjacent elements will degenerate or form a doublet configuration (two hexahedra sharing two or more faces, as in Fig. 2), which is forbidden in our framework. For an edge pair e_l and e_r in a non-self-intersecting sheet, the valence of the new edge e_n can be computed as

S_after(e_l, e_r) = v(e_l) + v(e_r) − 4, if P(e_l, e_r) = 0;  v(e_l) + v(e_r) − 2, if P(e_l, e_r) = 1,   (1)

where v(e) is the valence of a base-complex edge. The base-complex face directly connecting e_l and e_r is either on the boundary or in the interior of the hex-mesh; P(e_l, e_r) = 1 when it is on the boundary, and P(e_l, e_r) = 0 otherwise.

Boundary shape. The feature vertices/lines are extracted in the initialization stage; in order to preserve sharp features, sheets and chords containing sharp feature vertices are not allowed to be removed. Moreover, a base-complex sheet is not collapsed when feature edges lie on its base-complex edges. In the collapsing operation, we proceed in a way similar to hex-mesh sheet collapsing. Firstly, we find all elements on both sides and choose temporary positions for the vertex pairs. The topology element pairs in F_L and F_R are preserved on one side only, and then we remove all hexahedra between the two sides.
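The valence prediction can be checked with a small sketch. The constants below encode the consistency requirement for regular cases: merging two regular interior edges (valence 4) across an interior face must again give valence 4, and merging two regular boundary edges (valence 2) across a boundary face must give valence 2. The doublet/degeneracy filter is a simplified stand-in for the paper's full filtering criteria:

```python
def predicted_valence(v_l, v_r, face_on_boundary):
    """Valence of the edge created by collapsing the pair (e_l, e_r).
    A boundary connecting face (P = 1) removes one sheet hexahedron that
    is counted in both valences; an interior face (P = 0) removes two."""
    return v_l + v_r - (2 if face_on_boundary else 4)

def collapse_allowed(v_l, v_r, face_on_boundary, edge_on_boundary):
    """Filter out collapses that would create a doublet or degenerate
    configuration: the new inner edge must keep valence > 2
    (boundary edge: > 1). A simplified stand-in for the full criteria."""
    v_new = predicted_valence(v_l, v_r, face_on_boundary)
    return v_new > (1 if edge_on_boundary else 2)
```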
In the optimization step, the local parameterization of [6] is adopted. The boundary shape error and interior distortion are distributed to the β-ring neighboring elements by solving min_V E(V) with the SLIM approach [7].

4.2. Base-complex chord collapsing operation

The base-complex chord collapsing operation is mainly used to optimize bad singularity structure locally, and it only affects one column of base-complex components. Different from chord collapsing in a hex-mesh, which merges the four vertices of each group into a new position, Fig. 2 shows the 2D case of chord collapsing: we extract two pairs of opposite base-complex edges and merge them along the diagonal direction. Here the collapsing direction is denoted as the main diagonal direction, and the orthogonal direction along the boundary is referred to as the sub-diagonal direction. If the numbers of elements on opposite edges differ, we collapse several sub-sheets before applying the chord collapsing.
Collapsing direction. The collapsing direction can be chosen in two ways, and collapses along the two directions can have quite different influences on the singularity structure. The valences of the base-complex edges on the two sides along the main diagonal direction may be changed. Here, we only consider the four groups of topology-parallel base-complex edges on the surface of the chord following the direction of the dual string. We compute the predicted valence of the created base-complex edges and obtain the valence difference between the created edge and the regular edge. Our objective is to remove edge pairs with a valence of 3 ∼ 5. For one collapsing direction, the valence deviation of a chord c is measured as

D_v(c) = Σ_{i=1}^{k} ( |v(e_{p1,i}) − p(e_{p1,i})| + |v(e_{p2,i}) − p(e_{p2,i})| + |v(e_{l,i}) + v(e_{r,i}) − min(p(e_{l,i}), p(e_{r,i})) − 2| ),   (2)

D(c) = min(D_{v1}(c), D_{v2}(c)),   p(e) = 2 for e ∈ E_surface, p(e) = 4 for e ∈ E_inner,

where e_{p1,i} and e_{p2,i} are base-complex edges in the sub-diagonal direction, e_{l,i} and e_{r,i} are in the main diagonal direction as shown in Fig. 2, p(e) is the regular valence of an edge, and k is the number of components contained in the base-complex chord. We choose the optimal collapsing direction by minimizing D(c). In our experiments, the chord collapsing operation is not allowed when D(c)/k > 0.9. In addition, we implement an easy-to-detect criterion in advance to improve efficiency: if fewer than two of the four groups of parallel edges consist entirely of singular edges, the collapse will not remove singular edges locally, and this kind of chord is not pushed into the priority queues.

The above operations are performed iteratively during simplification. Base-complex sheet collapsing can have a significant global impact on the mesh, but it is extremely difficult to remove self-intersecting sheets with complex tangles and closed-loop configurations without creating vertices with high valence. Base-complex chord collapsing is used to eliminate the entangled regions, and it contributes to improving the simplification ratio of sheets. Experimental results show that a higher simplification rate can be achieved by alternately performing these two operations.

Figure 3: (a) The mapping from a reference tetrahedron (left) to the original shape (middle) and the deformed shape (right). (b) The mapping from a hex element (left) to the element with ideal shape (right) composed of five tetrahedra.

4.3. Local parameterization for uniformity improvement

After collapsing a sheet/chord, we apply a local parametrization [6] based on SLIM [7] to relocate points within the collapsing region. The SLIM framework uses the local/global algorithm [29]: it fixes the rotation computed in the local step and solves the distortion term globally. In the 3D case, the mapping from the original tetrahedral element to a deformed shape in a local orthogonal frame can be described by a Jacobian, and the deformation can be expressed indirectly by transformations from a tetrahedron with three orthogonal edges to both shapes, as shown in Fig. 3.
The mapping from the reference element to the original element is defined as

φ_1 : t_R → t_I,  x = W_I ξ + x_{I1},   (3)

where

W_I = (X_{I2} − X_{I1}, X_{I3} − X_{I1}, X_{I4} − X_{I1}),   (4)

i.e., the 3×3 matrix whose columns are the coordinate differences (x_{Ik} − x_{I1}, y_{Ik} − y_{I1}, z_{Ik} − z_{I1})^T for k = 2, 3, 4, is a constant matrix. Similarly, the mapping between the reference element and the deformed element is

φ_2 : t_R → t_D,  x = W_D ξ + x_{D1}.   (5)

Since W_D and W_I are affine matrices, the Jacobian φ of t_I → t_D can finally be written as

φ = φ_2 ∘ φ_1^{−1}.   (6)

Our experiments show that adjusting the Jacobian of a transformation toward the target shape in a local operation can lead to an ideal mesh result after global simplification. In this paper, we also propose a local optimization strategy to move vertices within the collapsing region during parameterization. For edges in the collapsing region, their lengths are re-scaled while maintaining the element quality.

Let M = (V, K) be the mesh of the parameterized region, where V is the set of nodes and K is the set of connectivity information, including nodes {i} and edges {i, j}. The discrete operator on M is defined as

(Lv)_i = Σ_j ω_ij (v_i − v_j),   (7)

and the iterative form is

v_i^k = Σ_{j=1}^{N_i} w_ij v_j^{k−1},  w_ij = ω_ij / Σ_{j=1}^{N_i} ω_ij,   (8)

with ω_ij = 1 if v_i ∈ V_in, and ω_ij = 0 if v_i ∈ V_bdy or v_i ∈ F_L ∪ F_R (such vertices are kept fixed). The iteration is terminated when the variance threshold ε is reached:

[ Σ_i (v_i^k − v_i^{k−1})^2 ]^{1/2} / [ Σ_i ( (x_i^{k−1})^2 + (y_i^{k−1})^2 + (z_i^{k−1})^2 ) ]^{1/2} < ε,   (9)

where i and j are vertex labels, N_i is the number of neighboring vertices of the i-th vertex, V_in is the set of inner vertices (not including vertices in F_L and F_R), and V_bdy is the set of vertices on the boundary.
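The local regularization of Eqs. (7)-(9) amounts to uniform Laplacian smoothing of the movable vertices with a relative-displacement stopping rule. A minimal sketch, under an assumed data layout (points as a dict of 3-tuples plus a neighbor table), could look like this:

```python
import math

def smooth_region(points, neighbors, movable, eps=1e-4, max_iter=100):
    """Jacobi-style uniform Laplacian smoothing: each movable (interior)
    vertex moves to the average of its neighbors, while boundary vertices
    and the vertices on F_L / F_R stay fixed (their weights are zero).
    Stops when the root-sum-square displacement, relative to the
    coordinate magnitude, drops below eps."""
    pts = {i: tuple(p) for i, p in points.items()}
    for _ in range(max_iter):
        new_pts = dict(pts)
        for i in movable:
            nb = neighbors[i]
            new_pts[i] = tuple(sum(pts[j][d] for j in nb) / len(nb)
                               for d in range(3))
        num = math.sqrt(sum((a - b) ** 2
                            for i in pts for a, b in zip(new_pts[i], pts[i])))
        den = math.sqrt(sum(c ** 2 for i in pts for c in pts[i])) or 1.0
        pts = new_pts
        if num / den < eps:
            break
    return pts
```

A movable vertex wedged between two fixed neighbors converges to their midpoint, which is the uniform-weight analogue of the length re-scaling described above.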
5. Weighted ranking for structure simplification
Many hex-mesh generation methods, such as octree-based and frame-field methods, often yield unnecessary interior singularities. The resulting hex-mesh has a large number of small components in the base-complex, since the singular edges are distributed along the twelve edges of cube-like components. The number of singularities can be progressively decreased by performing collapsing operations on components, and the simplified singularity structure clearly differs under different collapsing sequences. In this paper, we introduce a weighted ranking sequence, which chooses the optimal candidate to remove at each iteration. The ranking sequence aims to remove singularities within fewer iterative steps. We formulate this problem in an energy minimization framework and introduce a valence term, related to the valence difference caused by collapsing, to achieve a rapid removal of singularities. In addition, optimization is performed after each simplification step, and the distortion error caused by collapsing is distributed to neighboring elements and sheets. On the other hand, the collapsing operation is also constrained: the resulting elements must not be inverted, and the max Hausdorff distance ratio r_h must be kept. Hence, sheet/chord removals leading to less mesh distortion have higher collapsing priority. From this motivation, we also introduce two extra ranking terms, called the distortion term and the width term. In our framework, the ranking function is a combination of the valence term, the distortion term and the width term, which is more robust than the previous ranking method [6] based only on the thickness of base-complex sheets/chords. In the base-complex sheet ranking sequence, we combine the valence term, the distortion term and the width term in the normalized form [30].
The ranking function, which can greatly improve the simplification rate of base-complex components, is defined as

E_s(s) = k_sq (1 − e^{−E_sq(s)}) + k_sd (1 − e^{−E_sd(s)}) + k_sv (1 − e^{−E_sv(s)}),   (10)

where k_sq, k_sd and k_sv are the weights of the ranking terms. In our implementation, the valence term E_sv has the biggest weight; the smaller weights are set to k_sq = k_sd = 0.2 in our experiments. We also control the value of each term within (0, 2) to reduce the impact of the actual numerical size.
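The weighted ranking function is straightforward in code. The default weight values below are illustrative stand-ins, not the paper's exact settings (the text fixes only the smaller weights and states that the valence weight is the largest):

```python
import math

def ranking_energy(E_sq, E_sd, E_sv, k_sq=0.2, k_sd=0.2, k_sv=0.6):
    """Weighted ranking of the three terms. Each raw term is squashed
    by 1 - exp(-E), which keeps it in (0, 1) regardless of scale, so no
    single term can dominate the ranking."""
    return (k_sq * (1.0 - math.exp(-E_sq))
            + k_sd * (1.0 - math.exp(-E_sd))
            + k_sv * (1.0 - math.exp(-E_sv)))
```

Under this construction, a smaller value marks a more attractive collapsing candidate, an assumption consistent with the valence term, which shrinks as the predicted singularity reduction grows.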
Valence term. The proposed weighted ranking for base-complex sheet collapsing mainly focuses on the valence difference of singular edges during the simplification. It has been proved in [6] that the singularities of a hex-mesh can be progressively simplified within a finite number of iterations, and that the number of components decreases while the valence deviation of singular edges is reduced. In this paper, we propose an indirect energy function of the valence difference between the current mesh and a mesh without singularities. For the set of singular base-complex edges S(e) = { e | e ∈ B_E, e is singular }, the energy function is defined as

E(m) = Σ_{e ∈ S(e)} |v(e) − p(e)|.   (11)

Since the simplification process is based on the two collapsing operations, and the singular edges are only located in F_L, F_R and E_M, the energy function E(m) has a local representation on the base-complex sheet when it is collapsed:

E(m) = Σ_{i=0}^{n} ( − Σ_{e_m ∈ E_M} |v(e_m) − p(e_m)| + γ Σ_{e_lr ∈ F_L, F_R} |v(e'_lr) − p(e_lr)| ),  γ ∈ (0, 0.5],   (12)

where e_m is a base-complex edge in the E_M of component b_i, e_lr is a base-complex edge to be collapsed, and e'_lr is the created base-complex edge.

According to the energy function E(m), some analysis of the structure of base-complex sheets can be performed. The base-complex sheet has an interesting property: the interior edges which are topologically parallel to the dual face of the sheet are all regular; the singular edges only exist in E_M or in F_L and F_R; and the collapsing will introduce edges with different valences. Hence, we can accurately predict the influence of a collapse.

During a collapsing operation, the edges in the middle part are eliminated. For a singular edge L_si, if the whole edge is contained in E_M as L_mid, the value of E(m) will decrease; this type of elimination is equivalent to creating new regular edges while collapsing. Moreover, the singularity structure will not change when L_mid is only a part of a singular edge L_si, and this type of collapsing does not affect the other part of the singular edge or the base-complex faces extended from it; such base-complex edges are not considered in our valence calculation. Two types of L_si are shown in Fig. 4(a).

Figure 4: Distribution of singularities in E_M, F_L and F_R. The lines marked in green and black are regular edges, and all the other edges are singular. Two types of middle edges in E_M are shown in (a), and four types of edge pairs on both sides are shown in (b).

Since a singular edge can be completely contained in F_L or F_R of one or more base-complex sheets, the collapsing may remove the singular edges on both sides directly. Concerning the valence variation of the edges in an edge pair of F_L and F_R, we have the following three cases, which correspond to c1, c2 and c3 in Fig. 4(b) respectively: (c1) all the edges in F_L and F_R are regular; (c2) the edges on only one side of F_L or F_R are irregular; (c3) the edges in both F_L and F_R are irregular. In case (c1), the created edge will be regular. In case (c2), the created edge will have the same valence as the irregular edge, and it does not affect the surrounding singularity configurations. In case (c3), the valence of the created edges will change, which means that the singularities of the rest of the hex-mesh will be changed, and the flow directions of neighboring base-complex sheets might be altered; moreover, there are several configurations in this case, and the created edges might have different valences compared with the base-complex edge pairs in F_L and F_R. The singular structure is simplified when the valence difference between irregular and regular edges decreases. In contrast, removals that make the valence of singular edges higher should be avoided: the created edges might not unknot self-intersecting sheets, which are hard to remove, and this will greatly influence the final component reduction ratio and cause an early termination of the simplification.

To improve the convergence rate of E(m), we greedily select the base-complex sheet which can effectively reduce E(m) locally without introducing edges with higher valence. The valence term is defined as

E_sv = [ DM − β ( Σ_i T(K_i^max − K_i^new) + Σ_i K_i^m ) ] / DM,   (13)

in which

K_i^new = |v(e_i^new) − q(e_i^new)|,  K_i^m = |v(e_i^m) − q(e_i^m)|,
K_i^max = max( |v(e_i^l) − q(e_i^l)|, |v(e_i^r) − q(e_i^r)| ),
T(k) = λk, k < 0;  0, k = 0;  k, k > 0,

where q(e) denotes the regular valence of an edge, λ is a constant scaling factor for negative differences, e_i^l and e_i^r form an edge pair belonging to F_L and F_R respectively, e_i^m is a whole singular edge in E_M, and DM is a large value that controls the scale of this term, set as the maximum number of base-complex edges in E_M over the hex-mesh. In our experiments, β is set to 1.67.
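The global valence-deviation energy is straightforward to evaluate. The sketch below assumes p(e) is the regular valence from Section 3 (2 on the boundary, 4 in the interior), with edges given as (valence, on_boundary) pairs:

```python
def singularity_energy(edges):
    """Sum of |v(e) - p(e)| over the singular base-complex edges, where
    p(e) is the regular valence. Regular edges contribute nothing, so
    filtering them out is optional but mirrors the definition of S(e)."""
    def regular(on_boundary):
        return 2 if on_boundary else 4
    return sum(abs(v - regular(bdy)) for v, bdy in edges
               if v != regular(bdy))
```

A collapse candidate that lowers this energy without creating edges of higher valence is exactly what the valence term rewards.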
For the purpose of minimizing the energy function, the convergence rate is faster when the value of β(Σ_i T(K_i^max − K_i^new) + Σ_i K_i^m) is larger. E_sv is a ranking term that favors collapsing candidates which eliminate more singularities.

Distortion term. The distortion term E_sq is an optional term for hex-meshes with complex structure, where a sheet passing through regions with dense singularities often contains patches with serious distortion. Removing these sheets can greatly improve the average value of the Jacobians and lead to a significant complexity reduction in simplification. Here we use the shape metric f_shape of a hexahedron [31] to measure the sheet distortion: f_shape = 1 if the hexahedron is a cube with parallel faces, f_shape = 0 if the hexahedron is degenerate, and f_shape is scale-invariant. In our paper, we obtain the central difference of f_shape in each element for the three parametric directions, and select the maximum difference as the differential value of the hexahedron. From the experiments, we find that serious distortion happens when the differential value reaches 0.55; the twist is more serious when the differential value is bigger. Since the local parameterization can improve the element quality, removing regions with serious distortion in advance will increase the average value of the Jacobians locally. E_sq is defined as

E_sq(s) = [ln(Σ_{i=1}^n f_i + e)]^{−1},    (14)

f_i = { 0, d(i) < 0.55;  d(i), d(i) ≥ 0.55 },
d(i) = max_{1≤j≤3} |f_shape(i+1, j) + f_shape(i−1, j) − 2 f_shape(i)|,

where f_shape(i, j) is the shape metric of the neighboring element of the i-th element in the j-th parametric direction.

Figure 5: Two base-complex chords (red) in a toy mesh. The first chord is located in a patch near the feature edges (top right), and the second chord is located in the flat region (bottom right). The elimination of the first chord will lead to a significant boundary geometry error. The proposed geometric error term can prevent this kind of collapsing effectively.

Width term. The width term E_sd in the weighted ranking function measures the width of a sheet, which prevents wrong collapsing: if the sheet is too wide, the collapsing will lead to large distortion of the boundary geometry and seriously affect the adjacent sheets. Hence it is reasonable to remove sheets with a thin shape. For this term, we use the width of the base-complex edges in E_M, which is more accurate than the length between the vertex pair on the surface. In our framework, E_sd is defined by combining the average width and the minimum length as follows,

E_sd = [α_a min_{(v_l, v_r) ∈ P_V} d(v_l, v_r) + α_b d̄(v_l, v_r)] / L̄,    (15)

in which L̄ is the average length of the element edges, d(v_l, v_r) is the length of the base-complex edge connecting v_l and v_r, and α_a and α_b are fixed weights.

The base-complex chord collapsing only influences one column of components, and it is used to adjust regions with many edge pairs having a valence of 3∼5. The chord candidates are ranked by a weighted function E_c(c),

E_c(c) = k_sq(1 − e^{−E_cq(c)}) + k_sv(1 − e^{−E_cv(c)}),    (16)

in which E_cv is the valence term and E_cq is the geometry error term.

Geometry error term. The chord collapsing operation often leads to simplification results with inverted elements. We propose a simple strategy that gives priority to chords with a narrow shape and smaller length. The aspect ratio of a chord is defined as the ratio of the average length of the main diagonal to that of the sub-diagonal, which is applied to the measurement of thickness. To reduce the effect of collapsing on the boundary geometry, Gaussian curvature [32] is used to measure the shape error locally after collapsing. In our implementation, we use the variance of the curvature to find patches with significant curvature changes: a patch may contain sharp features when its variance of curvature is large, as shown in Fig. 5. The geometry error term E_cq(c) is defined as

E_cq(c) = (L̄_1(c) / L̄_2(c)) √( Σ_{i=1}^{N_v} (Q_i^g − Q̄^g)² / (N_v − 1) ),    (17)

where L̄_1 and L̄_2 are the average lengths of the main diagonal and the sub-diagonal respectively, and Q_i^g is the Gaussian curvature of a vertex on the two sides.

Valence error term. The valence error term measures the valence error of the four topologically parallel edges. To eliminate entangled sheets and simplify the local complexity, we require that the three topologically parallel edges created by collapsing all be regular; ideally, the valence error tends to zero. In our framework, the valence error is set as one of the optimization goals,

E_cv(c) = β D(c) / N_b(c).    (18)

In this step, edges with high valence will not be introduced: a candidate is not pushed to the priority queue when D(c)/N_b(c) exceeds a fixed threshold.
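The chord ranking and filtering described above can be sketched as follows. This is a hedged illustration under stated assumptions: `E_cq` is taken as precomputed per chord, the weights `k_sq`/`k_sv`, the `beta` value and the `cutoff` on D(c)/N_b(c) are hypothetical parameter names (the paper's exact values are not reproduced here), and smaller ranking values are assumed to collapse first.

```python
import math

# Sketch of the chord ranking function of Eq. (16) with the
# high-valence filter on D(c)/N_b(c) described in the text.

def chord_rank(E_cq, E_cv, k_sq=0.5, k_sv=0.5):
    # Both error terms are squashed into [0, 1) so that neither the
    # geometry error nor the valence error dominates the ranking.
    return k_sq * (1.0 - math.exp(-E_cq)) + k_sv * (1.0 - math.exp(-E_cv))

def push_candidate(queue, chord_id, E_cq, D, N_b, beta=1.67, cutoff=0.5):
    # Chords that would introduce high-valence edges are filtered out:
    # the candidate is skipped when D(c)/N_b(c) exceeds the cutoff.
    if D / N_b > cutoff:
        return False
    E_cv = beta * D / N_b               # Eq. (18)
    queue.append((chord_rank(E_cq, E_cv), chord_id))
    queue.sort()                        # assumed: smallest value first
    return True
```

A real implementation would use a heap rather than re-sorting a list, but the filter-then-rank structure is the point here.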
6. Sheet refinement
Sheet refinement is performed during the simplification pipeline in order to maintain the input mesh geometry with a number of elements similar to the user-defined target number. A method similar to [6] can be used to split one element on a specific sheet into two elements along the direction perpendicular to the parallel edges. In this paper, we propose an adaptive sheet refinement method to improve the accuracy of the boundary geometry approximation.

In our implementation, we find that choosing the sheet with the maximum width to refine is not a robust strategy, since some boundary patches with a large boundary approximation error may never be refined. In our method, we first obtain the average length of all edges along the collapsing direction, and then compute the average Hausdorff distance ratio HR(s) by means of point sampling for each sheet in the priority queue. According to the descending order of HR(s), the first four base-complex sheets are selected, and the average length in the collapsing direction is denoted as L̄_b. We choose one of these four sheets for refinement if its L̄_b is larger than a fixed multiple of L̄; otherwise, we refine the candidate with the maximum L̄_b that meets the above condition. During simplification, collapsing operations may fail frequently due to the element quality and shape error constraints. In order to relax these constraints, we also perform the refinement process when a sheet collapsing fails: the base-complex sheets sharing F_L and F_R with the sheet to be removed are selected as candidates. The refinement process narrows the parameterized region of the failed sheets, such that it reduces the shape error by introducing more elements, and the sheet may be collapsed in the next iteration. In addition, another criterion is introduced to control the number of elements strictly. For an input hex-mesh with C elements and target number C_n, before performing refinement we check whether the number of hexahedra contained in a sheet is less than a fixed multiple of (C − C_n). This criterion can effectively prevent some sheets from being refined repeatedly.
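The refinement-candidate selection above can be sketched as follows. This is a simplified illustration: each sheet is assumed to carry a precomputed Hausdorff distance ratio `HR` and an average collapsing-direction edge length `Lb`, and `ratio` stands in for the unspecified multiple of the global average edge length used as the width condition.

```python
# Sketch of the adaptive sheet-refinement selection: sort sheets by
# Hausdorff distance ratio, keep the top four, and prefer one whose
# average collapsing-direction edge length Lb exceeds ratio * L_bar.

def pick_sheet_to_refine(sheets, L_bar, ratio=1.0):
    """sheets: list of dicts with keys 'id', 'HR', 'Lb'."""
    top4 = sorted(sheets, key=lambda s: s["HR"], reverse=True)[:4]
    wide = [s for s in top4 if s["Lb"] > ratio * L_bar]
    if wide:
        # The highest-HR sheet that satisfies the width condition.
        return wide[0]["id"]
    # Fall back to the widest of the top candidates.
    return max(top4, key=lambda s: s["Lb"])["id"]
```

Sorting by HR first means that boundary patches with a large approximation error are always considered before merely wide sheets, which is the robustness fix the text argues for.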
7. Experimental results
We tested our algorithm on a four-core i7 processor with 8 GB memory. The maximal number of iterations of the SLIM solver is set to 5, and we set r_h = 1% (the user-defined threshold of the Hausdorff distance ratio; the simplification rate becomes larger when r_h increases) together with a threshold r_{|H|} on the number of elements |H|. We report the number of hex elements (H), the number of base-complex components (BC), and the minimal and average values and standard deviation of the scaled Jacobians (MSJ/ASJ/Std). The boundary geometry error is measured by the Hausdorff distance ratio (HR). For the experiments on the database given in [6], we perform the proposed method on 65% of the meshes in this database. Most meshes achieve a higher simplification ratio compared with [5] and [6], and the average simplification rate for these meshes is 88%.

Figure 6: Simplification results of the fertility mesh with different complexity reductions, comparing our weighted ranking approach with the thickness ranking method [6], shown in (a). Our ranking method can effectively decrease the number of iteration steps (N_I) and improve the simplification results around regions with dense singularities, as shown in the singular structure highlighted with red circles. The top 4 candidates in each sequence are also shown when the simplification rates achieve 0%, 60% and 80%. The simplification results are shown in (b), and the statistics of iterations are shown in (c).

Figure 7: Simplification results of toy2 and lock. From left to right, the input meshes, the results of thickness ranking [6] and our weighted ranking results are shown. The color mapping shows the value of VDR, which illustrates that our weighted ranking method achieves a significant improvement in uniformity.

Weighted ranking candidates. Here we show some comparison results of the thickness ranking method [6] and the proposed weighted ranking method. In Fig. 6(a), we show the top 4 candidates in the fertility mesh when the simplification rates achieve 0%, 60% and 80% respectively. In the initial priority queue, our weighted ranking term can effectively pick out the base-complex sheets with serious distortion and closed-loop configurations. Moreover, the number of singularities can also be reduced faster. For a simplification rate of 60%, the thickness ranking method needs 65 iterations, whereas our proposed method only needs 28. For the comparison results shown in Fig. 6(a), when the simplification rates reach 60% and 80%, our ranking algorithm preferentially removes sheets that promote singular edge elimination, and the regions with dense singularities (marked with red circles) are greatly improved. Compared with the simplification results by thickness ranking, regions with dense singular edges are successfully eliminated by our method, and self-intersected sheets are removed as well. In the simplification process, the distortion term is used to eliminate elements with poor shape quality and to spread the distortion to neighboring elements while gradually improving the MSJ/ASJ values of the hex-mesh. Our ASJ is better than that of thickness ranking during all three stages, and we achieve a 12.66% ASJ improvement over the input and a 2.20% ASJ improvement over the simplification result of [6]. The average running time over the entire dataset is 71 minutes, which is slightly slower than [6].

Figure 8: Simplification results on meshes generated by polycube-based methods; the gargoyle mesh (left) is generated by [33], and the stab mesh (right) by [17]. From top to bottom, the input hex-mesh, the simplification results of thickness ranking [6] and our weighted ranking results are presented. For each example, we show the scaled Jacobian information, the singularity structure, and the base-complex components in different colors.
Element uniformity. In the proposed approach, we use local parameterization to improve the uniformity of hex-mesh elements. We also propose a measurement of element uniformity called the volume deviation ratio (VDR), defined as the standard deviation of the volumes of neighboring elements divided by the average element volume. The range of VDR is [0, ∞), and the uniformity is better when the value is closer to 0 (for all elements with the same volume, VDR = 0). Compared with the thickness ranking method [6], our simplification results show 30.17% and 7.04% improvement in the average volume deviation ratio (AVDR) and the maximum volume deviation ratio (MVDR), respectively. In our experiments, the average AVDR and MVDR of meshes from polycube-based methods are 0.19 and 2.78 respectively, and the simplified octree-based meshes reach a comparable level, where AVDR and MVDR gain improvements of 35.56% and over 10%, respectively. Some examples are shown in Fig. 7 with the VDR colormap.

Figure 9: Simplification results on meshes generated by octree-based methods, including the bimba, deckle and bottle models. From left to right, the input mesh and its singularity structure are shown (singular edges with a valence of 5 are marked in green, those with a valence of 3 in red, and edges with other valences in other colors), followed by the simplified results.

Table 1: Statistics of the input hex-meshes and the simplified results. H is the number of hex elements, BC is the number of base-complex components, Std is the standard deviation of the scaled Jacobians, HR is the Hausdorff distance ratio, R is the simplification rate (%), and Time is the running time.

| Model | H (in) | BC (in) | MSJ | ASJ | Std | H (out) | BC (out) | MSJ | ASJ | Std | HR | R | Time |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Bimba (Fig. 9) | 25,347 | 25,347 | 0.06 | 0.80 | 0.162 | 27,900 | 134 | 0.43 | 0.97 | 0.049 | 0.95 | 99.47 | 103.48 |
| Bottle (Fig. 9) | 35,886 | 35,860 | 0.13 | 0.79 | 0.167 | 34,558 | 266 | 0.22 | 0.98 | 0.054 | 0.91 | 99.26 | 200.97 |
| Deckle (Fig. 9) | 53,658 | 53,116 | 0.03 | 0.84 | 0.187 | 53,680 | 806 | 0.10 | 0.95 | 0.082 | 1.00 | 98.48 | 793.93 |
| Fertility (Fig. 6) | 21,370 | 20,840 | 0.10 | 0.84 | 0.150 | 21,016 | 310 | 0.32 | 0.94 | 0.079 | 0.87 | 98.51 | 153.83 |
| Toy1 (Fig. 5) | 18,947 | 18,883 | 0.12 | 0.81 | 0.161 | 15,784 | 144 | 0.51 | 0.96 | 0.059 | 0.66 | 99.23 | 48.53 |
| Toy2 (Fig. 7) | 14,288 | 14,288 | 0.15 | 0.81 | 0.158 | 13,952 | 129 | 0.49 | 0.96 | 0.059 | 0.90 | 99.10 | 48.59 |
| Lock (Fig. 7) | 28,753 | 25,720 | 0.01 | 0.80 | 0.244 | 28,501 | 2,990 | 0.17 | 0.93 | 0.109 | 0.91 | 88.37 | 381.46 |
| Eight (Fig. 11) | 4,571 | 3,867 | 0.17 | 0.78 | 0.155 | 5,428 | 43 | 0.53 | 0.92 | 0.065 | 0.69 | 98.89 | 7.53 |
| Bone (Fig. 11) | 2,751 | 2,520 | 0.15 | 0.78 | 0.159 | 2,484 | 37 | 0.69 | 0.93 | 0.069 | 0.75 | 98.53 | 4.24 |
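The VDR measure used above can be sketched directly from its definition. The neighborhood passed in (which elements count as "neighboring") is an assumption of this illustration; the formula itself follows the text: standard deviation of the neighborhood volumes divided by the average element volume.

```python
import math

# Sketch of the volume deviation ratio (VDR): the standard deviation of
# the volumes of an element's neighboring elements divided by the
# average element volume.

def vdr(neighbor_volumes):
    n = len(neighbor_volumes)
    mean = sum(neighbor_volumes) / n
    var = sum((v - mean) ** 2 for v in neighbor_volumes) / n
    return math.sqrt(var) / mean
```

Identical volumes give VDR = 0 (perfect uniformity), and the ratio grows as neighboring volumes spread out, matching the colormaps in Fig. 7.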
Simplification of hex-meshes from polycube-based methods. For hex-meshes generated by polycube-based methods [17, 33, 21], the singularity structures are completely distributed on the surface, and the distribution of singular edges is sparse. Hence, the valence term has a small effect, and the weights k_sd and k_sv are set to smaller values. The average ASJ of the simplified meshes reaches 0.95, and the meshes obtain 30.17%/7.04% improvement in AVDR and MVDR compared with [6], respectively. Moreover, the average component reduction ratio is promoted to over 71%.

Simplification of hex-meshes from octree-based methods. Octree-based hex-meshing approaches often generate a complex structure with dense local singularities. In [6], greedy collapsing by thickness ranking is applied under a set of filters. It cannot find a coarser structure in a hex-mesh with a large number of interior singularities and kinks, since the thickness ranking term has no direct effect on singularity removal. The corresponding simplification [6] has a slow convergence rate, and it achieves an average simplification rate of around 86% on the hex-mesh database. Instead, our weighted ranking method obtains a much simpler singularity structure with far fewer base-complex components. The average simplification rate in the proposed framework reaches 93.56% with respect to the initial number of base-complex components of the input hex-meshes, a 7.40% improvement compared with [6]. Moreover, in the proposed framework, adaptive refinement is performed during the simplification process, which can effectively maintain the quality of the boundary geometry and promote the simplification process under the constraint of r_h. Our MSJ achieves 0.32 on average, and our ASJ gains a 14.02% improvement over the thickness ranking method. Some simplification results are shown in Fig. 9, and statistics are presented in Table 1. Comparison examples with [6] are also presented in Fig. 10 and Table 2.

More importantly, octree-based meshes can be simplified into a singularity structure similar to that of polycube meshes. As shown in Fig. 11, the singularities are mainly distributed on the boundary. The simplification rate achieves 98%, and the interior singular edges are eliminated completely.

Figure 10: From left to right, the input octree-based hex-mesh, the simplification results of [6] and our results. For each example, we show the scaled Jacobian, the singularity structure, and the base-complex components in different colors.

Figure 11: Simplified results on octree-based meshes [34], whose singularity structures are similar to those of polycube-based meshes.

Table 2: Comparison with [6]. H is the number of hex elements, BC is the number of base-complex components, Std is the standard deviation of the scaled Jacobians, HR stands for the Hausdorff distance ratio, R is the simplification rate, and Time is the running time.

| Model | Method | H | BC | MSJ | ASJ | Std | AVDR | MVDR | HR | R | Time |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Gargoyle (Fig. 8) | Input | 21,167 | 25,669 | 0.20 | 0.91 | 0.907 | 0.11 | 0.64 | – | – | – |
| | Thickness ranking | 22,524 | 805 | 0.14 | 0.96 | 0.068 | 0.30 | 3.52 | 0.98 | 89.36 | 30.67 |
| | Weighted ranking | 23,352 | 451 | 0.27 | 0.96 | 0.071 | 0.20 | 3.47 | 0.92 | 94.04 | 41.22 |
| Stab (Fig. 8) | Input | 84,489 | 2,227 | 0.18 | 0.87 | 0.130 | 0.05 | 1.29 | – | – | – |
| | Thickness ranking | 80,295 | 1,817 | 0.09 | 0.95 | 0.069 | 0.10 | 2.74 | 0.77 | 18.41 | 46.89 |
| | Weighted ranking | 83,678 | 819 | 0.24 | 0.93 | 0.092 | 0.10 | 1.35 | 0.97 | 63.22 | 60.32 |
| Rocker (Fig. 10) | Input | 16,608 | 16,487 | 0.11 | 0.86 | 0.139 | 0.18 | 2.69 | – | – | – |
| | Thickness ranking | 10,278 | 636 | 0.44 | 0.93 | 0.081 | 0.38 | 1.77 | 0.99 | 96.14 | 32.25 |
| | Weighted ranking | 10,790 | 441 | 0.35 | 0.93 | 0.088 | 0.26 | 1.75 | 0.99 | 97.33 | 50.55 |
| Pig (Fig. 10) | Input | 13,987 | 13,987 | 0.02 | 0.79 | 0.168 | 0.46 | 8.02 | – | – | – |
| | Thickness ranking | 10,704 | 2,305 | 0.23 | 0.92 | 0.102 | 0.38 | 3.89 | 0.99 | 83.52 | 31.24 |
| | Weighted ranking | 11,218 | 876 | 0.18 | 0.95 | 0.086 | 0.23 | 3.02 | 0.99 | 93.74 | 38.69 |
| Bird (Fig. 10) | Input | 4,247 | 3,640 | 0.03 | 0.82 | 0.159 | 0.18 | 0.51 | – | – | – |
| | Thickness ranking | 2,868 | 580 | 0.24 | 0.90 | 0.117 | 0.36 | 2.43 | 1.00 | 84.07 | 14.57 |
| | Weighted ranking | 2,935 | 278 | 0.25 | 0.90 | 0.127 | 0.23 | 1.45 | 0.95 | 92.36 | 16.54 |
| Buste (Fig. 10) | Input | 19,075 | 18,355 | 0.13 | 0.85 | 0.151 | 0.28 | 3.20 | – | – | – |
| | Thickness ranking | 17,680 | 691 | 0.44 | 0.95 | 0.070 | 0.33 | 4.36 | 0.98 | 96.24 | 113.67 |
| | Weighted ranking | 16,336 | 158 | 0.28 | 0.96 | 0.065 | 0.23 | 2.82 | 0.97 | 99.14 | 53.55 |
8. Conclusion and future work
In this paper, an improved singularity structure simplification method for hex-meshes is proposed based on a weighted ranking function, which combines the valence prediction function of the local singularity structure, the shape quality metric of elements, and the width of base-complex sheets/chords. Local optimization and adaptive sheet refinement are also proposed to improve the element quality of the simplified hex-mesh. Compared with the thickness ranking method, a simpler singularity structure with fewer base-complex components can be achieved by the proposed weighted ranking approach, while achieving better mesh quality and a comparable Hausdorff distance ratio. The proposed approach has a few limitations. Sharp features cannot be preserved very well on the boundary, and the boundary approximation error may increase for models with high genus. A possible solution might be stricter feature-edge extraction and vertex mapping. In the future, we will apply the proposed hex-mesh simplification method to volume parameterization, which is a bottleneck in isogeometric analysis.
References

[1] Y. Zhang, C. Bajaj, Adaptive and quality quadrilateral/hexahedral meshing from volumetric data, Computer Methods in Applied Mechanics and Engineering 195 (9-12) (2006) 942–960.
[2] Y. Ito, A. M. Shih, B. K. Soni, Octree-based reasonable-quality hexahedral mesh generation using a new set of refinement templates, International Journal for Numerical Methods in Engineering 77 (13) (2010) 1809–1833.
[3] X. Bourdin, X. Trosseille, P. Petit, P. Beillas, Comparison of tetrahedral and hexahedral meshes for organ finite element modeling: an application to kidney impact, in: 20th International Technical Conference on the Enhanced Safety of Vehicles, 2007.
[4] A. C. Woodbury, J. F. Shepherd, M. L. Staten, S. E. Benzley, Localized coarsening of conforming all-hexahedral meshes, Engineering with Computers 27 (1) (2011) 95–104.
[5] X. Gao, Z. Deng, G. Chen, Hexahedral mesh re-parameterization from aligned base-complex, ACM Transactions on Graphics 34 (4) (2015) 142.
[6] X. Gao, D. Panozzo, W. Wang, Z. Deng, G. Chen, Robust structure simplification for hex re-meshing, ACM Transactions on Graphics 36 (6) (2017) 185.
[39] S. H. Liao, R. F. Tong, J. X. Dong, F. D. Zhu, Gradient field based inhomogeneous volumetric mesh deformation for maxillofacial surgery simulation, Computers & Graphics 33 (3) (2009) 424–432.
[40] X. Fu, C. Bai, Y. Liu, Efficient volumetric polycube-map construction, Computer Graphics Forum 35 (7) (2016) 97–106.
[41] Y. Li, Y. Liu, W. Xu, W. Wang, B. Guo, All-hex meshing using singularity-restricted field, ACM Transactions on Graphics 31 (6) (2012) 1–11.