Procedural Urban Forestry
TILL NIESE,
University of Konstanz
SÖREN PIRK,
Google Brain
MATTHIAS ALBRECHT,
University of Konstanz
BEDRICH BENES,
Purdue University
OLIVER DEUSSEN,
University of Konstanz
Fig. 1. Steps of our learning-based plant population method: we use satellite images (a) and predict coverage maps for vegetation (b). We use these maps to identify plant regions for reconstruction (c) and to learn parameters for our procedural models when populating new virtual cities with complex plants (d), which significantly increases the realism of urban landscapes (e).
The placement of vegetation plays a central role in the realism of virtual scenes. We introduce procedural placement models (PPMs) for vegetation in urban layouts. PPMs are environmentally sensitive to city geometry and allow identifying plausible plant positions based on structural and functional zones in an urban layout. PPMs can either be used directly by defining their parameters or can be learned from satellite images and land register data. Together with approaches for generating buildings and trees, this allows us to populate urban landscapes with complex 3D vegetation. The effectiveness of our framework is shown through examples of large-scale city scenes and close-ups of individually grown tree models; we also validate it by a perceptual user study.

CCS Concepts: • Computing methodologies → Shape analysis; • Theory of computation → Grammars and context-free languages; Rewrite systems.

Additional Key Words and Phrases: Urban Models, Vegetation, Procedural Generation, Urban Forestry
The visual simulation of urban models and the generation of their 3D geometries are important open problems in computer graphics that have been addressed by many approaches. Existing methods range from façades, buildings, and city block subdivisions to entire cities with viable street and road systems.

Authors' addresses: Till Niese, University of Konstanz; Sören Pirk, Google Brain; Matthias Albrecht, University of Konstanz; Bedrich Benes, Purdue University; Oliver Deussen, University of Konstanz.

Synthetically generated city models already exhibit a high degree of realism. However, while cities are immersed in vegetation, very little attention has been dedicated to the interplay of urban models and vegetation in computer graphics. Ecosystem simulations have been considered by many approaches. The prevailing algorithms use plant competition for resources as the main driving factor of their evolution, either on the level of entire plants [15] or on the level of branches [39]. Unfortunately, these approaches fail in urban areas, because urban trees have only limited space available to compete for resources, and they are heavily affected by the surrounding urban structures as well as by human intervention.

The term urban forest refers to vegetation in urban areas [43]. Vegetation has many practical functions: it limits and controls air movement, solar radiation, heat, humidity, and precipitation; it can also block snow and diminish noise. Moreover, an important function of vegetation is to increase city aesthetics. Urban forests are not planted at once but managed over time. Dead trees are removed and new trees are planted. Living trees are pruned for visibility or utility services. In contrast to real cities, we face a different situation in computer graphics: an existing algorithm generates a city model without vegetation, and we need to find suitable locations for individual trees. Simulating urban forest evolution, e.g., by using the algorithm of Benes et al. [5], is time-consuming and difficult to control.

In this paper, we introduce a procedural method for the advanced placement of vegetation that increases the overall realism of urban models.
We are inspired by urban rules that control which trees and bushes can be planted where, and how tall they are allowed to grow. These rules vary for individual areas: they are relaxed in industrial zones, and people also have more flexibility on their own properties, but they are enforced in public zones of a city and around important landmarks. We therefore introduce procedural placement models (strategies for generating plant positions) along with parameters to enable an automatic placement of vegetation that is faithful to the characteristic features of plant distributions within the different municipality zones of a city. We show that placement models and parameters together provide an efficient means of controlling the interactive modeling of urban landscapes.

Moreover, we can populate city models not just with static tree geometry, but with dynamic models of plants that can grow and change their shape in response to environmental changes or human intervention. This allows us to apply simulation models that describe how a city, or its areas, would change if more or less effort could be spent on maintaining them. Having such dynamic urban ecosystems allows users to visually predict and control the effects of gardening in a city and also helps to make such models more realistic, since they exhibit decay and different levels of order.

While procedural placements can be used directly to populate urban layouts, we also show that placement models can be used to learn plant distributions of real cities. We use satellite images and land register data to train deep neural networks that learn the distributions of trees and other plants in the parameter space of our procedural placement models. While placement models act as a strong prior to regularize finding plausible placements, learning parameter values also enables users to efficiently author scenes through intuitive parameters. The example in Fig. 1 shows a satellite image (a) and the predicted coverage map (b).
We use coverage maps to identify areas where to place vegetation (c) and to learn parameters of our procedural models. Once the parameters are obtained, we can automatically populate city models with complex models of plants (d) to increase their realism (e).

Our main contributions are: (1) we advance the state of the art in modeling vegetation in urban landscapes by introducing a procedural modeling framework that is based on the idea of factorizing the complexity of plant placement into manageable components; (2) we introduce a set of procedural placement models along with their parameterization to capture a large variety of placement patterns; (3) we use a novel pipeline for learning plant distributions in cities from satellite data: we convert satellite images into coverage maps and then learn the placement parameters of our procedural models.
Only recently have researchers started exploring approaches to model virtual environments with realistic traits of real urban landscapes [65]. Here, we focus on the involved aspects of plant and urban modeling, ecosystems, as well as learning-based methods.
Urban Modeling: urban structures are often modeled procedurally [79]. In their seminal paper, Parish and Müller [54] used L-systems to model complex cities, and Wonka et al. [82] applied split grammars to procedurally define buildings; this was later extended by using subdivision [46] and by more advanced operations [64]. Purely procedural models of infinite cities were introduced by Merrell and Manocha [41, 42], and the procedural modeling of street layouts has been described by using vector fields [9]. Similarly, procedural approaches have been successfully applied to modeling façades [47]. Urban modeling has been combined with urban simulation to generate viable cities [72, 73] and city growth [80].
Inverse Procedural Modeling: our approach is related to inverse procedural models in that it learns plant placement from real cities and attempts to transfer it to synthetic ones by fitting parameters of a procedural model. An inverse procedural model for façades has been introduced by AlHalawani et al. [1] and Wu et al. [83]; variations of a procedurally encoded single layout can be generated by the work of Bao et al. [3]; the layered nature of façades has been used for inverse procedural modeling in [26, 34]; exploiting structural symmetries was done in [12]; and interactive alterations of shape grammars were utilized in [13]. Buildings can be encoded as L-systems by using the inverse procedural approach from [71], modeled by using a procedural connection of structures [8], or through binary integer programs [29]. Finding the parameters of procedural models from existing data was investigated by Talton et al. [68], who used expressions of L-system strings of modules to fit a generated structure to an input. Ritchie et al. [61] attempt to control procedural programs and procedural models using stochastic Monte Carlo methods. Structural patterns can be encoded by using the approach of Yeh et al. [86] or encoded as L-systems by the work of Šťava et al. [66]. Recently, trained deep neural networks have been combined with inverse procedural modeling to allow for the interactive design of buildings by using sketches [49]. Inverse procedural modeling has also been used to generate entire urban layouts in [40, 74].
Plant Modeling: research has long focused on defining plausible branching structures based on fractals [2, 52] or L-systems [35, 60]. Other methods focus on rule-based modeling [36], inverse procedural modeling of trees [66, 67], and finding L-systems for branching structures [22]. Moreover, sketch-based modeling techniques allow artists to produce plant models interactively and in more nuanced ways [25, 51, 81]. Alternative approaches attempt to reconstruct plant models automatically, either from images [69, 70], videos [33], or scanned 3D point clouds [37, 85]. Only recently have several approaches also focused on the dynamic and realistic behavior of plant models, including growth [38, 57], the interaction with wind or fire [56, 58], or realism established through accurate materials [77].

Modeling a plant's response to its environment is of utmost importance for obtaining realistic branching structures when plants are positioned in groups or alongside obstacles [48]. Approaches exist to model this phenomenon by considering the self-organization of plants [53, 63], through explicitly modeling the plasticity of branches [59], or through the dynamic adaptation to support structures, as can be observed for climbing plants [6, 23]. The growth, decay, and pruning of buds and branches play an eminent role in plant development [14]; a phenomenon that is often parameterized in procedural models to develop convincing branching structures [67].
Ecosystems: various works focus on ecosystem simulation. The seminal paper of Deussen et al. [15] introduced a competition for resources on the plant level; this approach has recently been extended towards the competition of individual trees in layered ecosystems [39]. Various approaches attempt to simulate ecosystems considering different phenomena, such as erosion [11], or even by locally learning plant distributions and using them as interactive brushes [16, 19]. Close to our approach is the work of Benes et al. [5] that models urban ecosystems by combining wild ecosystem growth from [15] with controlled plant management. However, contrary to our work, the initial plant placement is purely ad hoc, and their approach does not allow for procedural plant placement that could be connected with real cities.
Learning-based Approaches: some works have started to explore the capabilities of learning-based methods for scene generation and object placement. While neural networks have shown impressive performance on image classification, synthesis [31, 84], or inverse texture modeling [24] tasks, properly placing objects into meaningful configurations is still a challenging problem. For arranging scenes, methods need to coherently generate plausible and continuous poses (translation and orientation) of objects relative to one another. However, most neural network architectures only operate on fixed-size inputs and outputs, which makes placing arbitrary numbers of objects challenging. To this end, Ritchie and Wang [62], as well as Wang et al. [78], propose methods for scene generation based on convolutional neural networks, while Zeng et al. [87] reconstruct buildings by learning parameters of a procedural model. For outdoor scenes, Guerin et al. [21] and Kelly et al. [30] use generative adversarial networks to author textures for terrain and building details.

While these methods only tangentially relate to our work, they show the capabilities of neural networks for scene generation. Similarly to these methods, we combine the advantages of image-based learning techniques with procedural modeling. In particular, we aim at learning the parameters of procedural models with neural networks, which allows us to realistically place plants.
Generating plausible vegetation models for virtual urban landscapes faces two major challenges. First, plant placement varies across different functional and demographic zones (Fig. 2a): an industrial zone may only have a small number of non-managed plants, while residential areas have regularly placed trees not only alongside roads but also in gardens and parks. The planting rules depend on culture, habits, city regulations, etc., and are thus difficult to quantify. Second, plant models need to simulate growth and interaction with their environment to generate vegetation with high visual fidelity. Moreover, urban trees are often pruned or may lack resources (water or light), which hinders their growth and affects their structure.

To address these challenges, we propose a two-stage procedural modeling pipeline. First (Fig. 2), we introduce PPMs (b) to generate plausible plant positions based on placement strategies and known planting rules for vegetation. A PPM can be defined for each functional or demographic zone of a city (e.g., residential, commercial, or industrial) and operates on single lots of land (realty). Each PPM has a different set of rules parameterized by structural and positional parameters that allow us to capture the various kinds of planting
Fig. 2. To place vegetation in urban environments, we propose procedural placement models (b) that implement placement strategies for vegetation based on the geometry of individual lots, positional parameters, and a zone identifier (a). After plant positions (c) have been generated, we use a developmental model (e) along with structural parameters (d) to jointly grow plants, which results in realistic 3D plant models.

patterns found in real cities. Second, once the plant positions are generated, we use a state-of-the-art developmental model (Fig. 2, e) for growing plants. Given the plant's location and environment, the growth process generates unique and realistic branching structures.

Finally, we have developed a novel learning-based pipeline for populating models of real cities with vegetation. First, we convert satellite images of urban landscapes to vegetation coverage maps by using a style-transfer network (Fig. 14, b). The coverage maps represent areas that are covered with above-ground vegetation. Second, we learn a mapping from the coverage maps to the parameters of our PPMs (Fig. 2, d). Given our pipeline and the parameter values obtained from real satellite images, we can generate vegetation similar to what can be observed in the satellite images.
Landscapes are defined by road networks and administrative or functional zones [76], and they can be further classified into rural, exurban, suburban, and urban areas [43]. All the involved plants together form the urban forest, an umbrella term referring to trees, shrubs, and bushes found in urban and suburban areas.

The most common way of adding a tree to an urban forest is by replacing a dead tree. It is quite uncommon that large areas are directly populated by vegetation; such a situation usually occurs only in newly created developments. When a new neighborhood is built, a city would plant regularly spaced trees and bushes parallel to roads and sidewalks by applying municipal tree ordinances [20] (see also [43, pg. 254]). The neighborhood is subdivided into blocks and blocks into individual lots that are left to the owners to plant with vegetation as needed. Typically, the city only defines certain planting rules, such as that the distance between individual trees should depend on the tree height, or that the distance is derived from the soil the tree requires to survive [17]. Trees should not obstruct views at intersections, and they should keep a certain distance from the curb and sidewalks [7]. Vegetation must not block house entrances for emergency purposes. These functional restrictions are also combined with aesthetic constraints: vegetation should not be planted in the proximity of windows [43]. Most of these rules are combined into a so-called building activity area (or building envelope) that is an extension of the 2D projection of the building by about 600 cm perpendicularly from each wall of the building and 150 cm from each driveway.

At the lowest level, we create procedural placement models that seek to position all trees at once. Implementing the above-described spacing rules results in vegetation that is regularly placed along
roads and sidewalks and that follows a so-called Poisson-disk distribution (minimal distance requirement) among plants and from the building envelope. We further expand the building envelope to consider aesthetic criteria, such as that planted vegetation should not obstruct views from windows.

At a higher level, we aim to procedurally generate vegetation for the various types of zones. Therefore, we assume that each urban layout, either real or synthetically generated, can be divided into such zones. Specifically, we use a zonal layout commonly used in urban planning [75, 76] and urban simulations [72, 73, 80] and divide an urban layout into five zones: (1) residential includes houses and buildings where people live; (2) commercial consists of businesses such as department stores, malls, and small stores; (3) industrial zones include factories and other production services; and (4) street zones, which describe areas next to roads. Additionally, we add a fifth category, (5) other, that includes parks, non-managed areas, areas close to railroads, unassigned areas, etc.

As shown in Fig. 3, we further assume that a city layout is organized as individual lots, where each lot represents a property that may be occupied by a building. Given a lot and its zone type, we then define a PPM that places vegetation individually into each lot.
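The five-zone taxonomy and the per-lot dispatch can be captured in a small data model. The following sketch is illustrative only: the class names, fields, and parameter values are our own placeholders, not the paper's actual implementation.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Zone(Enum):
    """The five municipality zone types used to select a PPM."""
    RESIDENTIAL = auto()
    COMMERCIAL = auto()
    INDUSTRIAL = auto()
    STREET = auto()
    OTHER = auto()

@dataclass
class Lot:
    """A single lot: a property polygon, its buildings, and its zone."""
    boundary: list                                  # lot polygon as [(x, y), ...]
    buildings: list = field(default_factory=list)   # building polygons
    zone: Zone = Zone.OTHER

# One placement-model configuration per zone: a strategy name and
# zone-specific parameter values (all values here are illustrative).
ZONE_CONFIG = {
    Zone.RESIDENTIAL: {"strategy": "boundary",    "density": 0.8},
    Zone.COMMERCIAL:  {"strategy": "boundary",    "density": 0.4},
    Zone.INDUSTRIAL:  {"strategy": "random",      "density": 0.2},
    Zone.STREET:      {"strategy": "equidistant", "density": 0.9},
    Zone.OTHER:       {"strategy": "cluster",     "density": 0.6},
}

def select_ppm(lot: Lot) -> dict:
    """Pick the placement configuration for a lot based on its zone."""
    return ZONE_CONFIG[lot.zone]
```

Keeping the zone-to-model mapping in one table mirrors how the paper varies parameter values per zone while reusing the same strategies.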
Vegetation for an urban landscape is generated in two steps: first, we apply a PPM to seed plants individually for each zone according to its functional type. After the plants have been placed, we use a developmental model that dynamically grows the plants in their locations while they interact with the surrounding environment. This allows a plant to adapt to its environment, e.g., by bending and shedding branches due to the competition for resources, resulting in vegetation with high visual fidelity.
Our goal is to model the variance of plant morphology and plant placement across municipality zones to realistically distribute vegetation in an urban landscape. Defining and parameterizing rules for obtaining plausible plant positions, while at the same time adhering to urban features such as buildings and streets, is intractable. Therefore, we factorize the problem into specifying placement models for the different zones from Sect. 4 (industrial, commercial, residential, street, and other) and for each individual lot (Fig. 4).

This factorization allows us to define a more manageable parameterization along with placement strategies for the different zones. Each placement model defines a concise strategy to place vegetation into a single lot. For example, we have models to place vegetation randomly, along the edges of a lot, equidistantly, etc. Moreover, we define the PPMs in a context-sensitive way. This means that, to maintain a global appearance, a PPM can query adjacent lots to adjust its parameters (e.g., the distance between trees alongside a road in one lot should be the same in the neighboring lot). A PPM is defined as the tuple

M = ⟨S_g, P_p, P_s⟩, (1)

where S_g is a function implementing a placement strategy (rules) with g ∈ {R, B, C, E, S, I} (see Sect. 5.2 and Tab. 2), P_p is a set of positional parameters to define the placement of plants, and P_s is a set of structural parameters for the morphological appearance of vegetation within the lot.

Fig. 3. Urban layout: satellite images (left), zone data for individual lots (middle), and coverage maps (right) are available in public datasets. We use zone data and lot geometry as inputs to our procedural models and learn to predict their parameter values from the coverage maps.

Fig. 4. Placement strategies illustrated for a single lot: random (a), boundary (b), cluster (c), equidistant (d), single (e), regular (f). While the PPM only places plant positions within a lot, the resulting plants can grow over its boundaries. For the strategies random, boundary, cluster, and single, yellow areas indicate where a plant can be placed (active areas). Circles represent plant positions with their radius. (e), (f): lots and buildings are represented as polygons P_L and P_H. To identify the area within a lot that can be used to place vegetation, we subtract the building polygons from the lot polygon, resulting in the final polygon P (g).

Lots and buildings are defined as 2D polygons (possibly concave and with holes): P_L = {V_L, E_L}, P_H = {V_H, E_H}, where V_L and V_H denote the vertices of a lot (L) and buildings (H), and E_L and E_H the edges of the polygon for lot and building, respectively. A lot can include multiple buildings (or other structures): U = {P_H^i}. The polygon P = P_L − P_H^i, ∀ P_H^i ∈ U, defines the area of a lot that can be covered by vegetation (Fig. 4, e-g); the PPM only places vegetation within the geometric shape of the polygon P. A set of plant positions for a single lot is then generated as

X = S_g(V_p, V_s, P, Z, K), (2)

where V_p and V_s denote the parameter values for positional P_p and structural P_s parameters, P is the polygon of a single lot, Z is a zone identifier, and K is the context of a lot. We use Z to select parameter values for a lot. For example, a residential and a commercial lot may use the same strategy (e.g., boundary) but differ in their parameter values (e.g., different species are used). This is illustrated in Fig. 5.

Fig. 5. Given a lot, we use a placement strategy to define the placement of vegetation. The zone identifier Z is used to select parameter values for structural V_s and positional parameters V_p. Together, strategies and parameters allow us to generate vegetation with a globally similar appearance depending on the municipality zones within a city.

Generating vegetation with the same value for Z produces a uniform appearance (the same settings are used for every lot), while varying Z with the functional zones generates a diverse and yet coherent appearance. Put differently, Z allows us to control the placement of vegetation on a global scale. Finally, we use K to modify the input parameters according to the neighbors of a lot to allow for a consistent global appearance, as detailed in Sect. 5.5.

To summarize: a PPM defines a placement strategy along with structural and positional parameters for populating single lots. Varying the values of these parameters generates different plant positions within the constraints of the strategy at a local scale, while changing the parameters jointly, e.g., based on zoning types, allows us to vary vegetation at a more global scale.

A placement strategy g ∈ {R, B, C, E, S, I} defines rules for placing the plants and how the parameters are used. Specifically, we define the following strategies: (1) random placement within an entire lot, (2) along lot boundaries, (3) clustered distribution, (4) equidistant along the medial axis of a lot, (5) placement of a single plant, and (6) regular placement within an entire lot.

To implement the different strategies, we compute active areas within each lot that define where the vegetation can be placed. For the strategies random and single, the entire lot polygon is used, while for the strategies boundary and cluster we define active areas within the polygon; i.e., we define a band along the edge of the polygon towards its center for boundary, and a circular area around a randomly selected point within the polygon for cluster. For equidistant, we compute the medial axis of the polygon and then generate equidistant plant positions along the axis. The strategy single defines the random placement of a single plant within the entire lot.
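The active areas start from the plantable region obtained by subtracting building polygons from the lot polygon. A minimal membership test for this region, using even-odd ray casting for point-in-polygon as a simplified stand-in for a full polygon subtraction, might look as follows:

```python
def point_in_polygon(pt, poly):
    """Even-odd ray casting: is pt inside the polygon (list of (x, y))?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray
            xc = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xc:
                inside = not inside
    return inside

def plantable(pt, lot_poly, building_polys):
    """pt lies in P = P_L minus the union of building polygons P_H^i."""
    if not point_in_polygon(pt, lot_poly):
        return False
    return not any(point_in_polygon(pt, b) for b in building_polys)
```

A production system would compute the subtracted polygon P explicitly (e.g., with a polygon-clipping library); the per-point test above is only meant to make the set definition concrete.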
Finally, for regular we compute a lot-aligned lattice and place plants at the center of each cell. Fig. 6 shows four of our six placement strategies and their parameter variations.

Fig. 6. Variations of positional parameters on a single lot with different placement strategies. (a)-(c): strategy boundary with narrow (a) and wide (b) boundary size, and lower density (c). (d)-(f): strategy cluster with a single cluster (d) and multiple clusters (e) of different sizes (f). (g)-(i): strategy regular with no (g), medium (h), and high (i) jitter. (j)-(l): strategy random with low (j), medium (k), and high (l) density.

The placement strategies are parameterized by the positional parameters shown in Tab. 1. For the placement strategies random, boundary, and cluster, we use a method called variable-radii Poisson-disk sampling [45] to generate plant positions within the active areas of a lot. More specifically, we are interested in generating a set of points X with spatially varying point density. A new position sample y is assigned a radius r(y): Ω → N(µ, σ), where N denotes a normal distribution with mean µ and variance σ. The new position sample y is accepted and added to the set if |y − x| ≥ r(x) + r(y) ∀ x ∈ X.

For the boundary placement strategy we further define the boundary size as parameter β. We use β to define an area along the normal of the edge of a polygon towards its center. To implement the cluster strategy, we randomly sample a number of points in a lot and define each cluster area as a circle with radius κ. A lot can have a variable number of clusters, with the maximum number defined by π. For both strategies, boundary and cluster, we first compute the active regions (boundary band, cluster circles) before generating sample positions. For the strategy single we sample one position somewhere in the lot.

To allow for more regular vegetation placement we define the strategies regular and equidistant.
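The acceptance rule |y − x| ≥ r(x) + r(y) can be illustrated with brute-force dart throwing; the published sampler [45] is considerably more efficient, so the version below is only a sketch of the acceptance criterion:

```python
import math
import random

def variable_radii_poisson(width, height, mu, sigma, attempts=2000, seed=0):
    """Dart-throwing sketch of variable-radii Poisson-disk sampling:
    each candidate y gets a radius r(y) ~ N(mu, sigma) and is accepted
    only if |y - x| >= r(x) + r(y) for every already accepted sample x."""
    rng = random.Random(seed)
    samples = []  # list of accepted (x, y, r) tuples
    for _ in range(attempts):
        px, py = rng.uniform(0, width), rng.uniform(0, height)
        r = max(0.1, rng.gauss(mu, sigma))  # clamp to keep radii positive
        ok = True
        for qx, qy, qr in samples:
            if math.hypot(px - qx, py - qy) < r + qr:
                ok = False
                break
        if ok:
            samples.append((px, py, r))
    return samples
```

The O(n²) rejection test would be replaced by a spatial grid in practice; the clamping of negative Gaussian samples is our own safeguard, not part of the original method.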
For strategy regular, we compute a regular lattice based on the bounding box of a lot and define the size of cells with ω and their orientation with η. We optionally jitter the positions using ψ within each cell.

Table 1. Positional and Structural Parameters for PPMs
Parameter | Meaning | Range
Positional:
µ | Tree envelope mean | 1 m - 10 m
σ | Tree envelope variance | 0.1 - 2
τ | Vegetation density | 0 - 1
β | Boundary size | 0 m - 5 m
κ | Cluster radius | 1 m - 20 m
π | Max. number of clusters | 0 - 5
ω | Regularity grid size | 5 m - 50 m
ψ | Regularity jitter | 0 - 1
η | Regularity orientation | 0 - 180°
δ | Equidistant spacing | 0 m - 10 m
ξ | Radius of context | 0 m - 300 m
Structural:
α | Max. plant age | 0 - 100 years
ρ | Tree vs. shrub ratio | 0 - 1
θ | Species diversity | 0 - 1
γ | Pruning factor | 0 - 1
λ | Num. species | 1 - 10
Table 2. Placement Strategies and Used Positional Parameters.

Strategy | Symbol | Positional parameters used
Random | R | µ, σ, τ, ξ
Boundary | B | µ, σ, τ, β, ξ
Cluster | C | µ, σ, τ, κ, π, ξ
Equidistant | E | µ, σ, τ, δ, ξ
Single | S | µ, σ, τ, ξ
Regular | I | µ, σ, τ, ω, ψ, η, ξ

To implement strategy equidistant, we first compute the medial axis of the lot polygon P_L [10] and then use it to equidistantly place plant positions along the axis based on the distance parameter δ. We model the density of vegetation for all placement strategies by defining the parameter τ, which deactivates position samples in X: a value of τ = 1 retains all generated positions, while smaller values deactivate a corresponding fraction of them. Finally, we define the radius ξ for the context K of a lot. The context is defined as the adjacent lots, and we use it to model context-sensitivity (see Sect. 5.5).

The positional placement of plants should also account for the planting rules discussed in Sect. 4; i.e., trees should not be too close to buildings and should not obstruct doors and windows. We adopted the concept of building envelopes [43] that defines the distances from the buildings. Moreover, we extend the envelope in front of doors and windows to avoid their blockage. The example in Fig. 7 shows the effect of using the building envelope.

Tab. 1 summarizes all positional parameters along with their ranges, while Tab. 2 shows our placement strategies along with the positional parameters that are used for each of them. Examples of changing the values of positional parameters are shown in Fig. 6.

We define structural parameters to model the morphology of individual trees as well as the plant population within a lot. Based on the computed plant positions, we define a plant seed as the tuple

T = ⟨p, α, ϕ, γ⟩, (3)

where p ∈ X is the plant position, α its maximum age, ϕ denotes a species identifier, and γ is a pruning factor. To generate branching structures, we grow a plant with a developmental model (see Sect. 5.6) and jointly simulate its growth with all other plants in a lot.

We define a number of species (n =
10) for the whole urban landscape by selecting parameter values for our developmental
Fig. 7. Left: the building envelope (blue) defines a zone where plants cannot be planted, to avoid proximity to walls and the blockage of doors and windows. Right: plant placement without considering the building envelope.

model [53]. We then use the species identifier ϕ to associate one of the species with a seed. We further control this selection by using the parameter ρ, which defines the tree vs. shrub ratio in a lot: a value of ρ = 1 places only trees, while ρ = 0 places only shrubs. The species diversity within a lot is controlled by the parameter θ. We randomly select one of the species as the dominant species in a lot and use θ as a ratio to control the number of seeds associated with the dominant species and all other available species. A value of θ = 0 assigns all seeds to the dominant species, while larger values increase the share of the other species. Furthermore, urban trees are often pruned into specific shapes, e.g., along avenues or highways. We model this by defining a pruning volume around a tree; branches that reach out of the volume are cut off. We scale this volume by γ: a value of γ = 1 leaves the volume at its full size, while smaller values shrink it and thus prune the tree more severely.

So far, lots have been treated as individual units without any mutual relationship. However, each lot has its context, which consists of its surrounding roads and neighboring lots. The neighbors often share similar planting rules that are imposed by the applicable municipal tree ordinances [20, 43]. We want to define planting rules in a way that considers this context.

While context-sensitive plant seeding has not been addressed before, there is a body of related work on the environmental sensitivity of individual plants, which is closely related to a plant's ability to adapt to varying conditions; e.g., it may bend its branches against gravity (gravitropism) or grow towards the brightest spot (phototropism). A plant optimizes different functions by using this plasticity. Context-sensitivity can be proceduralized as context-dependency, for example by using environmental query modules in Open L-systems [48].
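The per-seed assignment of species, kind, and age described above can be sketched as follows. The sampling scheme (uniform age range, Bernoulli draws for ρ and θ) is our own simplified interpretation, not the paper's exact procedure:

```python
import random

def assign_seeds(positions, rho, theta, species, max_age=100.0, gamma=1.0, seed=0):
    """Create one seed T = (p, alpha, phi, gamma) per plant position.
    rho   -- tree vs. shrub ratio: fraction of seeds that become trees
    theta -- species diversity: chance to pick a non-dominant species"""
    rng = random.Random(seed)
    dominant = species[0]             # dominant species of this lot
    others = species[1:] or species   # fall back if only one species exists
    seeds = []
    for p in positions:
        kind = "tree" if rng.random() < rho else "shrub"
        phi = rng.choice(others) if rng.random() < theta else dominant
        alpha = rng.uniform(0.5, 1.0) * max_age  # maximum age in years
        seeds.append({"pos": p, "age": alpha, "species": phi,
                      "pruning": gamma, "kind": kind})
    return seeds
```

Each resulting record carries everything the developmental model needs to grow the plant at its position.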
Fig. 8. Close-up renderings of detailed plant models. As our method relies on an environmentally sensitive developmental model, we can produce detailed branch geometry that adapts to buildings and other plants.

Fig. 9. Variations of structural parameters. Top row: variations of the age parameter from young (left) to old (right). Middle row: changes of the tree-to-shrub ratio from only shrubs to mostly trees. Bottom row: variations of species diversity from a single species (left) to multiple species (right).

Fig. 10. Context-sensitivity: we calibrate the parameter values of a lot with those of adjacent lots (context). Here we show two lot configurations with the regular placement strategy and variations over the parameters µ and σ. For the lots shown in the top row, context-sensitivity is turned off and plant placement changes abruptly from one lot to another, while for the bottom row we show context-sensitivity across lots and the resulting calibration of parameters (context radius ξ).

Fetching the context values is a two-pass method: first, the context is queried; then the values are interpreted by the procedural system. Inspired by this previous work, we generalize the context dependency to PPMs. Let us recall that each PPM from Eqn. (1) has an associated placement strategy S_g and two sets of parameters, P_p and P_s. Each lot has a set of parameter values from Eqn. (2), V_p and V_s. Moreover, it considers the context (i.e., the neighborhood) K of the lot that is being populated with plant positions, Eqn. (2). Let us denote a particular lot of interest L and its parameter values V_L. In the following text, we will omit the lower indices s and p, because the parameters are calculated in the same way. The context is the set of lots within radius ξ centered on the lot L and weighted by a 2D Gaussian. The values of the corresponding parameters (see Tab.
1) of the neighbors and the lot L are weighted according to thedistance resulting in a context-updated parameter set ˜ V L as:˜ V L = (cid:213) ∀ V K ∈K w (cid:16) d ( L , L K ) (cid:17) V K , (4)where w (cid:16) d ( L , L K ) (cid:17) is the Gaussian-weighted distance of the centerof the lot L from the center of the lot L K within the investigatedcontext and V K are the values of the parameters of the lot L K . Theupdated parameter values ˜ V L are then used for the PPM.Note that this process can be considered as a diffusion of theparameters within radius ξ . Also, to avoid a race condition whenone lot serves as a context of another one and vice versa, we calculatethe context-updated parameters ˜ V L into a different map. In thisway, the calculation does not depend on the order of the lot selectionand can be also done in parallel. Note that if we would apply Eqn. (4)multiple times, the values of the parameters would be smoothed outinto an average over the entire layout.An example in Fig. 10 shows the effect of using context sensitivityon a regular placement of trees. The first row shows two lots withregular tree placement with an abrupt change to a random place-ment in neighboring lots that is smoothed out into a semi-randomtransition when the context is used (bottom row). After generating plant positions, we jointly grow the plants in thecomputed locations of a single lot. Our developmental model isbased on the work of Palubicki et al. [53]; a tree is a modular system(leaves, buds, stems, and internodes). An internode is a plant stembetween two or more leaves and a tree is composed of a successionof internodes.The primary plant development is controlled by the expansionof buds that are either apical (terminal) or lateral (axial). Branchesexpand at their tips by expanding their apical buds or on sides bygrowing lateral buds. Buds use signaling by the growth hormoneAuxin to prevent overgrowth and to control apical dominance [28]. • Niese, T. et al.
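The context-sensitive update of Eqn. (4) can be sketched as follows. Eqn. (4) leaves the normalization of the Gaussian weights implicit; the sketch below normalizes them explicitly so blended parameters stay in range. This is a minimal illustration, assuming each lot is reduced to its center point and a single scalar parameter (function names such as `context_update` are ours):

```python
import math

def gaussian_weight(d, xi):
    """Unnormalized 2D-Gaussian weight for a center distance d; the standard
    deviation is tied to the context radius xi (assumed: xi covers ~3 sigma)."""
    return math.exp(-(d * d) / (2.0 * (xi / 3.0) ** 2))

def context_update(lots, xi):
    """Eqn. (4): blend each lot's parameter with Gaussian-weighted values of
    lots inside the context radius. lots is a list of (center, value) pairs.
    Results go into a separate list, so the update is order-independent and
    could run in parallel, as described in the text."""
    updated = []
    for cx, vx in lots:
        num, den = 0.0, 0.0
        for ck, vk in lots:
            d = math.dist(cx, ck)
            if d <= xi:                  # only lots inside the context radius
                w = gaussian_weight(d, xi)
                num += w * vk
                den += w
        updated.append(num / den)        # normalized weighted average
    return updated

# Two lots with very different parameter values are pulled towards each
# other, smoothing the abrupt transition illustrated in Fig. 10.
lots = [((0.0, 0.0), 1.0), ((10.0, 0.0), 0.0)]
print(context_update(lots, xi=15.0))
```

Writing the result into a separate list mirrors the separate map used in the paper: no lot reads a value that was already updated in the same pass.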
Fig. 11. Top row: a tree grown under different environmental conditions; from left to right: alone, together with another tree, and close to a set of buildings. Bottom row: the growth response of a group of trees in an urban environment generates complex and unique branching structures.

Fig. 12. Pruning of branches allows for the adjustment and organization of tree form. Here trees along a street are severely pruned to form a hedge (parameter γ).

Secondary plant development (cambial growth) is the thickening of the tree trunk and branches [32], simulated by expanding their radii using da Vinci's rule (see [44] for a discussion). Trees compete for space by seeking light (phototropism) and by avoiding collisions and overcrowding. Many algorithms have been proposed to capture plant competition for resources (see [48, 63] and [55] for an overview). We use the approach of [63], later extended by [53], in which space occupancy is represented by randomly scattered particles that attract growing branches. We also simulate phototropism by computing the illumination of buds and bending the growth direction towards the brightest spot visible from a bud. Apical control and branching parameters are simulated using the growth model from [66] with its set of parameters.
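The space-colonization mechanism of [63] can be illustrated with a minimal sketch (our simplification: 2D points, one growth iteration per call; all names are ours, not the paper's implementation): every scattered attraction particle pulls its nearest branch tip, each tip grows a fixed step towards the mean direction of its attractors, and particles whose space becomes occupied are removed.

```python
import math

def colonize_step(tips, particles, step=0.5, kill_dist=0.6):
    """One iteration of space colonization: each attraction particle pulls its
    nearest tip; each attracted tip grows one step towards the normalized mean
    direction of its attractors; reached particles are removed (occupied space)."""
    pull = {i: [0.0, 0.0] for i in range(len(tips))}
    for px, py in particles:
        i = min(range(len(tips)), key=lambda j: math.dist(tips[j], (px, py)))
        tx, ty = tips[i]
        d = math.dist((tx, ty), (px, py))
        if d > 0:
            pull[i][0] += (px - tx) / d
            pull[i][1] += (py - ty) / d
    new_tips = []
    for i, (tx, ty) in enumerate(tips):
        vx, vy = pull[i]
        n = math.hypot(vx, vy)
        if n > 0:                        # tip is attracted: grow one step
            new_tips.append((tx + step * vx / n, ty + step * vy / n))
        else:                            # no attractors: tip stays put
            new_tips.append((tx, ty))
    remaining = [p for p in particles
                 if min(math.dist(p, t) for t in new_tips) > kill_dist]
    return new_tips, remaining
```

Buildings and neighboring trees reduce the set of free particles, which is how the environmental adaptation shown in Fig. 11 emerges in this family of models.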
Learning plant positions directly from image data is a challenging problem that cannot be easily addressed by existing neural network architectures or other methods. To obtain plant positions in an end-to-end manner, a network would have to either output a variable number of plant positions or operate on a fixed-size domain, such as an image. The latter requires obtaining plant positions in a post-processing step, which is error-prone. Furthermore, generating ground truth data pairs of satellite images and plant positions (e.g.,
GPS coordinates) for training a neural network is challenging (see Sect. 8.3 for a discussion). Moreover, an end-to-end deep learning-based system would sacrifice the in-depth understanding of the underlying mechanisms and would not allow for the low-level control that is needed for interactive editing.

Therefore, to recover the placement and appearance of real urban landscapes, we aim to learn the distribution of plants in the space of our positional parameters. This has the advantage that our above-defined PPMs act as a prior, which helps to regularize the training of our network and, in turn, to generate plausible plant positions. Furthermore, learning the parameters of a procedural model maps images to a set of comprehensible and intuitive parameters, which provides an efficient way to further edit plant placements.
We use a two-stage neural network pipeline to learn the parameters of our PPMs: first, we translate satellite images to semantic maps that describe vegetation coverage (Fig. 14, a-c). Second, we learn the positional parameters from coverage maps with a light-weight convolutional neural network (Fig. 14, d, e). This pipeline has the advantage that we do not need pairs of satellite images and positional parameters for training, but instead pairs of coverage maps and positional parameters, which can be generated synthetically with our PPMs.

To translate satellite images to coverage maps we use a style-transfer deep neural network [27]. A coverage map is a flat-colored image where every pixel color encodes whether the corresponding pixel in the satellite image represents vegetation. Coverage maps have less complex visual traits and are similar for real and synthetic data; therefore, the network is able to learn this transfer. We used pairs of satellite images and coverage maps that are publicly available for some cities [50] to train the style-transfer network to predict coverage maps from satellite images. This allows us to obtain coverage maps of cities for which coverage data does not exist. Fig. 21 (Appx. B) shows examples of training data and generated coverage maps.

We then train a neural network to obtain positional parameter values (µ, σ, τ, β, κ, π; see Tab. 1) from the coverage maps. Training is done on synthetically generated pairs of coverage maps and positional parameters obtained from our PPMs. Specifically, we denote a generated coverage map as q ∈ Q, for which we know the corresponding positional parameters P_p ∈ U. The network can thus be defined as f(q) : Q → U. To summarize: stage one of our pipeline allows us to learn coverage maps from satellite images, which, in stage two, allows us to obtain the positional parameters of our PPMs.
Together, this enables us to generate positions of vegetation for individual lots with characteristics similar to those observed in the satellite imagery (e.g., plant distance, density, etc.). Once the parameters are generated, we stencil the coverage map with the geometry of each lot and identify the areas where vegetation needs to be placed for a reconstruction. We convert these regions into polygons and then use our PPMs to generate plant positions within the identified areas of a lot (Fig. 13). As a coverage map defines the areas where vegetation should be placed within a lot, it replaces the placement strategies introduced in Sec. 5.2; the areas defined by the coverage map are populated with a random strategy along with the learned positional parameters defined in Tab. 1.
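The stenciling step can be sketched as follows; a minimal illustration, assuming the coverage map is a boolean grid and the lot is an axis-aligned rectangle (real lots are polygons; names such as `place_in_coverage` are ours):

```python
import random

def place_in_coverage(coverage, lot, n, seed=0):
    """Sample n plant positions uniformly from the covered cells of a boolean
    grid that lie inside the lot rectangle (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = lot
    active = [(x, y)
              for y, row in enumerate(coverage)
              for x, covered in enumerate(row)
              if covered and x0 <= x < x1 and y0 <= y < y1]
    rng = random.Random(seed)
    # jitter inside the chosen cell so positions are not grid-aligned
    return [(x + rng.random(), y + rng.random())
            for x, y in (rng.choice(active) for _ in range(n))]

# a 4x4 coverage map with a 2x2 vegetated patch in the middle
coverage = [[c == "#" for c in row] for row in ["....", ".##.", ".##.", "...."]]
print(place_in_coverage(coverage, lot=(0, 0, 4, 4), n=5))
```

This corresponds to the random strategy restricted to the active regions of a lot; the learned positional parameters would additionally shape the sampling.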
Fig. 13. Vegetation placement based on real data: we use vegetation coverage maps (middle) to identify active regions for individual lots and populate them with our PPMs. This allows us to generate plant distributions (right) similar to what can be observed in satellite images (left).

Fig. 14. Neural network pipeline: we use a style-transfer network (b) trained on data pairs from NYC Open Data [50] to convert satellite images (a) to coverage maps (c). To learn parameter values for our PPMs (for which no ground truth data for satellite images exist), we generate pairs of coverage maps and parameter values with the PPMs of our framework. We then train a CNN (d) to obtain parameter values (e) for the estimated coverage maps of the real satellite images.
For training the Pix2Pix style-transfer network we rely on the publicly available Python implementation of the original model. We train the network on 20K pairs of satellite images and coverage maps provided by NYC Open Data [50]. We use the default hyperparameter settings for Pix2Pix [27]; the network converged after training for 200 epochs. We then use the network to convert satellite images of urban landscapes to coverage maps. The geometry of single lots is also obtained from NYC Open Data. Our urban modeling framework operates on longitudinal and latitudinal coordinates, which allows us to register satellite images, lot data, and coverage maps, and in turn to render satellite images and publicly available maps (e.g.,
Open Street Maps) in the same framework.

Our regression CNN consists of five convolutional layers (32 units each) followed by two dense layers (64 units each), with ReLU activations for all but the last layer. We use our PPMs to synthetically generate 21K (coverage map, positional parameter) pairs to train the network. To regress the positional parameters we use the mean squared error as the loss function and achieve 95% accuracy when predicting the parameters. We use an 80%–20% split for training and testing data. All results shown in the paper are generated from validation data.
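The synthetic generation of training pairs can be illustrated with a toy sketch (our drastic simplification of a PPM: a single density parameter and square one-cell crowns; this is not the paper's actual parameter set or rasterizer):

```python
import random

def synthetic_training_pair(size=16, seed=0):
    """Generate one (coverage map, positional parameter) training pair with a
    toy PPM: sample a density parameter, scatter that many plants at random,
    and rasterize one cell of canopy cover around every plant."""
    rng = random.Random(seed)
    density = rng.uniform(0.05, 0.3)     # the parameter a CNN would regress
    n = max(1, int(density * size * size))
    cover = [[0] * size for _ in range(size)]
    for _ in range(n):
        px, py = rng.randrange(size), rng.randrange(size)
        for y in range(max(0, py - 1), min(size, py + 2)):
            for x in range(max(0, px - 1), min(size, px + 2)):
                cover[y][x] = 1          # cell covered by a plant crown
    return cover, density
```

Repeating this with different seeds yields arbitrarily many labeled pairs, which is why no ground-truth parameters for real satellite images are needed.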
Our interactive framework for modeling and rendering urban landscapes was implemented in
C++ and OpenGL. All results have been generated on an Intel Core i7-7700K (8 × 4.2 GHz) with 32 GB RAM and an NVIDIA GeForce RTX 2080 GPU (12 GB RAM).

The most demanding online task is the generation of tree geometry. We simplify this by representing trees by their skeletons, which are generated on the CPU. We further offload the mesh generation of the branch surfaces to a geometry shader on the GPU. Similarly, leaves are generated on the fly as textured quads. Buildings and other structures are rendered as extruded outlines. While we cannot render large plant populations in real time, our framework allows us to interactively explore placement strategies and parameter settings. To render large scenes (e.g.,
Fig. 20) we use a simple level-of-detail scheme that successively replaces tree geometry with billboards and point primitives based on the distance to the camera. Appx. A (Tab. 3) shows the parameter values for most figures in the paper.
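Such a level-of-detail selection can be sketched as a distance-threshold test (the thresholds below are illustrative placeholders, not the values used in our renderer):

```python
def lod_for_tree(distance, full_geom_max=150.0, billboard_max=600.0):
    """Select a representation for a tree based on its camera distance:
    full branch geometry nearby, a textured billboard at medium range,
    and a point primitive far away."""
    if distance < full_geom_max:
        return "geometry"
    if distance < billboard_max:
        return "billboard"
    return "point"
```

Because the skeleton is kept on the CPU, switching a tree between these representations does not require regenerating its branching structure.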
We have shown that PPMs can be used to automatically place vegetation into urban landscapes based on lot data. The geometry of individual lots can either be obtained from publicly available datasets or, for synthetically generated urban layouts, as part of the modeling process.

Moreover, PPMs operate on polygons and were designed with interactive authoring in mind. A user can simply use a brush tool to draw an area on a map. We then convert the sketch to a polygon and assign a PPM. Depending on its placement strategy, the PPM generates plant positions according to the geometry of the polygon (Fig. 15). Furthermore, a user can directly draw the vegetation coverage for individual lots or polygons. Similar to learning coverage maps from satellite images, a sketched coverage map replaces the placement strategy for a lot, and the PPM then places plants based on the positional and structural parameters, which provides a convenient way for more nuanced vegetation placement.

This process also allows us to define even more diverse zones if necessary. For example, it is possible to define individual zones for back and front yards, for the vegetation along streets, or even for parks. A key idea of our approach is to factorize the complexity of defining a complex procedural model into more manageable placement strategies. A PPM only operates on a single polygon and generates plant positions for this geometry. This makes it easy to extend our approach with new placement strategies.
Figs. 1, 18, and 20 show perspective and top-down renderings of urban landscapes along with the vegetation generated by our framework. For these results, we used coverage maps to reproduce vegetation placement similar to the real scenes. Fig. 17 shows results where we only used our procedural model, without additional coverage maps. In both cases, the produced plant populations show characteristic visual traits found in real distributions of vegetation at city scale. Based on our placement strategies, in combination with the positional and structural parameters, we can generate complex patterns of urban vegetation. Moreover, we show vegetation placements for the different municipality zones (residential, park, commercial) in Fig. 17. Positional parameters allow us to generate planting patterns as commonly
Fig. 15. A user can interactively sketch placement zones with a brush tool (left). Each placement zone is converted to a polygon and assigned a placement strategy to grow plants (right). Here we show the strategies medial axis (blue), single (yellow), and random (red).

Fig. 16. The placement of vegetation changes with the size of the active areas within a lot. While the used cluster strategy initially generates plants in the entire lot, transitioning to less available space due to a larger building (white) generates more organized plant positions at the boundary of the lot.

found in these areas, while we can also produce structural variations by selecting the number of species, their height, and their age (Fig. 9). Additionally, we can control the pruning of plants to generate more organized plant shapes (Fig. 12). In an urban setting, buildings often shade larger areas. Trees growing in these regions strive to grow out of the shadow toward the light. This interaction of a tree with other trees as well as with close-by buildings generates complex and unique branching structures. Figs. 11 and 8 show modeling results of trees grown under varying environmental conditions. Finally, Figs. 15 and 16 show the capabilities of our framework for the interactive authoring of urban landscapes. In Fig. 15, a user drew regions for vegetation onto the ground of an urban layout; each brush tool was assigned a different placement strategy and set of parameter values. Our method then converted the sketched areas to polygons and applied different PPMs. In Fig. 16, we show how the placement of vegetation changes when the size of a building on a lot increases. While a small building leaves more space for random plant configurations, the placement transitions to more organized plant positions as the size of the building increases.
To validate point distributions generated with our placement models, we ran a user study evaluating the perceived realism of plant distributions generated with our PPMs against real data. Furthermore, we measured the distance between generated and ground-truth point sets of plant positions.
We generated two sets of images for the user study, one with trees placed by our PPMs and the other based on real data. However, validation against satellite images is difficult, because it is not possible to fully reproduce their visual complexity. To avoid this bias, we rendered all images using our framework (examples are shown in Fig. 18). We identified 28 lots with varying plant placement, produced plant positions using all placement strategies for these lots, and rendered them as top-down images with our framework. We generated the real data by manually identifying plants in satellite images of these lots and marking their positions. We then loaded these positions into our framework and rendered both categories in the same rendering style to avoid bias. Furthermore, we chose top-down views for evaluating placement strategies, as this allows us to evaluate the respective distributions of plants.

We then performed a two-alternative forced choice (2AFC) test on the images. We generated pairs of images, one generated from the real data and one from each category of procedural data, leading to a total of 28 × 6 =
168 pairs of images. We randomly shuffled their placement (left-right) and their order. The image pairs were shown to 33 users on Mechanical Turk (MT), and we made sure that only MT masters (reliable users) answered the study. We asked the users "Which plant distribution looks more realistic (left or right)?" and each user had to choose one image. Each PPM category and the real data received multiple rankings from every user. Our tests indicate that the PPMs provide results that are perceptually consistent with real-world tree placement: the users selected the real placement as more realistic in 58% of the cases and the procedural placement in 42%.
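As a sanity check, such a 58% vs. 42% split can be compared against the 50% chance level with a normal approximation to the binomial (an illustration on aggregate counts only, not the study's per-user responses):

```python
import math

def two_afc_summary(chosen_a, total):
    """Preference rate for option A in a 2AFC test, plus a z-score of the
    count against the 50% chance level (normal approximation)."""
    p = chosen_a / total
    z = (chosen_a - 0.5 * total) / math.sqrt(total * 0.25)
    return p, z

# 58% 'real' preferences over 168 image pairs, as reported above
p, z = two_afc_summary(chosen_a=round(0.58 * 168), total=168)
print(f"real preferred in {p:.0%} of pairs, z = {z:.2f}")
```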
To quantitatively evaluate the generated point distributions, we measure the Chamfer distance (CD) (e.g., as used in [18]) between manually labeled plant positions and the procedurally generated ones. For each point in a point set, the CD finds the nearest point in the other point set and computes the sum of squared distances. A distance of 0 would indicate that the two point sets are identical. However, we do not aim to generate the exact same point set, but one with a similar distribution of positions. We used the manually labeled plant positions of 28 lots (Sect. 8.1) and generated random point sets as a baseline. We observe an average distance of 0.16 between manually labeled points (ground truth) and procedurally generated point sets that are supposed to show a similar distribution of positions, and a distance of 0.27 between the ground truth and random point distributions. While comparing point sets alone is not a conclusive metric for evaluating the similarity of procedurally generated and real vegetation, it shows that our model produces plant distributions that are closer to the ground-truth positions than random positions.
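The Chamfer distance described above can be computed with a brute-force sketch. The paper does not state its exact normalization, so the variant below (nearest-neighbor squared distances, summed in both directions and averaged per point) is one common choice:

```python
def chamfer_distance(a, b):
    """Symmetric Chamfer distance between two 2D point sets: for each point,
    the squared distance to its nearest neighbor in the other set, averaged
    per point and summed over both directions."""
    def one_way(src, dst):
        return sum(min((sx - dx) ** 2 + (sy - dy) ** 2 for dx, dy in dst)
                   for sx, sy in src)
    return one_way(a, b) / len(a) + one_way(b, a) / len(b)

pts = [(0.0, 0.0), (1.0, 0.0)]
print(chamfer_distance(pts, pts))                     # identical sets -> 0.0
print(chamfer_distance(pts, [(0.0, 0.5), (1.0, 0.5)]))
```

The brute-force nearest-neighbor search is quadratic in the number of points, which is sufficient for per-lot point sets of this size.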
Our framework allows us to place and simulate vegetation in urban landscapes. To this end, our focus was on generating convincing distributions of plants for synthetic and real city models. Because defining rules for all possible variations of plants in urban landscapes is intractable, we factorized the problem of placing plants into a number of placement strategies. Each strategy provides a concise set of rules along with parameters that describe the positional and structural properties of vegetation within individual lots. Together, placement strategies and parameters allow us to generate realistic distributions of plants within the functional zones of an urban layout.
Fig. 17. Top-down renderings of plant distributions for three municipality zones generated with different placement strategies. Top row: the placement strategies boundary, random, cluster, and regular for a residential lot of buildings. Middle row: the placement strategies boundary, cluster, and regular for the lot of a public park. Bottom row: the placement strategies cluster and boundary for a commercial lot (left) and the placement of trees with medial axis along streets with equidistant spacing δ (right).

Fig. 18. Given a satellite image (left), our method is able to generate plant populations (right) similar to what can be observed in the real scene. To compare the results of our procedural model, we manually labeled plants and used their longitude and latitude coordinates to render them at their real positions in our framework (middle). This allows us to evaluate the visual quality of synthetically generated plant positions compared to real plant distributions.

Fig. 19. Left: Google Maps view of New York (Central Park). Our framework generated two variations of plant placements (middle, right) for an initially empty city model. Middle: 54k plant positions were generated in about 60 seconds.

Fig. 20. Our framework enables us to efficiently place vegetation for large urban areas.
In addition, we use a state-of-the-art developmental model for plants to simulate their environmental response. We generate distributions of vegetation that resemble what can be observed in satellite imagery; our focus was not on precisely reconstructing every plant of a real environment. While this is arguably important, it would require further analysis (e.g., through deep learning) of satellite images and additional data sources, such as coverage maps. To this end, we think that procedurally generated vegetation can help to generate training data for more advanced analysis pipelines. Compared to manually placing vegetation, our method provides more control and better capabilities for the efficient authoring of vegetation placement for city models.

As an alternative to learning parameters with the neural network pipeline illustrated in Fig. 14, we experimented with learning plant positions with Pix2Pix [27] in an end-to-end manner. For this setup we used satellite images as input and images with plant positions and building geometry as output. The goal was to obtain the plant positions from the images in a post-processing step. Training this network was not successful for two reasons: first, it is difficult to obtain ground-truth data pairs of satellite images and plant positions. While some datasets contain GPS positions of trees, they only store these positions for trees along streets, which is not sufficient for learning plant positions of an entire city. Second, the results produced by the network were not satisfactory. We suspect that the ground-truth images were too sparse (i.e., too few tree positions and too little building geometry) to provide a meaningful training signal.

A limitation of our current implementation is that we are not able to obtain structural parameters with our learning pipeline. Structural parameters cannot be learned from coverage maps, and learning them from top-down satellite images was not successful.
Another limitation of our current approach is that we focus on medium and large trees and do not place smaller plants, such as flowers, bushes, or grass. While fixed models of flowers could be placed with our placement strategies (for example by using agent-based models [4]), there exists no integrated developmental model that would allow us to jointly develop trees and flowers. Therefore, we decided to only simulate the growth response of trees to their environment. Furthermore, we do not model plants that are shaped through advanced topiary. More research is required to explore how pruning affects growth, e.g., for hedges.
We have presented a novel framework for populating synthetic and real urban landscapes with vegetation. To this end, we introduced procedural placement models that allow us to realistically generate plant positions and to jointly grow individual plants within individual lots. The key idea of our approach is that complex patterns of vegetation among the different zoning types of a city can be factorized into a set of simple placement rules. A PPM implements these rules and, together with its parameterization, allows us to generate complex patterns of vegetation with high visual fidelity. Moreover, PPMs are context-sensitive and read their immediate neighborhood, which allows us to smooth out abrupt changes in placement.
To populate models of real cities with vegetation, we used a state-of-the-art style-transfer network to translate satellite images into coverage maps of vegetation. These coverage maps allow us to determine the distribution of vegetation within individual lots of a city, which in turn allows us to reconstruct vegetation similar to what can be observed in real data. Instead of precisely reconstructing vegetation at city scale, which is intractable, our goal is to generate convincing and plausible details for reconstructing existing cities or for populating entirely new virtual cities with vegetation.

We see a number of avenues for future work. First, it would be interesting to further explore physical processes in an urban context that are affected by vegetation, such as heat transfer, shading, wind, and sound barriers. Second, exploring how neural networks can help to generalize to more diverse urban data, and using them to learn parameters for scene generation, seems like a promising direction for future research. Finally, we want to explore enhanced placement strategies to capture more of the variation of vegetation placement that can be observed in real cities.
REFERENCES
[1] S. AlHalawani, Y.-L. Yang, H. Liu, and N. J. Mitra. 2013. Interactive Facades: Analysis and Synthesis of Semi-Regular Facades. CGF 32 (2013), 215–224.
[2] M. Aono and T. L. Kunii. 1984. Botanical Tree Image Generation. IEEE Comput. Graph. Appl. (1984).
[3] ACM Trans. on Graphics 32, 1, Article 8 (2013), 13 pages.
[4] B. Benes, J. A. Cordóba, and J. M. Soto. 2003. Modeling Virtual Gardens by Autonomous Procedural Agents. In Proceedings of TPCG. IEEE Computer Society, 58.
[5] B. Benes, M. A. Massih, P. Jarvis, D. G. Aliaga, and C. A. Vanegas. 2011. Urban Ecosystem Design. In I3D. 167–174.
[6] B. Benes and E. U. Millán. 2002. Virtual Climbing Plants Competing for Space. In CA '02: Proceedings of the Computer Animation. IEEE Computer Society, 33.
[7] D. V. Bloniarz and H. D. P. Ryan. 1993. Designing alternatives to avoid street tree conflicts. Journal of Arboriculture 19 (1993), 152–152.
[8] M. Bokeloh, M. Wand, and H.-P. Seidel. 2010. A connection between partial symmetry and inverse procedural modeling. In ACM Trans. on Graphics, Vol. 29. ACM, 104.
[9] G. Chen, G. Esch, P. Wonka, P. Müller, and E. Zhang. 2008. Interactive Procedural Street Modeling. In ACM SIGGRAPH 2008. 10.
[10] H. I. Choi, S. W. Choi, H. P. Moon, and N.-S. Wee. 1997. New Algorithm for Medial Axis Transform of Plane Domain. CVGIP: Graphical Model and Image Processing 59 (1997), 463–483.
[11] G. Cordonnier, E. Galin, J. Gain, B. Benes, E. Guérin, A. Peytavie, and M.-P. Cani. 2017. Authoring landscapes by combining ecosystem and terrain erosion simulation. ACM Trans. on Graphics 36, 4 (2017), 134.
[12] M. Dang, D. Ceylan, B. Neubert, and M. Pauly. 2014. SAFE: Structure-aware Facade Editing. CGF (2014).
[13] M. Dang, S. Lienhard, D. Ceylan, B. Neubert, P. Wonka, and M. Pauly. 2015. Interactive Design of Probability Density Functions for Shape Grammars. ACM Trans. on Graphics 34, 6, Article 206 (2015), 13 pages.
[14] P. de Reffye, C. Edelin, J. Françon, M. Jaeger, and C. Puech. 1988. Plant Models Faithful to Botanical Structure and Development. SIGGRAPH Comput. Graph. (1988).
[15] O. Deussen, P. Hanrahan, B. Lintermann, R. Měch, M. Pharr, and P. Prusinkiewicz. 1998. Realistic Modeling and Rendering of Plant Ecosystems. In Proc. of SIGGRAPH '98. ACM, 275–286.
[16] A. Emilien, U. Vimont, M.-P. Cani, P. Poulin, and B. Benes. 2015. WorldBrush: Interactive Example-Based Synthesis of Procedural Virtual Worlds. ACM Trans. on Graphics 34, 4, Article 106 (2015), 11 pages.
[17] T. A. Endreny. 2018. Strategically growing the urban forest will improve our world. Nature Communications 9, 1 (2018), 1160.
[18] H. Fan, H. Su, and L. Guibas. 2017. A Point Set Generation Network for 3D Object Reconstruction from a Single Image. In CVPR. 2463–2471.
[19] J. Gain, H. Long, G. Cordonnier, and M.-P. Cani. 2017. EcoBrush: Interactive Control of Visually Consistent Large-Scale Ecosystems. In CGF, Vol. 36. 63–73.
[20] G. W. Grey. 1995. The Urban Forest: Comprehensive Management. John Wiley & Sons.
[21] É. Guérin, J. Digne, É. Galin, A. Peytavie, C. Wolf, B. Benes, and B. Martinez. 2017. Interactive Example-based Terrain Authoring with Conditional Generative Adversarial Networks. ACM Trans. on Graphics 36, 6, Article 228 (2017), 13 pages.
[22] J. Guo, H. Jiang, B. Benes, O. Deussen, D. Lischinski, and H. Huang. 2020. Inverse Procedural Modeling of Branching Structures by Inferring L-Systems. ACM Trans. Graph. (2020), 1.
[23] T. Hädrich, B. Benes, O. Deussen, and S. Pirk. 2017. Interactive Modeling and Authoring of Climbing Plants. Comput. Graph. Forum 36, 2 (2017), 49–61.
[24] Y. Hu, J. Dorsey, and H. Rushmeier. 2019. A Novel Framework for Inverse Procedural Texture Modeling. ACM Trans. Graph. 38, 6, Article 186 (Nov. 2019), 14 pages.
[25] T. Ijiri, S. Owada, and T. Igarashi. 2006. Seamless Integration of Initial Sketching and Subsequent Detail Editing in Flower Modeling. CGF 25, 3 (2006), 617–624.
[26] M. Ilčík, P. Musialski, T. Auzinger, and M. Wimmer. 2015. Layer-Based Procedural Design of Façades. CGF 34, 2 (2015), 205–216.
[27] P. Isola, J.-Y. Zhu, T. Zhou, and A. A. Efros. 2016. Image-to-Image Translation with Conditional Adversarial Networks. CVPR (2016), 5967–5976.
[28] T. H. Kebrom. 2017. A Growing Stem Inhibits Bud Outgrowth – The Overlooked Theory of Apical Dominance. Frontiers in Plant Science (2017).
[29] ACM Trans. Graph. 36, 6, Article 204 (Nov. 2017), 16 pages.
[30] T. Kelly, P. Guerrero, A. Steed, P. Wonka, and N. J. Mitra. 2018. FrankenGAN: Guided Detail Synthesis for Building Mass Models Using Style-Synchronized GANs. ACM Trans. Graph. 37, 6 (2018), 1:1–1:14.
[31] A. Khan, A. Sohail, U. Zahoora, and A. Saeed. 2019. A Survey of the Recent Architectures of Deep Convolutional Neural Networks.
[32] J. Kratt, M. Spicker, A. Guayaquil, M. Fišer, S. Pirk, O. Deussen, J. C. Hart, and B. Benes. 2015. Woodification: User-Controlled Cambial Growth Modeling. CGF 34, 2 (2015), 361–372.
[33] C. Li, O. Deussen, Y.-Z. Song, P. Willis, and P. Hall. 2011. Modeling and Generating Moving Trees from Video. ACM Trans. on Graphics 30, 6, Article 127 (2011), 127:1–127:12.
[34] Y. Li, Q. Zheng, A. Sharf, D. Cohen-Or, B. Chen, and N. J. Mitra. 2011. 2D-3D fusion for layer decomposition of urban facades. In ICCV. 882–889.
[35] A. Lindenmayer. 1968. Mathematical models for cellular interaction in development, Parts I and II. Journal of Theoretical Biology 18 (1968), 280–315.
[36] B. Lintermann and O. Deussen. 1999. Interactive Modeling of Plants. IEEE Comput. Graph. Appl. 19, 1 (1999), 56–65.
[37] Y. Livny, S. Pirk, Z. Cheng, F. Yan, O. Deussen, D. Cohen-Or, and B. Chen. 2011. Texture-lobes for Tree Modelling. ACM Trans. on Graphics 30, 4, Article 53 (2011), 10 pages.
[38] S. Longay, A. Runions, F. Boudon, and P. Prusinkiewicz. 2012. TreeSketch: interactive procedural modeling of trees on a tablet. In Proc. of the Intl. Symp. on SBIM. 107–120.
[39] M. Makowski, T. Hädrich, J. Scheffczyk, D. L. Michels, S. Pirk, and W. Palubicki. 2019. Synthetic Silviculture: Multi-scale Modeling of Plant Ecosystems. ACM Trans. on Graphics 38, 4, Article 131 (2019), 14 pages.
[40] A. Martinovic and L. Van Gool. 2013. Bayesian grammar learning for inverse procedural modeling. In CVPR. 201–208.
[41] P. Merrell and D. Manocha. 2008. Continuous Model Synthesis. ACM Trans. on Graphics 27, 5, Article 158 (2008), 7 pages.
[42] P. Merrell and D. Manocha. 2011. Model Synthesis: A General Procedural Modeling Algorithm. TVCG 17, 6 (2011), 715–728.
[43] R. W. Miller, R. J. Hauer, and L. P. Werner. 2015. Urban Forestry: Planning and Managing Urban Greenspaces. Waveland Press.
[44] R. Minamino and M. Tateno. 2014. Tree branching: Leonardo da Vinci's rule versus biomechanical models. PLoS ONE 9, 4 (2014), e93535.
[45] S. Mitchell, A. Rand, M. Ebeida, and C. Bajaj. 2012. Variable Radii Poisson-Disk Sampling. CCCG (2012).
[46] P. Müller, P. Wonka, S. Haegler, A. Ulmer, and L. Van Gool. 2006. Procedural modeling of buildings. ACM Trans. on Graphics 25, 3 (July 2006), 614–623.
[47] P. Müller, G. Zeng, P. Wonka, and L. Van Gool. 2007. Image-based Procedural Modeling of Facades. ACM Trans. on Graphics 26, 3, Article 85 (2007).
[48] R. Měch and P. Prusinkiewicz. 1996. Visual models of plants interacting with their environment. In ACM SIGGRAPH 96. ACM, New York, NY, USA, 397–410.
[49] G. Nishida, I. Garcia-Dorado, D. G. Aliaga, B. Benes, and A. Bousseau. 2016. Interactive sketching of urban procedural models. ACM Trans. on Graphics 35, 4 (2016), 130.
[50] NYC Open Data. 2019. The Next Decade of Open Data. https://data.ny.gov/
[51] M. Okabe, S. Owada, and T. Igarashi. 2007. Interactive Design of Botanical Trees Using Freehand Sketches and Example-based Editing. In ACM SIGGRAPH Courses. ACM, Article 26.
[52] P. E. Oppenheimer. 1986. Real time design and animation of fractal plants and trees. Proc. of SIGGRAPH 20, 4 (1986), 55–64.
[53] W. Palubicki, K. Horel, S. Longay, A. Runions, B. Lane, R. Měch, and P. Prusinkiewicz. 2009. Self-organizing Tree Models for Image Synthesis. ACM Trans. on Graphics 28, 3, Article 58 (2009), 10 pages.
[54] Y. I. H. Parish and P. Müller. 2001. Procedural Modeling of Cities. In ACM SIGGRAPH 2001. 301–308.
[55] S. Pirk, B. Benes, T. Ijiri, Y. Li, O. Deussen, B. Chen, and R. Měch. 2016. Modeling Plant Life in Computer Graphics. In ACM SIGGRAPH 2016 Courses. Article 18, 180 pages.
[56] S. Pirk, M. Jarząbek, T. Hädrich, D. L. Michels, and W. Palubicki. 2017. Interactive Wood Combustion for Botanical Tree Models. ACM Trans. on Graphics 36, 6, Article 197 (2017), 12 pages.
[57] S. Pirk, T. Niese, O. Deussen, and B. Neubert. 2012. Capturing and animating the morphogenesis of polygonal tree models. ACM Trans. on Graphics 31, 6, Article 169 (2012), 10 pages.
[58] S. Pirk, T. Niese, T. Hädrich, B. Benes, and O. Deussen. 2014. Windy Trees: Computing Stress Response for Developmental Tree Models. ACM Trans. on Graphics 33, 6, Article 204 (2014), 11 pages.
[59] S. Pirk, O. Stava, J. Kratt, M. A. M. Said, B. Neubert, R. Měch, B. Benes, and O. Deussen. 2012. Plastic trees: interactive self-adapting botanical tree models. ACM Trans. on Graphics 31, 4, Article 50 (2012), 10 pages.
[60] P. Prusinkiewicz. 1986. Graphical Applications of L-systems. In Proceedings on Graphics Interface '86/Vision Interface '86. Canadian Information Processing Society, 247–253.
[61] D. Ritchie, B. Mildenhall, N. D. Goodman, and P. Hanrahan. 2015. Controlling procedural modeling programs with stochastically-ordered sequential Monte Carlo. ACM Trans. on Graphics 34, 4 (2015), 105.
[62] D. Ritchie, K. Wang, and Y.-A. Lin. 2018. Fast and Flexible Indoor Scene Synthesis via Deep Convolutional Generative Models. In CVPR.
[63] A. Runions, B. Lane, and P. Prusinkiewicz. 2007. Modeling Trees with a Space Colonization Algorithm. In Conference on Natural Phenomena (NPH '07). 63–70.
[64] M. Schwarz and P. Müller. 2015. Advanced Procedural Modeling of Architecture. ACM Trans. on Graphics 34, 4 (2015), 107:1–107:12.
[65] R. M. Smelik, T. Tutenel, R. Bidarra, and B. Benes. 2014. A Survey on Procedural Modelling for Virtual Worlds.
CGF
33, 6 (2014), 31–50.[66] O. Stava, B. Benes, R. Měch, D. G. Aliaga, and P. Kristof. 2010. Inverse ProceduralModeling by Automatic Generation of L-systems.
CGF
29, 2 (2010), 665–674.[67] O. Stava, S. Pirk, J. Kratt, B. Chen, R. Mech, O. Deussen, and B. Benes. 2014. InverseProcedural Modelling of Trees.
CGF
33, 6 (2014), 118–131.[68] J. O. Talton, Y. Lou, S. Lesser, J. Duke, R. Měch, and V. Koltun. 2011. Metropolisprocedural modeling.
ACM Trans. on Graphics
30, 2 (2011), 11.[69] P. Tan, T. Fang, J. Xiao, P. Zhao, and L. Quan. 2008. Single Image Tree Modeling.
ACM Trans. on Graphics
27, 5, Article 108 (2008), 7 pages.[70] P. Tan, G. Zeng, J. Wang, S. B. Kang, and L. Quan. 2007. Image-based Tree Modeling.
ACM Trans. on Graphics
26, 3, Article 87 (2007).[71] C. A. Vanegas, D. G. Aliaga, and B. Benes. 2010. Building reconstruction usingmanhattan-world grammars.
CVPR
[72] C. A. Vanegas, D. G. Aliaga, B. Benes, and P. Waddell. 2009. Interactive Design of Urban Spaces Using Geometrical and Behavioral Modeling. In ACM SIGGRAPH Asia. Article 111, 10 pages.
[73] C. A. Vanegas, D. G. Aliaga, P. Wonka, P. Müller, P. Waddell, and B. Watson. 2010. Modelling the Appearance and Behaviour of Urban Spaces.
CGF 29, 1 (2010), 25–42.
[74] C. A. Vanegas, I. Garcia-Dorado, D. G. Aliaga, B. Benes, and P. Waddell. 2012. Inverse Design of Urban Procedural Models. ACM Trans. on Graphics 31, 6, Article 168 (2012), 11 pages.
[75] P. Waddell. 2002. UrbanSim: Modeling Urban Development for Land Use, Transportation, and Environmental Planning. Journal of the American Planning Association 68, 3 (2002), 297–314.
[76] P. Waddell, G. F. Ulfarsson, J. P. Franklin, and J. Lobb. 2007. Incorporating Land Use in Metropolitan Transportation Planning. Transportation Research Part A: Policy and Practice 41, 5 (2007), 382–410.
[77] B. Wang, Y. Zhao, and J. Barbič. 2017. Botanical Materials Based on Biomechanics. ACM Trans. on Graphics 36, 4, Article 135 (2017), 13 pages.
[78] K. Wang, Y.-A. Lin, B. Weissmann, M. Savva, A. X. Chang, and D. Ritchie. 2019. PlanIT: Planning and Instantiating Indoor Scenes with Relation Graph and Spatial Prior Networks. ACM Trans. on Graphics 38, 4, Article 132 (2019), 15 pages.
[79] B. Watson, P. Müller, O. Veryovka, A. Fuller, P. Wonka, and C. Sexton. 2008. Procedural Urban Modeling in Practice. IEEE Computer Graphics and Applications 28, 3 (2008), 18–26.
[80] B. Weber, P. Müller, P. Wonka, and M. Gross. 2009. Interactive Geometric Simulation of 4D Cities. CGF 28, 2 (2009), 481–492.
[81] J. Wither, F. Boudon, M.-P. Cani, and C. Godin. 2009. Structure from Silhouettes: A New Paradigm for Fast Sketch-based Design of Trees. CGF 28, 2 (2009), 541–550.
[82] P. Wonka, M. Wimmer, F. Sillion, and W. Ribarsky. 2003. Instant Architecture. ACM Trans. on Graphics 22, 3 (2003), 669–677.
[83] F. Wu, D.-M. Yan, W. Dong, X. Zhang, and P. Wonka. 2014. Inverse Procedural Modeling of Facade Layouts. ACM Trans. on Graphics 33, 4, Article 121 (2014), 10 pages.
[84] X. Wu, K. Xu, and P. Hall. 2017. A Survey of Image Synthesis and Editing with Generative Adversarial Networks. Tsinghua Science and Technology 22, 6 (2017), 660–674.
[85] K. Xie, F. Yan, A. Sharf, O. Deussen, H. Huang, and B. Chen. 2016. Tree Modeling with Real Tree-Parts Examples. TVCG 22, 12 (2016), 2608–2618.
[86] Y.-T. Yeh, K. Breeden, L. Yang, M. Fisher, and P. Hanrahan. 2013. Synthesis of Tiled Patterns Using Factor Graphs. ACM Trans. on Graphics 32, 1, Article 3 (2013), 13 pages.
[87] H. Zeng, J. Wu, and Y. Furukawa. 2018. Neural Procedural Reconstruction for Residential Buildings. In Computer Vision – ECCV 2018, V. Ferrari, M. Hebert, C. Sminchisescu, and Y. Weiss (Eds.). 759–775.
A PARAMETERS
Tab. 3 lists the parameter values we used to generate the figures in the paper.
Table 3. Parameter values for figures in the paper.
Fig. | Strategy | µ | σ | τ | β | κ | π | ω | ψ | η | α | ρ | θ | λ
B SATELLITE AND COVERAGE MAP DATA