How to manipulate landscapes to improve the potential for range expansion

Climate change is a global threat to species, and their capacity to adapt could be limited by habitat fragmentation. Many initiatives to restore habitats, increase connectivity and/or ensure ‘functioning ecological networks’ are explicitly or implicitly trying to address this threat. However, existing methods of analysing networks mainly treat the landscape as static, and it is difficult to use these to plan restoration. We use a recent method to approximate the speed of a species’ range expansion through a landscape by an analogy to an electrical circuit, which takes into account both the rates of colonisation between patches and the rates at which occupied habitat produces new emigrants. Based on this, we propose and test two methods that can help to optimise the spatial arrangement of habitat for range expansion. First, high current flowing through a habitat patch indicates that it should be a priority for conservation, and this can be the basis of an algorithm for iteratively dropping the least valuable patches. Secondly, high power in a link between two patches indicates that it is a bottleneck in the circuit, and this can be the basis of an algorithm for iteratively adding new patches in the most efficient places. We show that these methods perform well for a variety of realistic landscape patterns, assuming known and fixed dispersal ability and source/target locations. The calculations involved for each parameter set are fast enough to be used as building blocks in a larger optimisation for practical planning of landscapes for multiple species. Thus, we lay the foundation for a new genre of systematic conservation planning, which efficiently proposes restoration as well as minimising loss.


Introduction
Climate change is a growing threat to species world-wide, and it interacts with the older and more obvious threats of habitat loss and fragmentation (IPCC 2007; Brook, Sodhi & Bradshaw 2008). Species may be able to adapt to climate change by shifting their geographic ranges, but to do so they need sufficient habitat in their existing range, their future range and any intermediate areas to enable population survival and colonisation (Hill, Thomas & Huntley 1999; Hannah et al. 2007; Hodgson et al. 2009). This means that the arrangement of habitat as functionally connected networks is more important than ever for conservation, and connections potentially have to be made over very long distances (Huntley et al. 2008; Phillips et al. 2008). Landscapes in developed countries are probably not functionally connected from the point of view of many species, and many conservation organisations are trying to address this by habitat restoration. Computational decision support tools can potentially be used to find the best places for restoration: those that are likely to lead to a large increase in conservation success with a small area of land. The tools that are currently available fall into two broad classes, ecological network mapping tools and reserve selection tools, but both have drawbacks, particularly when it comes to accounting for climate change.
There are several tools which can illustrate the functioning of existing habitat networks (most often used to illustrate the strength of connections, but also the existence of clusters and various metrics of 'betweenness'; e.g. Pascual-Hortal & Saura 2006; Urban et al. 2009; Carroll, McRae & Brookes 2012). These illustrations, when interpreted intelligently, can give valuable clues about conservation priorities or restoration possibilities, but there is no certainty that the current state of the network is a reliable indicator of the cost or benefit of making changes to the network in particular places. Furthermore, the ability of a network to support a metapopulation or a diffuse population at equilibrium is different from the ability of a network to support wholesale redistribution of genes and species under climate change (Hodgson et al. 2012). Network methods implicitly characterise the landscape from the perspective of a single species, although people often use a small set of parameters to act as a proxy for a larger set of species that depend on a particular habitat, or on all 'natural' habitats (Krosby et al. 2015).
Separately, there are several general approaches to spatial conservation prioritisation, which aim to find an optimal set of nature reserves given certain constraints (Moilanen, Wilson & Possingham 2009). These methods are highly sophisticated in their ability to consider multiple species, habitats and other valued features, as well as multiple sources of cost. However, it proves challenging to incorporate connectivity between sites into such methods in anything but a simplistic way. If one can disregard climate change and assume that the land currently covered with high-quality habitat is most likely to retain high-quality habitat in future, then simple connectivity metrics are probably adequate, but this assumption is obviously risky (Early & Sax 2011). Several studies have prioritised areas that are expected to resist climate change (i.e. refuges) or areas of low vulnerability, to develop conservation plans that do not rely on explicit range shift projections (Graham et al. 2010; Shoo et al. 2011; Crossman, Bryan & Summers 2012). Others have taken two or three time slices and attempted to minimise (effective) distance between protected areas that are predicted to be useful at different times (Carroll, Dunk & Moilanen 2010; Faleiro, Machado & Loyola 2013; Kujala et al. 2013; Makino et al. 2014). One method has been proposed that is capable of selecting reserves that form functionally connected strands through time and space for multiple species (Phillips et al. 2008), but it is highly reliant on assumptions that cut the problem down to a computationally manageable size: nearest neighbour dispersal, fixed generation time, a hard climatic suitability threshold, only one climate projection and limited potentially protectable cells.
More generally, it is difficult to adapt spatial conservation prioritisation methods to scenarios where the pattern of the landscape could be fundamentally changed by habitat restoration. In landscapes that are much impacted by humans, where a small amount of (semi-)natural habitat remains, deciding where to restore is a larger optimisation problem than deciding what to conserve (simply because there are more possible landscape configurations). Approaches to this problem so far have either started by defining a limited number of sites that could be restored or they have assumed that the benefits of restoring any site do not depend on whether other sites are restored (Crossman & Bryan 2006; Thomson et al. 2009; Wilson et al. 2011; Tambosi et al. 2014; Crouzeilles et al. 2015; Polyakov et al. 2015). These studies all have some consideration of connectivity, or at least the distance to existing habitat, but most do not consider the spatial relationships among restored patches (but see Clauzel, Bannwarth & Foltete 2015). Connectivity has long been recognised as an important factor in restoration decisions (Hobbs & Norton 1996), not least because only part of the community will be restored by planting and introduction; many other species are expected to arrive by natural colonisation.
If it is possible to find a property of an existing network, which indicates where the addition or removal of a patch will have the biggest effect, then this can provide an invaluable 'shortcut' in reserve selection algorithms. It reduces or removes the need for searching procedures that propose and test multiple changes to the network, as are used in most general-purpose optimisation tools such as simulated annealing and integer programming. Even though methods that scan through the existing network can be fast on modern computers, a method that pinpoints the best place to act is almost certain to be faster. The software ZONATION already makes use of such a shortcut: it assumes that sites with high local connectivity will have the most detrimental effect on other sites if they are removed (Moilanen et al. 2005). This is a reasonable assumption if the landscape change is gradual and there is no need for very long-distance connectivity as imposed by climate change.
In this study, we test metrics that are hypothesised to predict the marginal change in the ability of an entire network to facilitate range expansion. We use the fact that range expansion success can be predicted by an analogy between a metapopulation system and an electrical circuit, which takes into account both the rates of colonisation between patches and the rates at which occupied habitat produces new emigrants (Hodgson et al. 2012). We show how our metrics, which are simple arithmetic manipulations of the circuit solution, can be used in iterative algorithms to add or remove habitat from networks in a very efficient way, to assist decision-making.

THE CIRCUIT ANALOGY TO RANGE EXPANSION
We imagine a scenario where a species is able to extend its geographic range into a landscape that has recently become suitable. We characterise the habitat as a set of patches or cells, each capable of supporting a viable population. We know the set of patches that are initially occupied (the source) and we wish to determine the time until at least one individual will colonise some distant patch(es) (the target). We set parameters for the rate of emigration from each patch, if and when it becomes occupied, and the probability of dispersers colonising every patch from every other one. Hodgson et al. (2012) showed that the time for the species to reach the target is well approximated by the overall resistance of a circuit with patches as nodes and colonisation times as links. The approximation is weakened if there is a substantial chance of extinction of patches during the course of the expansion, but we argue that it is still a very useful metric in the context of climate change.
Conductance is the inverse of resistance and is calculated as

G = c_source,target + Σ_i Σ_j c^target_i [M^−1]_ij c^source_j, where M_ij = δ_ij Σ_l c_il − c_ij,

where c is the single-step conductance (rate of colonisation) between any two locations, i and j index the N intermediate patches, l indexes the N patches together with the source and target, and δ_ij is 0 if i ≠ j and 1 if i = j. c^target and c^source are length-N vectors of the conductance values between each intermediate patch and the target or source, respectively; if there are several locations in the source or target, the contributions from each are simply added together to produce each element of c.
In this study, we assume an uninformed dispersal process, where every disperser has an equal chance of colonising a new patch if they land in it, and their probability of landing in it is given by a negative exponential dispersal kernel. We refer to the mean displacement of our negative exponential over all 2-D space as the dispersal distance. Throughout this study, we used a dispersal distance of 1/32 of the landscape width.
All analyses were carried out in R 2.15.2 or later, with matrix inversion using the 'solve' function (which calls LAPACK) with tolerance exp(−256).
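For readers who want to experiment outside R, the conductance calculation can be sketched in a few lines of NumPy. This is an illustrative re-implementation, not the published code: the helper names and the kernel decay rate alpha are our own assumptions, whereas the paper fixes the mean dispersal distance at 1/32 of the landscape width.

```python
import numpy as np

def kernel_conductances(xy, alpha=1.0):
    """Pairwise colonisation rates c_ij from a negative-exponential
    dispersal kernel. alpha is an illustrative decay rate."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    c = np.exp(-alpha * d)
    np.fill_diagonal(c, 0.0)              # no self-colonisation
    return c

def overall_conductance(c, source, target):
    """Source-to-target conductance of the circuit:
    G = c_st + c_target^T M^-1 c_source, with
    M_ij = delta_ij * sum_l c_il - c_ij over intermediate patches
    (l runs over all patches, including the source and target)."""
    inter = [k for k in range(len(c)) if k not in (source, target)]
    M = np.diag(c[inter, :].sum(axis=1)) - c[np.ix_(inter, inter)]
    v = np.linalg.solve(M, c[inter, source])   # cf. R's 'solve'
    return c[source, target] + c[inter, target] @ v
```

For a source, one stepping stone and a target in a straight line, this reduces to the direct link in parallel with two equal links in series, i.e. G = c_st + c/2, which makes a convenient hand check.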

RELATIONSHIP BETWEEN HABITAT AREA AND CONDUCTANCE
There is bound to be a strong relationship between the conductance using our circuit analogy and the total area of habitat in the landscape between the source and target, because adding habitat adds more alternative routes for the species to colonise, without weakening any of the pre-existing ones. We investigated the shape of this relationship when habitat is distributed completely at random, or in an idealised, straight 'corridor', using a square landscape of 128 by 128 cells. The source and target each consisted of two rows of 128 habitat cells immediately beyond the south and north borders of the landscape. The amount of intervening habitat was varied between 0.8% (128 cells) and 100%. These results help to define an expectation against which to judge the performance of the methods of network manipulation discussed below.

FRACTAL-BASED LANDSCAPES TO TEST METHODS
We generated 30 spatially autocorrelated, fragmented landscapes of habitat on a 256 × 256 cell grid using a fractal algorithm (Chipperfield, Dytham & Hovestadt 2011), henceforth termed experimental landscapes. Maps and details of the steps involved are given in Appendix S1 and Figure S3.1 (Supporting information), but two features are particularly relevant to understanding the results. First, we defined habitat by picking the highest ranked cells out of a continuous surface; thus, we can produce different nested versions of each landscape's pattern with any number of suitable cells. For this study, we used 1024 and 2048-cell versions of each of the 30 landscapes (1.5% and 3% habitat, respectively). Secondly, the landscapes had a wide range of starting conductances (Fig. 1). Some had almost continuous corridors of habitat between the source and target (northern and southern borders) and others had habitat-free gaps of up to 100 cells. The source and target for all analyses using these landscapes consisted of four rows of 256 habitat cells immediately beyond the south and north borders of the landscape.

VALUE OF EXISTING CELLS RELATED TO THE CURRENT FLOW THROUGH THEM
It follows from the way we have defined our circuit that each node in the circuit, that is each habitat cell in the landscape, is maintained at a particular potential (v_i for the i'th node). The current flowing between any two nodes i and j is given by c_ij(v_j − v_i). The amount of current flowing into cell i from the source or other cells with higher potential is strictly equal to the amount flowing out of cell i to the target or other cells with lower potential. Taking the sum of the positive flows gives us a measure of the role that cell i plays in the circuit. In a different circuit theory analogy, modelling random walks, current flow is related to the probability that the walker passes through the node in question (McRae et al. 2008). McRae et al. (2008) argued (but did not prove) that this flow metric should be correlated to the drop in conductance of the entire circuit that would occur if cell i was lost. We tested this hypothesis by dropping cells one by one from our 30 experimental 2048-cell landscapes (described above). For each intact 2048-cell landscape, we ranked cells by their flow and recorded conductance after dropping the highest ranked, the lowest ranked or every 16th cell in between.
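Once the circuit is solved, the potentials and per-cell flows follow directly. A minimal sketch (helper names are our own), fixing the source at unit potential and the target at zero:

```python
import numpy as np

def node_potentials(c, source, target):
    """Potentials v_i with v_source = 1 and v_target = 0.
    Kirchhoff's current law at each intermediate node gives
    M v = c_source, where M_ij = delta_ij * sum_l c_il - c_ij."""
    inter = [k for k in range(len(c)) if k not in (source, target)]
    M = np.diag(c[inter, :].sum(axis=1)) - c[np.ix_(inter, inter)]
    v = np.zeros(len(c))
    v[source] = 1.0
    v[inter] = np.linalg.solve(M, c[inter, source])
    return v

def cell_flow(c, v):
    """Current through each node: the sum of its positive inflows
    c_ij (v_j - v_i); by conservation this equals its outflow."""
    inflow = c * (v[None, :] - v[:, None])   # [i, j]: current j -> i
    return np.maximum(inflow, 0.0).sum(axis=1)
```

With a unit potential difference, the flow arriving at the target equals the overall conductance G, which provides a built-in consistency check.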

ITERATIVE DROPPING
If cells with low current flow through them contribute little to the overall conductance, then this fact can potentially be used to find efficient, or even optimal, arrangements of habitat, out of a starting set of patches. We tested the efficiency of a simple procedure of iteratively dropping the cell with the lowest flow. We started with our 30 experimental 2048-cell landscapes and continued dropping until 1024 cells remained (50%). We compared this procedure to a contrasting procedure which drops the cell with the lowest local connectivity at each step, where local connectivity of each cell i is defined as Σ_j c_ij. This is analogous to the dropping procedure used in ZONATION (Moilanen et al. 2005). We also compared the end product landscapes to several replicate landscapes where a random 50% of cells had been dropped.
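The dropping routine then needs only a loop around the circuit solution. A self-contained sketch (our own variable names; the paper applied this to 2048-cell fractal landscapes, here it runs at toy scale):

```python
import numpy as np

def drop_lowest_flow(c, source, target, n_drop):
    """Greedy thinning: repeatedly re-solve the circuit and delete
    the intermediate cell carrying the least current flow."""
    keep = list(range(len(c)))
    for _ in range(n_drop):
        s, t = keep.index(source), keep.index(target)
        sub = c[np.ix_(keep, keep)]
        inter = [k for k in range(len(keep)) if k not in (s, t)]
        M = np.diag(sub[inter, :].sum(axis=1)) - sub[np.ix_(inter, inter)]
        v = np.zeros(len(keep))
        v[s] = 1.0
        v[inter] = np.linalg.solve(M, sub[inter, s])
        flow = np.maximum(sub * (v[None, :] - v[:, None]), 0.0).sum(axis=1)
        flow[[s, t]] = np.inf      # the source and target are never dropped
        keep.pop(int(np.argmin(flow)))
    return keep                    # indices of the surviving cells
```

Re-solving at every step is what lets the remaining flows redistribute, which is why the procedure outperforms a one-off ranking.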

HABITAT BOTTLENECKS RELATED TO HIGH-POWER LINKS IN THE CIRCUIT
If one is planning to restore or create habitat in a fragmented landscape, the number of locations that could be chosen from can be staggeringly large. To find the most efficient places to add a small amount of habitat, one could fill up the whole landscape and then apply iterative dropping (as above) until only a few patches remain, but this would be very time-consuming. Also, when a landscape is very well covered by habitat, the current flow through all cells tends to be very similar (data not shown) and so the dropping procedure may not adequately distinguish between them.
The best place to add a small amount of habitat to a fragmented landscape to improve range expansion should be where there is currently a 'bottleneck' in habitat availability. We intuited that the most important bottleneck is likely to be along a route where there is already relatively high current flow, but in the section of that route where there is highest resistance, and that these locations could be identified as the links in the circuit with high power output. Power is defined as current multiplied by potential difference, and thus in our circuits, the power of the link between nodes i and j is given by P_ij = c_ij(v_j − v_i)^2. In Appendix S2, we show that the power of a circuit link is the dominant term in a formula for the improvement in overall conductance when that link is strengthened (by halving its resistance).

[Fig. 1 caption fragment: For reference, we also show the line of equality (grey); the line y = sqrt(x) (black; the expectation if cells are distributed randomly, see Fig. 2, though our points tend to fall below this line because the landscapes are spatially autocorrelated); the conductance obtained when the same number of cells are distributed in an even lattice (blue); and the conductance obtained when the same number of cells are distributed in a single 'corridor' (red). This illustrates that we are testing our methods on a very wide range of landscapes (maps of all landscapes are also given in Fig. S3.1).]
We tested whether link power indicated efficient places to add habitat by recording the change in overall conductance that resulted from adding a single cell to our 30 experimental 1024-cell landscapes either: • midway between cells i and j where P_ij was among the top 128 links in the circuit (testing all 128 allowed us to investigate some extra correlations, but only merged results are shown in this study); • midway between randomly chosen cells i and j where P_ij was not among the top 128 links in the circuit; • in one of 64 evenly spaced points across the landscape (regardless of the distribution of habitat).
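Link powers fall straight out of the solved circuit. A hedged sketch (helper names are ours): note that with a unit potential difference, the powers of all links sum to the overall conductance G, since the total dissipation is voltage times current.

```python
import numpy as np

def solve_potentials(c, source, target):
    """v_source = 1, v_target = 0; Kirchhoff's law at intermediate nodes."""
    inter = [k for k in range(len(c)) if k not in (source, target)]
    M = np.diag(c[inter, :].sum(axis=1)) - c[np.ix_(inter, inter)]
    v = np.zeros(len(c))
    v[source] = 1.0
    v[inter] = np.linalg.solve(M, c[inter, source])
    return v

def link_power(c, v):
    """P_ij = c_ij (v_j - v_i)^2, the power output of link ij."""
    return c * (v[:, None] - v[None, :]) ** 2

def strongest_bottleneck(c, v):
    """The pair (i, j), i < j, joined by the highest-power link."""
    P = np.triu(link_power(c, v), k=1)   # count each undirected link once
    return np.unravel_index(int(np.argmax(P)), P.shape)
```

The total-power identity makes a useful sanity check when re-implementing the method.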

ITERATIVE ADDING
We tested the efficiency of a simple procedure of iteratively finding the maximum P ij in the circuit and then adding one cell midway between cells i and j. We started with our 30 experimental 1024-cell landscapes and added 24 cells by this procedure, recording the overall conductance at each step. We compared our iterative adding strategy to a simple strategy requiring no circuit calculations: adding 16 new cells equally spaced in the (north-south) column in each landscape that initially contained most habitat cells (this guarantees that there is a path across the landscape with gaps of <16 cells).
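The adding routine can be sketched as a loop that re-solves the circuit, finds the highest-power link, and places a new cell at its mid-point. This is an illustrative re-implementation (alpha is an assumed kernel decay rate; the paper's kernel is calibrated to the landscape width):

```python
import numpy as np

def add_at_bottlenecks(xy, source, target, n_add, alpha=1.0):
    """Grow the landscape: re-solve the circuit, find the
    highest-power link, and add a cell at its mid-point."""
    xy = np.asarray(xy, float)
    for _ in range(n_add):
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        c = np.exp(-alpha * d)
        np.fill_diagonal(c, 0.0)
        inter = [k for k in range(len(c)) if k not in (source, target)]
        M = np.diag(c[inter, :].sum(axis=1)) - c[np.ix_(inter, inter)]
        v = np.zeros(len(c))
        v[source] = 1.0
        v[inter] = np.linalg.solve(M, c[inter, source])
        P = np.triu(c * (v[:, None] - v[None, :]) ** 2, k=1)
        i, j = np.unravel_index(int(np.argmax(P)), P.shape)
        xy = np.vstack([xy, (xy[i] + xy[j]) / 2.0])   # new stepping stone
    return xy
```

Because every added cell only contributes new links, each step can only increase the overall conductance.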

RELATIONSHIP BETWEEN HABITAT AREA AND CONDUCTANCE
Conductance increases as more habitat patches are added to a landscape. For an idealised cellular landscape, if habitat cells are added at random, conductance tends to be proportional to the number of cells squared (Fig. 2 open symbols). If habitat cells are added to form a continuous corridor of increasing thickness, then conductance increases linearly with the number of cells (Fig. 2 solid symbols). The linear and quadratic relationships hold almost perfectly when the overall proportion of habitat is above about 10%. At lower proportions of habitat, the conductance tends to fall below what would be expected from extrapolation of these relationships (Fig. 2b).

VALUE OF EXISTING CELLS RELATED TO THE CURRENT FLOW THROUGH THEM
The current flowing through a cell is generally a good indicator of the difference in conductance that would result if that cell were dropped from the landscape (Fig. 3). We can be very confident that cells with among the lowest current flows are those we can best afford to drop, because over all 30 experimental landscapes, dropping a single cell of low rank never has a large effect (Fig. 3 LHS). For most landscapes (>75%), but not all, the most detrimental cell to lose is the cell with highest flow (Fig. 3 RHS). When looking at absolute values rather than ranks, an important rule seems to hold that the reduction in conductance of the entire landscape after dropping is never greater than the current flowing through the dropped cell (although it can often be less, Fig. S3.2). Therefore, it seems that the relationship between cell flow and cell marginal value is strong enough to be useful for prioritisation.
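The brute-force comparison behind this result is easy to reproduce at toy scale (a sketch with our own helper names; the paper used 30 fractal 2048-cell landscapes):

```python
import numpy as np

def conductance(c, source, target):
    """G = c_st + c_target^T M^-1 c_source for the circuit analogy."""
    inter = [k for k in range(len(c)) if k not in (source, target)]
    M = np.diag(c[inter, :].sum(axis=1)) - c[np.ix_(inter, inter)]
    return c[source, target] + c[inter, target] @ np.linalg.solve(M, c[inter, source])

def marginal_values(c, source, target):
    """Loss of overall conductance when each intermediate cell is
    dropped in turn: the quantity that cell flow approximates."""
    g0 = conductance(c, source, target)
    loss = {}
    for i in range(len(c)):
        if i in (source, target):
            continue
        keep = [k for k in range(len(c)) if k != i]
        sub = c[np.ix_(keep, keep)]
        loss[i] = g0 - conductance(sub, keep.index(source), keep.index(target))
    return loss
```

Ranking cells by these exact marginal values is what the flow metric is tested against; every drop removes only positive-conductance links, so every marginal value is positive.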

ITERATIVE DROPPING
By following a simple routine of iteratively dropping the cell carrying the least current flow, overall conductance can often be preserved despite severe habitat loss (Fig. 4). This routine is always much better than the random expectation (compare black lines with blue in Fig. 4). By contrast, if the cell with the lowest local connectivity is dropped each time, then conductance declines in a way that is consistently much worse than random, at least for our fractal-based landscapes (red lines in Fig. 4). Dropping based on current flow is very efficient irrespective of the initial conductance (which varied because of the patterns of our fractal landscapes, see Fig. 1). At the point where 25% of cells had been dropped from our 2048-cell landscapes, the proportion of conductance lost was never more than 6% (Fig. 4). The spatial pattern of dropped and preserved cells is intuitively reasonable for conserving connections between the source and target (south-north in our examples; Fig. S3.3g-i).

HABITAT BOTTLENECKS RELATED TO HIGH-POWER LINKS IN THE CIRCUIT
The power output of links in a circuit is generally a good indicator of the regions where habitat addition could make most difference to the overall conductance (Fig. 5, Fig. S3.3j-l). For some of our 1024-cell landscapes with relatively low starting conductance, adding a single cell in a strategic location could increase conductance more than a thousandfold. Adding a cell to the mid-point of a link with among the highest powers in the circuit (i.e. in the top 128 links and cumulatively contributing over 90% of the total power) was very likely to improve the overall conductance (Fig. 5 black). By contrast, adding a cell to the mid-point of a randomly chosen link in the circuit (Fig. 5 red) or to one of 64 evenly spaced points across the landscape (Fig. 5 blue) very rarely increased conductance substantially. Figure 5a focuses on absolute improvements in landscape conductance; for example, 43% of the locations chosen by power increased conductance by more than twofold, but only 8% of random and 7% of even locations exceeded the same threshold. Figure 5b shows the same data in terms of the maximum observed improvement for each landscape; for example, 93% of the locations chosen by power achieved more than 5% of the maximum observed improvement for the landscape, whereas only 12% of random and 8% of even points exceeded the same threshold.

ITERATIVE ADDING BASED ON POWER
By iteratively adding a cell to the mid-point of the link with highest power, conductance could be increased far more quickly than the random expectation. Assuming a quadratic relationship, the random expectation would be that adding 24 cells to a 1024-cell landscape would increase conductance by 4.7%. The minimum improvement that we observed was 24% and the median 3,800%. Landscapes with lower initial conductances generally benefitted more from the iterative adding of a few cells (Fig. 6a), such that the conductances of all landscapes converged. After adding 16 cells, the power-based strategy always outperformed the simple strategy of adding 16 new cells in a single column. For four out of 30 landscapes, adding just one cell with the power-based strategy outperformed the 16-cell simple strategy; on average, it achieved greater conductance than the simple strategy after the addition of only 6 cells (Fig. 6b). The power-based strategy tended to create chains of stepping stones that efficiently crossed gaps in the initial pattern of habitat (Fig. S3.3m-o).

Discussion
In a circuit analogy to a range expansion process, we have demonstrated that patches with highest current flow are generally those that it would be most detrimental to lose, and links with highest power are generally across gaps where it would be beneficial to add habitat. Previous studies using a different circuit analogy, modelling the likely tracks of individual dispersers, have taken current flow as a metric of importance for connectivity (McRae et al. 2008; Carroll, McRae & Brookes 2012); however, they have not tested this assumption. To our knowledge, we are the first to propose the metric of link power as an indicator of places where additional nodes should be added to a circuit. There are three important methodological advantages to the metrics of current flow and power we have explored here. First, we have demonstrated that they can prioritise much better than random, and also better than a plausible alternative strategy, for landscapes with realistic spatial patterns. Secondly, they have a transparent relationship with the overall landscape conductance, which we know is an indicator of how quickly a species will be able to shift between the source and the target. This contrasts with another novel development in spatial planning software: the ability to prioritise 'corridors' in ZONATION, where corridors are defined by structural rules which may or may not equate to increased population sustainability (Pouzols & Moilanen 2014). Thirdly, their calculation takes a negligible amount of time on top of the time taken to solve the circuit initially. This means that it can potentially be computationally much more efficient than any method that relies on an exhaustive search of either patches that could be dropped or barriers that could be removed, for example the barrier detection method in Circuitscape (McRae et al. 2012).
Although our metrics can help to find efficient landscape arrangements, there is still room for improvement. Ranking patches by current flow is not identical to ranking them by the marginal change that would occur if they were dropped. However, when discrepancies arise, it is often in cases where the marginal changes are so small that it probably would not matter which patch was chosen for deletion. The more patches are removed from the landscape through iterative dropping, the more flow will be concentrated into the remaining patches, and this should help to distinguish the few that should really be accorded the highest priority.
Intriguingly, Cowley, Johnson & Pocock (2015) recently proposed that the marginal change in conductance when a patch is dropped can be predicted by summing the power of all the links connected to that patch, based on proving upper and lower bounds (Cowley 2015). Cowley et al. also state that the sum of power is a better metric than the current flow, although they do not quantify this. Depending on how much better this metric proves to be, it could form the basis of a better dropping algorithm. However, we speculate that the lowest ranked patches picked by current flow or by summed power are likely to be similar. The differences may become more important if using the metric to allocate extra conservation effort to the highest ranked patches, or if aiming to delete the highest ranked patches to stop invasive species spread (Cowley, Johnson & Pocock 2015).
Any dropping algorithm attempting to find optimal habitat arrangements will be limited by the fact that the user must specify which habitat is available for dropping and that it will become slower, the more habitat patches are available. Therefore, we think that the finding of bottlenecks and developing an iterative adding routine based on power are our most important advances. Note that finding the highest power link in the circuit is different from finding the highest power node [by summing all the links connected to each node as in Cowley, Johnson & Pocock (2015)], and they may indicate different things.
Even though our power metric can radically improve landscapes, the ranking of links is not identical to ranking of marginal changes when patches are added, so it is possible that an even better indicator could be found. Our iterative strategy always took the single link with highest power and added a new patch at the mid-point of this link. Although this is very simple to automate and we show that it performs well, a better strategy might place new habitat according to the ensemble of links that give, say, 90% of the power in the circuit. A possible alternative strategy for adding habitat is to look for large drops in voltage. This has been proposed as a metric for random walk-based models and apparently originated in the electronics design literature. However, we suspect that it would not work very well in our particular circuit analogy, because every cell is linked to every other, so there are always many links that connect cells with widely differing voltage. Of these, some will be the high-power links (recall that power is voltage difference times current), but many will be links with such high resistance that they contribute virtually nothing to the overall conductance.

POTENTIAL APPLICATIONS
Throughout this study, we have used the same source and target locations and the same dispersal kernel, because our focus was on the extent to which algorithms could optimise the spatial arrangement of habitat with respect to a single outcome: overall conductance. There are broadly three ways in which practitioners could parameterise such a model to help solve a practical landscape planning problem: parameters (including the definition of habitat and the source and target locations) could be chosen to represent a single species of particular interest; they could be a rough approximation of the needs of many species using the same habitat (Krosby et al. 2015); or the model could be used as a building block in a larger systematic conservation planning exercise for multiple species. The suitability and feasibility of these alternative approaches will depend on data availability, computational resources and the extent to which the model's assumptions are stretched, much as is the case for any of the other network and connectivity metrics currently in use.
In the most ideal case, if we assume that conductance measured in one direction across a network is the only thing that needs to be maximised, and parameters are well known, then power and current flow as described here could already be useful for decision support. For instance, if planners want to choose between a small number of defined landscape scenarios, they can simply score them using the overall conductance. If a limited portion of the landscape is feasible for habitat restoration, and planners want to find a cost-efficient subset of this area, then backwards iteration based on current flow can automatically rank the potential habitat cells in a similar manner to ZONATION (initially imagining that all feasible areas are restored). If planners want suggestions of the ideal place to restore with no constraints, or if the amount of restoration that can be afforded is orders of magnitude less than the amount theoretically feasible for restoration, then we suggest using forwards iteration based on power. JAH and DWW have developed a software package that will make it as easy as possible for conservation practitioners to perform such analyses: Condatis (Wallis & Hodgson 2015). Condatis version 0.6.0 included simple scenario comparison, iterative dropping, visualisation of the highest power links and manual adding of habitat by the user. However, automated iterative adding, the last method described here, is not yet implemented.
A more realistic conservation planning problem is likely to involve many species, associated with a variety of dispersal kernels, a variety of source and target locations and a variety of habitat types that are present in the same landscape. Therefore, the next steps are to develop these computational methods so that they can be used to prioritise the needs of multiple species simultaneously. Conceptually, if we assume adequate data are available for all species, the methods to solve the problem are well established in systematic conservation planning: they require translating the conservation benefits for each species into the same 'currency' and then performing either a target-based prioritisation or a benefit maximisation (Moilanen, Wilson & Possingham 2009). However, data are not available for many species and are only approximately known for others. Therefore, it is important to investigate how robust optimisation results are to variation in the input data. High robustness can be useful because it can make decision-makers less concerned about uncertain parameters, and it can save computation time by allowing some species to be used as surrogates for others. For example, if the best solutions for a strong disperser and a weaker disperser in the same landscape are strongly correlated, then a single dispersal kernel can be used to represent a range of species dependent on the same habitat. However, high sensitivity (the opposite of robustness) is desirable when it reflects real biological variation: for example, we would not expect the same conservation plan to be optimal for species that are predominantly moving south-north and those moving east-west. The sensitivity of the optimal restoration solution to the choice of source and target locations and dispersal distance needs to be thoroughly tested, especially if using the approach of a 'generic species' to plan for all species reliant on a habitat.
Once sensitivity has been tested over a wide range of realistic species and landscape scenarios, some rules of thumb may emerge; but it could be that sensitivity itself depends on the conservation scenario, so that it needs to be tested in each application to find robust solutions. On a parallel front, more research is needed to improve computational efficiency so that many species can be included in one analysis, in cases where a few surrogate species will not give adequate results. So far, the model we use is a highly simplified version of a real range expansion process and is based on a simple metapopulation model. Assumptions of this model include that the extinction rate is negligible and that patches, once colonised, become fully occupied within one time step. As discussed in Hodgson et al. (2012), it is important to prioritise conservation for resistance to extinction as well as for rapid colonisation, and we suggest doing this by combining our methods with other established metrics of viability. Some other assumptions we have made here could be relaxed to make the method more flexible. For instance, it would be straightforward to replace the negative exponential dispersal kernel with any other distribution shape. If the dispersal process is impeded by barriers between some patches (and not others), it is still possible to solve for conductance by customising the resistance between patches. However, with iterative modification of the landscape, especially if habitat restoration breaks through a barrier, the recalculation of these customised dispersal probabilities between every pair of patches could be very time-consuming. Despite some limiting assumptions, our model is still more closely aligned to population biology than many reserve selection methods are, because the latter often implicitly assume that all habitats are occupied and use structural connectivity measures.
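The kernel substitution mentioned above amounts to making the kernel a parameter of the pairwise-conductance calculation. A minimal sketch, in which the Gaussian alternative, the area-weighted conductance formula and all function names are illustrative assumptions:

```python
import numpy as np

# Interchangeable kernel shapes as functions of distance (illustrative forms,
# not fitted to any particular species).
def neg_exponential(d, alpha):
    return np.exp(-alpha * d)

def gaussian(d, alpha):
    return np.exp(-(alpha * d) ** 2)

def pairwise_conductance(xy, areas, kernel, alpha):
    """Area-weighted pairwise conductances under an arbitrary dispersal kernel.

    Barrier effects between particular pairs of patches could be represented
    by scaling the corresponding entries of w before solving the circuit.
    """
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    w = np.outer(areas, areas) * kernel(d, alpha)
    np.fill_diagonal(w, 0.0)   # no self-conductance
    return w
```

Because the kernel only enters through this matrix, any distribution shape that can be evaluated at a set of pairwise distances can be dropped in without changing the rest of the analysis.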
In conclusion, conductance, flow and power as defined here are useful indicators with a mechanistic underpinning. They specifically identify the types of spatial pattern likely to facilitate range expansion, and thus they differ from many other connectivity indicators in the literature that sum or average individual link strengths. They have great potential to be used in practical landscape restoration planning, where effective prioritisation is very important because of the high costs of restoration and the competing demands on land.