- 1.1 I would split chapter 6 into input versus output (visualization). The former I would add to chapter 2, while the latter could be expanded a little with some of the material taught at useR! conferences. Yes, I see that there is chapter 9, but data output is not limited to map making. Chapter 2 should then be split into two: 2a would be just about vector and raster data, and 2b about coordinate reference systems, units and data input.
- 1.2 Unfortunately, the same (lucid and easy to read) cannot be said about the examples in part II. For example, the osmdata::getbb command on page 155 specifies an output format that is not known to the osmdata package, which in its latest version (as of Feb 2018) allows only for data.frame, matrix, string, or polygon. [RL]
The output format used by osmdata::getbb() was only available in the development version of osmdata; this was fixed on 2018-02-22 when osmdata 0.0.6 was sent to CRAN - see https://cran.r-project.org/package=osmdata
Todo: We will look into improving the other examples [RL]
- 1.3 After two Human geography examples, I am missing some physical geography in Part II. Please touch base with Edzer Pebesma and his group to see whether they would share an example or two. [RL, JN, JM - to discuss]
- 1.4 [RL, JN, JM - to discuss] I would leave these sections (5.2) where they are. The topic is important in and by itself and easier to find (even for lookup purposes if the book is used as a reference) if we leave it in section 5.2.
- 1.5 [RL, JN, JM - to discuss] (chapter 2) No, please leave it as is. Similar to ASDAR, I believe that it is pedagogically much easier to explain how GIS data is structured by looking at the code. Which is why I have been using ASDAR and would now use your book even in an Intro to GIS class.
- 1.6 [RL, JN, JM - to discuss] Transportation and raster don't really go together. Plus, there is the point you raise about length. I would rather you added a physical geography chapter in part II and had that one focus on raster operations. I am, for example, working on greenhouse gas emissions, and if one used that example, plus Pebesma's temporal extensions (spacetime, st, stars), one could have a very rich chapter.
- 1.7 [RL, JN, JM - to discuss] In general, I am concerned that you are trying to do too much. The book would already be good if you ended with chapter 9 (which would be 10 if you followed my advice of splitting chapter 2 into two). The bridges to GIS software are just the 'sugar on top'. Yes, I very much agree that a big data chapter would be more valuable than the raster/vector interactions. Chapters 11-15 are too much. I fear that this will cost you a lot of time, and by then you will have to rewrite parts of the other chapters again because the R world is spinning so fast. The stars package, for example, will open a whole new world of n-dimensionality and of working with non-local BIG data. Leave writing a GeoComp Applications book to others and get this one out fast! Then prepare for a second edition in 2019. ;)
- 1.8 [RL, JN, JM - to discuss] In the thank-you note to Josh O'Brien on page 101, the URL to the discussion on stackoverflow is missing.
- 1.9 [RL, JN, JM - to discuss] Please insert a code snippet that reproduces table 5.2 - it is a bit tedious to look this up interactively.
- 1.10 [RL, JN, JM - to discuss] I love section 5.3 on clipping; here you do exactly what I am missing in part II of the book: you are providing the code that shows how to create the figures in each chapter.
Chapter 7 [RL]:
- 1.11 It would be nice if the code for generating figures were inserted immediately before those figures show up in the text. I know you promise to show how to do this in chapter 9, but this is really frustrating. So, please insert the code here and add a note that the detailed instructions will be expanded on in chapter 9.
- 1.12 The rail example on page 166 is not very convincing: lots of effort for very little outcome.
- 1.13 The use of SpatialLinesNetwork on page 168 is incorrect; the code does not work as indicated.
Chapter 8 [JM]:
- 1.14 Location analysis and geomarketing are not the same! I like the comparison between (landscape) ecological analysis and location analysis, though. - After we download and show the contents of the file on page 175, it would be good if you translated the displayed terms, which are in German.
- 1.15 As in chapter 7, it would be nice to have the code showing how the figures are created inserted here.
- 1.16 At the end of the reversed geocoding on page 182, I would add a View(coords) command to see the coordinates derived.
- 1.17 The translation into metro names does not work as depicted. I get neither Leipzig nor Velbert.
- 1.18 Quite frankly, section 8.5 is not convincing - even if it were working, it is a lot of effort for little outcome. Plus, it does not add to the purpose of this particular chapter (location analysis). I recommend simply dropping section 8.5.
- 1.19 Do you realize that you make readers download some 2 GB of data in this little script on page 183?! That is not fair! - And then, why are you translating the resulting list into a raster object? Using a field representation makes no sense here; besides, it is very wasteful, creating a large raster with lots of NAs (empty space). This might make sense in a parallel landscape ecological application but does not fit an audience interested in location/allocation analysis.
Chapter 5:
- 2.1 [RL] Page 98, 2nd paragraph – I didn't understand why 'geosphere::alongTrackDistance' was used to calculate the 1 degree distance, instead of 'geosphere::distGeo' as on the previous page.
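For reference, a minimal sketch of the suggested alternative (illustrative equator coordinates, not the book's data):

```r
# distance spanned by 1 degree of longitude at the equator, using distGeo()
# (points are lon/lat pairs; the result is in metres)
library(geosphere)
distGeo(c(0, 0), c(1, 0))  # ~111320 m
```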
- 2.2 [RL] Page 101, 1st paragraph – UTM zones are also divided by latitude, not just by longitude, with the most commonly used division being into north (N) and south (S). This should be mentioned to avoid confusion. For example, the 'rgdal::make_EPSG' list has 120 projections matching the description 'WGS 84 / UTM zone', rather than 60, because each of the 60 zones has a separate north and south variant.
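The count can be verified with a quick sketch (exact numbers depend on the installed EPSG tables):

```r
# count EPSG entries whose description matches the WGS 84 UTM family
library(rgdal)
epsg <- make_EPSG()
sum(grepl("WGS 84 / UTM zone", epsg$note))  # 120: 60 zones x {north, south}
```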
- 2.4 [JN] Page 108, Table 5.1 – Perhaps the heading should say this applies to the specific 'cat_raster' example. The current heading may be interpreted as though the table applies to any given NAD83 and WGS84 rasters.
- 2.5 [JN] Page 109, Table 5.2 – Same as previous comment.
- 2.6 [JN] Page 112, 4th paragraph – It should be mentioned what algorithm is used by the 'st_point_on_surface' function. For example, is there a random component or is it deterministic?
- 2.7 [RL] Page 113, 1st code section – Perhaps it was mentioned elsewhere; if not, text should emphasize that the distance arguments are given in CRS units, or as 'units' objects.
- 2.8 [JN] Page 114, 1st code section – Better to use 'st_geometry(nz)' rather than 'nz$geometry'. The latter is specific to the case when the geometry column is named "geometry", the former is general for any 'sf' layer.
- 2.9 [RL] Page 118, 2nd paragraph – Seems like there is missing text, as the paragraph starts with "There are two different ways to subset points..." but I can't find where these ways are described.
- 2.10 [RL] Page 118, last paragraph – The paragraph and following code section (on aggregation with 'dplyr') are repeated on the next page, so should be omitted here.
- 2.11 [JN: This is due to our decision to avoid the use of exponential notation.] Page 119, Figure 5.14 – The color key in this figure refers to population, but units are in "mln", which at first glance can be confused with "min", e.g. "0 mln to 20 min". Why not use exponential notation, e.g. "0×10^6 – 20×10^6"?
- 2.12 [JN] Page 119, 3rd paragraph – Perhaps it would be better to start Section 5.3.7 with an introduction on what geometry transformations are useful for. The next page gives one example – splitting a MULTI* geometry so that separate attributes can be given to each individual part. Another very common use is transforming a series of points into a LINESTRING, as shown on the left in Figure 5.15. For example, this is used on chronologically ordered point observations of a moving object, person or animal, such as GPS device recordings or geo-referenced social media posts. If there is room, maybe another example can be added where a collection of points referring to different objects is combined into a MULTIPOINT per object, then cast to LINESTRING geometries per object.
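A minimal sketch of the points-to-LINESTRING case described above (made-up coordinates standing in for ordered GPS fixes):

```r
library(sf)
# three chronologically ordered point observations
pts <- st_as_sf(data.frame(x = c(0, 1, 2), y = c(0, 1, 0)),
                coords = c("x", "y"), crs = 4326)
# combine into a MULTIPOINT, then cast to a single LINESTRING "track"
track <- st_cast(st_combine(pts), "LINESTRING")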
- 2.13 [RL, JN, JM - to discuss] Page 123, code sections – The code sections on this page mix 'sf' and 'raster' objects in the same expressions. However, currently, 'raster' (version 2.6-7) is officially only compatible with 'sp' objects. Thus, for example, the fact that the function 'rasterize' works with the 'sf' object named 'us_states' can be viewed as a coincidence. The problematic nature of this situation is unmentioned as far as I can see. Unless the compatibility is clearly discussed (i.e. why does this code work? In which other cases will using 'sf' objects in a 'raster' function not work?), less experienced readers may be under the impression that 'sf' and 'raster' are undoubtedly compatible, and will be surprised when encountering different cases. In my opinion it would be even better to avoid the confusion altogether by converting an 'sf' object to 'sp' each time it is used in conjunction with a 'raster' function, using 'as("Spatial")'. Though this makes the code a little longer and less elegant, it emphasizes the fact that 'raster' is currently compatible only with 'sp' and gives the reader a clear universal rule, i.e. always convert to the 'sp' data structure when using a vector layer inside a 'raster' function. Relying on incidental compatibility between 'raster' and 'sf' is also a little more dangerous in terms of keeping the code sections in the book up-to-date. If the underlying structure of these two packages changes, any undocumented compatibility may disappear; for documented methods this is less likely. (Unless there is something I'm missing here – such as the intention of the 'raster' authors to maintain the above-mentioned compatibility and make it officially documented in the future.)
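To make the proposed rule concrete, a self-contained sketch with a toy polygon (the explicit as(…, "Spatial") step is the point; the layer stands in for 'us_states'):

```r
library(sf)
library(raster)
# toy one-square polygon layer
poly_sf <- st_sf(value = 1, geometry = st_sfc(st_polygon(list(rbind(
  c(0, 0), c(1, 0), c(1, 1), c(0, 1), c(0, 0))))))
template <- raster(extent(0, 1, 0, 1), nrows = 10, ncols = 10)
poly_sp <- as(poly_sf, "Spatial")                   # explicit sf -> sp
r <- rasterize(poly_sp, template, field = "value")  # documented sp method
```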
- 2.14 [JM] Page 125, 1st paragraph – I thought the introduction to section 5.4 mixes several concepts and thus may be unclear. First, georeferencing with ground control points and orthorectification (the first two bullet points on page 125) are very common in spatial analysis, especially in remote sensing; however, they are usually done in desktop GIS software such as ArcGIS or ERDAS. The process of locating control points, choosing the right transformation model, and evaluating the registration error is inherently interactive and thus unsuitable to be done with R, or any other programming language for that matter. Along with digitizing new data, this is perhaps the most notable example of a spatial analysis step that does not fit into an automated non-interactive work-flow. I think this needs to be mentioned to give a fuller picture of R's strengths and limitations in spatial analysis. Second, section 5.4 goes on to review diverse raster geometry manipulations, such as changing the extent, aggregation, resampling and vectorization. Perhaps a short introduction can present these operations and say when they are necessary (though some examples are given later on, such as resampling to align different rasters prior to spatial analysis).
- 2.15 [JM] Page 127, 1st paragraph – Should be more specific: moving towards (0,0) starting from a cell center? Or cell corner – if so which one?
- 2.16 [JM] Page 128, footnote – An example of temporal aggregation can also be placed in this section. Even repeating the example from the docs of function 'raster::stackApply' can be useful to demonstrate the point.
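A sketch along the lines of the 'stackApply' documentation (random data; six "monthly" layers aggregated into two groups):

```r
library(raster)
r <- raster(nrows = 10, ncols = 10)
# a stack of six layers with random values, standing in for monthly data
s <- stack(lapply(1:6, function(i) setValues(r, runif(ncell(r)))))
indices <- rep(1:2, each = 3)            # layers 1-3 -> group 1, 4-6 -> group 2
means <- stackApply(s, indices, fun = mean)
```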
- 2.17 [JM] Page 129, 1st paragraph – The fact that bilinear disaggregation gives back the original raster values was not intuitively clear to me. I even felt the need to run the code and see that it is indeed correct (and it was). In practice aggregation usually leads to information loss, and the original raster can no longer be reconstructed. Perhaps replace this example with a more realistic one, such as using a raster with random values.
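A sketch of the suggested replacement, using random values to show that aggregation is generally not reversible:

```r
library(raster)
r <- raster(nrows = 4, ncols = 4)
values(r) <- runif(ncell(r))
r_agg <- aggregate(r, fact = 2, fun = mean)                  # 4x4 -> 2x2
r_dis <- disaggregate(r_agg, fact = 2, method = "bilinear")  # back to 4x4
all.equal(values(r), values(r_dis))  # generally not TRUE: information was lost
```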
Chapter 7 [RL]:
- 2.18 Page 156, 2nd code section, page 158, 1st code section and other places – For uniformity in using the pipe operator, perhaps 'bristol_od %>% group_by(o)' is preferable to 'group_by(bristol_od, o)'. In other words, if piping is being used, why start it only from the 2nd expression rather than from the 1st? This appears in many other places in the book.
- 2.19 Page 156, 3rd paragraph – May be interesting to mention how trip data are collected, in addition to specifying the source, for those not familiar with this. For example, are the data based on a survey? If so, how many people comprised the sample in each zone?
- 2.20 Page 161, 1st paragraph – The authors mention computational intensity and storage as disadvantages of routing on a local machine. This is a secondary consideration; I'd say the primary disadvantage is that for representing (1) complex network relations, such as one-way streets or turn restrictions, (2) temporal dynamics of speed reduction due to traffic, and (3) differences due to mode of travel (car, public transportation, etc.), it is difficult both to obtain the underlying data and to implement them in a routing solution (e.g. requiring specialized software such as 'pgRouting' and PostGIS). Global coverage is mentioned but is less relevant for routing on a local machine, because most studies or applications that require routing are done at local scales anyway.
- 2.21 Page 164, 2nd paragraph – The analogy between a road network and a graph should perhaps be given in a more formal way, for example using an illustration with a schematic graph representation on the one hand (using the terms vertex, edge and edge weight) and a geographical map representation on the other (using the terms roads, bus stops and geographical distance, respectively).
- 2.22 The chapter is good as is, but if there is room then adding a raster-based application would be beneficial. However, I am not sure I understand your proposal on the cycle parking places raster. Cycle parking places are a point phenomenon, so making a surface out of them seems unnatural. An alternative suggestion is to calculate an accessibility or travel time raster using the 'SpatialLinesNetwork' object (see the rough sketch below). Perhaps this can be done in a loop going over all raster cells, finding the shortest path from the nearest node to the chosen destination each time. The result may not look pretty if the road network is too simple, but this can make the analysis more applicable and appealing. Travel time rasters and derived isochrones are common in GIS analysis and will be familiar to many readers who find interest in Chapter 7.
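A very rough sketch of the suggested loop, assuming the chapter's 'SpatialLinesNetwork' object ('sln'), a hypothetical destination node ('dest_node'), and stplanr's 'find_network_nodes' helper:

```r
library(raster)
library(stplanr)
r <- raster(extent(sln@sl), res = 500)  # template raster over the network extent
xy <- xyFromCell(r, 1:ncell(r))
for (i in 1:ncell(r)) {
  # snap the cell centre to the nearest network node
  node <- find_network_nodes(sln, xy[i, 1], xy[i, 2])
  # shortest path from that node to the chosen destination
  route <- sum_network_routes(sln, node, dest_node, sumvars = "length")
  r[i] <- route$sum_length  # network distance as a travel-time proxy
}
```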
Chapter 8 [JM]:
- 2.23 Page 175, bottom – The content alignment seems to be wrong, was this supposed to be a table?
- 2.24 Page 179, 1st code section – Here and later on in the chapter, 'map2' from package 'purrr' is being used to iterate a function over an object. These two places are the only times 'purrr' is mentioned in the book, and as far as I can see the 'mapply'-like approach is not really used in any other chapter. This is a potential source of frustration for less experienced readers (a toy comparison of the two styles follows the points below):
- Unless the reader has a few years of R coding experience, the whole concept of 'mapply' cannot be grasped without a detailed, gradual introduction. However, is it really necessary in a book on geocomputation? To the beginner, 'for' loops are much more intuitive and easily explained. With 'for', the reader can also see what's going on inside the code much more easily, for example by running the code inside the 'for' loop after manually setting i=1, i=2, etc. I understand the desire to demonstrate powerful techniques. I would use the same in my own analysis. But the point of view of the reader needs more thought – is it worth the risk of getting 50% of readers stuck and losing their grasp of the material? If the 'mapply' concept is used regardless (instead of a 'for' loop), it really needs careful thought and a much more detailed introduction. Otherwise, I would consider modifying the iterative examples in this chapter into 'for' loops. When the reader gains a deeper understanding of iteration – perhaps from general R programming books – he can naturally shift from using 'for' to using 'mapply' in his geocomputation work-flow, at his own pace.
- In case the authors still prefer the 'mapply' approach (and hopefully provide a detailed introduction for beginners), it is unclear why the 'purrr' package is introduced rather than 'mapply' or 'Map' from base R. Again, any complication needs to be justified, or else the simpler approach should be preferred. The fact that 'purrr' has certain advantages is irrelevant unless they are demonstrated or at least discussed in the book.
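To make the contrast concrete, a toy comparison of the two iteration styles (not the book's code):

```r
library(purrr)
a <- list(1, 2, 3)
b <- list(10, 20, 30)

# purrr: apply a two-argument function element-wise over two lists
res_map2 <- map2(a, b, function(x, y) x + y)

# the equivalent 'for' loop, easier for beginners to step through
res_for <- vector("list", length(a))
for (i in seq_along(a)) res_for[[i]] <- a[[i]] + b[[i]]

identical(res_map2, res_for)  # TRUE
```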
- 2.25 Page 180, 3rd code section – The use of 'st_union', 'st_cast' and 'st_intersects' and the complication of manually deleting a polygon can be avoided altogether by using 'clump' from package 'raster'. Namely, one can execute 'clump(pop_agg[pop_agg > 500000, drop = FALSE])' followed by 'rasterToPolygons', which will dissolve individual cells by clump ID.
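Spelled out, the suggested alternative (using the book's 'pop_agg' raster; 'clump' additionally requires the 'igraph' package):

```r
library(raster)
# label contiguous cells above the threshold, NA elsewhere
clumped <- clump(pop_agg[pop_agg > 500000, drop = FALSE])
# one polygon per clump, dissolving individual cells by clump ID
polys <- rasterToPolygons(clumped, dissolve = TRUE)
```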
- 2.26 Page 184, 1st code section – The same comment from above on the 'mapply' function applies to 'reduce' as well. Even more so, because 'reduce' is much more rarely used in practice and thus will be even less familiar. The majority of readers will probably not be familiar with the function or the whole concept, so it needs an explanation if used. And if it is, again – I don't see the advantage of introducing 'reduce', as well as the external package 'purrr', over the more familiar 'do.call' and 'rbind'.
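A toy illustration of the equivalence referred to above:

```r
library(purrr)
dfs <- list(data.frame(x = 1), data.frame(x = 2), data.frame(x = 3))
reduce(dfs, rbind)   # purrr version, as used in the chapter
do.call(rbind, dfs)  # base R equivalent, familiar to more readers
```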
- 2.26 Page 185, 1st paragraph – Maybe I missed something, but I am not sure I follow the rationale of having a high count of bike shops contribute to the score of a desirable shop location. Shouldn't shops be placed where there are fewer existing shops, reducing competition for customers, thus giving higher weight to cells with a lower shop count?
Other:
- 2.27 [RL, JN, JM - to discuss] One idea that comes to mind in terms of additions is to try adding a chart, "cheatsheet" or infographic of the various spatial analysis methods covered in this book, perhaps with thumbnails illustrating the types of operations and tasks, referring to the corresponding book sections. In other words – making a visual index. The book, as it seems, will become the most comprehensive textbook on spatial analysis in R. Supplementing it with some kind of visual index summarizing the ecosystem of currently available tools can bring additional value, especially to those using the book as an introduction to the field. I understand this requires some thought and may or may not eventually be feasible, so this is just a thought.
- 2.28 [RL, JN, JM - to discuss] In case the authors decide to add a third chapter to part II, considering existing R-Spatial resources, I would suggest one of the following – (1) Species distribution modeling – for example, collecting environmental variable rasters, aligning them into a single multi-layer dataset, fitting a logistic regression (or a different model), making predicted distribution maps and evaluating them. (2) Using OpenStreetMap data – though this is mentioned in Chapter 8, there are many more details that can be covered, while introducing the capabilities of package 'osmdata'. (3) Remote sensing image analysis – e.g. calculating spectral indices, unsupervised and supervised classification, object detection, etc. (4) Social network data analysis – such as analysis of geo-referenced Twitter and Flickr posts, for example: evaluating user mobility and speed based on chronologically ordered posts, spatio-temporal social network activity patterns, social ties occurring across spatial boundaries, detection of textual references to place, event detection, etc.
- 2.29 [JN] Page 27, last paragraph – Are you sure that "Simple features in R can take on one of the 17 geometry types..."? I thought that just the 7 most common ones are supported in R at the moment. For example, I am not aware of constructor functions (such as 'st_point') for the other geometries according to the 'sf' documentation.
- 2.30 [JM] Page 38, 2nd paragraph - "a raster cell can only hold a single value" – this may be confusing, as multi-band rasters can be thought of as rasters where a cell holds multiple values.
- 2.31 [JM] Page 51, last paragraph – In a 'raster' object in R, the geometry is separate from values, too. For example, you can have a raster with no values (created from scratch using the 'raster' function), just with the pixels "geometry", same as for a vector layer.
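A minimal sketch of the point being made:

```r
library(raster)
r <- raster(nrows = 10, ncols = 10)  # grid "geometry" only: extent, resolution, CRS
hasValues(r)                         # FALSE - no cell values yet
```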
- 2.32 [JN] Page 53, last paragraph – It is true that the "geometry" column does not interfere with sf-related operations, but it can indeed cause problems when working with data.frame-related functions. For example, in case we want to use 'apply' or 'colMeans' to derive some statistics on numeric columns, it is not enough to subset those columns, since the geometry column is "sticky" and, not being numeric, will interfere with the applied function. In such a case, before using 'apply' or 'colMeans' it is necessary to remove the "geometry" column (e.g. using 'st_set_geometry' with 'NULL').
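A sketch of the described workaround, using sf's built-in North Carolina dataset:

```r
library(sf)
nc <- st_read(system.file("shape/nc.shp", package = "sf"), quiet = TRUE)
# colMeans(nc[, c("BIR74", "BIR79")])  # fails: the sticky geometry column remains
# drop the geometry first, then apply the numeric summary:
colMeans(st_set_geometry(nc[, c("BIR74", "BIR79")], NULL))
```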
- 2.33 [RL] Page 74, 2nd and 3rd code sections – 'sparse=FALSE' in 'st_intersects' (and in similar logical operators) leads to the function returning a pairwise 'matrix'. However subsetting using the '[' operator requires a vector. The expressions 'nz_height[sel, ]' and 'nz_height %>% filter(sel)' thus only work as intended in the presented special case, where the second layer ('canterbury') has only one feature, in which case the matrix is correctly coerced to a vector. Though it is noted on the next page, to avoid confusion I would specify that the returned object is a 'matrix' already on page 74, and add another expression explicitly transforming it to a vector, such as 'sel=sel[,1]'
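The suggested explicit coercion, sketched with the object names from the book:

```r
library(sf)
sel <- st_intersects(nz_height, canterbury, sparse = FALSE)
class(sel)       # "matrix" - one column per feature of 'canterbury'
sel <- sel[, 1]  # explicit matrix -> logical vector
canterbury_height <- nz_height[sel, ]
```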
- 2.34 [RL] Page 78, 4th paragraph – "...applies only to the second object" – shouldn't this say "...applies only to the first object"?
- 2.35 [RL] Page 80, 1st code section – As noted above, using 'st_geometry' is more general and thus perhaps preferable to '$geometry'.
- 2.36 [RL] Page 83, 1st code section – A graphical summary of what kind of data should be aggregated with 'extensive=TRUE' or 'extensive=FALSE', and how the resulting layer is calculated, may be useful for quick reference.
- 2.37 [RL] Page 84, 1st code section – One more thing that should be noted is that the (great circle) distance is returned in meters, even though the input data are in geographic projection, thus associated with coordinates in degree units. This is unlike, for example, function 'gDistance' in 'rgeos' which always returns distances in CRS units.
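A toy check of the behaviour described:

```r
library(sf)
p <- st_sfc(st_point(c(0, 0)), st_point(c(1, 0)), crs = 4326)
st_distance(p[1], p[2])  # ~111320 [m]: metres, despite degree coordinates
```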
- 2.38 [JM] Page 86, 1st paragraph – "... that both functions also accept objects of class SpatialObjects and sf."
- 'SpatialObjects' is not a class name, perhaps better to say 'Spatial* objects'?
- At the time of writing, the 'extract' function has no documented method for 'sf' layers, thus the fact that the expression works may be viewed as a coincidence and should not be hard-coded in the book (see also the similar comment above)
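The safe pattern implied above, as a self-contained sketch:

```r
library(raster)
library(sf)
r <- raster(nrows = 10, ncols = 10, xmn = 0, xmx = 1, ymn = 0, ymx = 1,
            vals = 1:100)
pts <- st_as_sf(data.frame(x = c(0.25, 0.75), y = c(0.25, 0.75)),
                coords = c("x", "y"), crs = 4326)
extract(r, as(pts, "Spatial"))  # documented Spatial* method, no reliance on
                                # undocumented 'sf' support
```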
- 2.39 [JM] Page 87, 1st code section – 'rmask[rmask == FALSE] = NA' can be replaced with the simpler 'rmask[!rmask] = NA'
- 2.40 [JN] Page 139, 1st paragraph – I believe that the GeoPackage format can store both vector and raster layers.
- 2.41 [RL, JN, JM - to discuss] The topic of "raster-vector interactions" is fundamental to the geocomputation "toolbox". It definitely should not be replaced by a different chapter. Having said that, a chapter on big data is also very important and I would recommend including it if time and book length permit. Such a chapter does not need to be long, since it is less related to R. It may include a general introduction mentioning the current directions (e.g. cloud computing, Google Earth Engine, etc.), followed by some applicative examples – such as an introduction to working with databases in R, along with examples of sending queries, say, to a PostGIS database.
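For instance, a minimal sketch of the kind of database example suggested (connection details and table name are hypothetical, assuming sf's DBI support):

```r
library(DBI)
library(RPostgres)
library(sf)
# hypothetical PostGIS connection
con <- dbConnect(Postgres(), dbname = "geodb", host = "localhost",
                 user = "user", password = "pass")
# push the query to the database, read the result as an sf layer
roads <- st_read(con, query = "SELECT * FROM roads WHERE type = 'primary';")
dbDisconnect(con)
```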
- 3.1 [JM] Page 96 – The line "Aligned raster objects share the same header information" seems to refer to certain formats which do have "a header" (the offset, pixel size, number of pixels, and CRS), but the key point is that alignment here (for raster algebra) means the rasters occupy the same space: they are the exact same shape, in the same coordinate system, with the same alignment (pixel 1 starts at a specific location), the exact same number of pixels, of the exact same dimension. So there's a one-to-one correspondence between pixels, allowing "map algebra" as calculations between raw numbers. Note that alignment can also mean pixels that overlay perfectly but come from grids of different extents. I think this is an important nuance in the meaning of "alignment", whereas "shape" gives a better sense of the overall one-to-one correspondence, and is used (for example) both by NetCDF and by TensorFlow (p 38, section 2.3.4, Chollet and Allaire (2018) Deep Learning with R). (The later section 5.4 does cover the distinction well.)
- 3.2 [JN] Page 180 – This section on raster reprojection is very good, and includes nice options for non-legacy custom CRS, but it does leave us with the question "why reproject a raster to this?". I think it would benefit from a complementary example for when raster data do come in a map projection and it is far better to transform our vector data to that CRS, rather than warp the raster to something else. This is what raster::extract(x, y) does, by the way: it will transform vector data 'y' to the CRS of raster 'x', and it is for the most part lossless and very much better than the ill-advised but quite often seen extract(projectRaster(x, projection(y)), y).
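The contrast described above, sketched with hypothetical objects ('r' a raster, 'v' a vector layer in a different CRS):

```r
library(raster)
good <- extract(r, v)  # extract() transforms 'v' to the CRS of 'r' internally
bad  <- extract(projectRaster(r, crs = projection(v)), v)  # lossy warp first
```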
- 3.3 [JN] Section 5.3.2 is a good discussion of the need for "point_on_surface". It could add that there is not just one kind of centroid, as sf kind of suggests; sf offers the "weighted centroid" and applies a geodetic modification for longlat data, but not for other CRS. There is also the box centroid (centre of the bounding box), the enclosing circle centroid, and other kinds which aren't available but could be calculated using sf and related tools.
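A sketch of two of the alternatives mentioned, computed with sf on a toy L-shaped polygon:

```r
library(sf)
poly <- st_sfc(st_polygon(list(rbind(c(0, 0), c(4, 0), c(4, 1), c(1, 1),
                                     c(1, 3), c(0, 3), c(0, 0)))))
st_centroid(poly)                       # area-weighted centroid
st_centroid(st_as_sfc(st_bbox(poly)))   # "box centroid": centre of bounding box
st_point_on_surface(poly)               # guaranteed to lie on the polygon
```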
- 3.4 [RL, JN, JM] There doesn't seem to be any mention of the fact that raster and sf are generally not compatible. Recent versions of raster do provide some interchange; for example extent(), extract() and rasterize() do understand sf (but raster() itself does not). That might be worth mentioning, at least as it is an unfinished and un-communicated area of instability.
- 3.5 [RL] Section 5.2.1 – The UTM advice needs reworking, though it suffers from some nonsensical (IMO) policies in the sf package. UTM is not a very good thing to advise; it's not a well-suited projection in many situations, it simply is used a lot by local authorities because it was easy to compute and unfortunately was applied almost universally in pre-computer times. This has unfortunately translated into modern advice on the internet to "use a projection" and "here's how to choose the right UTM for your location". It should be: "choose the right instance of a family of projections with the right properties" (equal area, equidistant, conformal, or some combination of compromises of those), "specify a local parameterization of that family suitable for the region of interest", "use multiple coordinate systems if no single one suits, or fall back to geodetic calculations for various aspects". The same caveat as given for AEQD applies to UTM; it completely depends on what the application is: "results may not be accurate when used on extensive datasets covering hundreds of kilometers". I think the general advice here should be:
- use LAEA for a custom local projection (set lon_0 and lat_0 to your centre); it will be equal-area at all locations, but shape will distort beyond thousands of kilometres
- use AEQD for a specifically super-accurate straight-line distance between a point and the centre point of the local projection
- use LCC for regions covering thousands of kilometres, the cone can be set to keep distance and area properties reasonable between the secant lines
- use STERE for polar regions, but be careful not to rely on area and distance calculations thousands of kilometres from the centre. (The secant lines and standard parallels are extra detail that makes things hard for a new user, so I don't tend to default to a conic projection for exactly this reason.) There's simply no way to concisely couch this advice for any application; it's better (IMO) to point out the complexity of the problem and the need to understand that distance, area, and conformality are not constant in any projection. I don't suggest the book actually go into this kind of detail, but please modify the "UTM is a sensible default" advice; it is bad advice. If we are worried about a custom PROJ.4 string not being compatible with other data, we should be just as worried that this kind of code has absolutely no guarantee of making sense if x and y use different CRS, and no standard package in wide use is of any help here. Manifold's description of UTM is as good as any other, and more critical than many: http://www.manifold.net/doc/mfd9/universal_transverse_mercator_projection.htm. There's actually no need to choose a single projection, or a single non-projection, at all; we can now easily transform between them for different purposes.
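A sketch of the "parameterize a family" advice above (centre coordinates are hypothetical; 'region' stands for the user's own layer):

```r
library(sf)
centre <- c(147, -42)  # lon_0, lat_0 for the region of interest
crs_laea <- paste0("+proj=laea +lon_0=", centre[1],
                   " +lat_0=", centre[2], " +datum=WGS84 +units=m")
# region_laea <- st_transform(region, crs_laea)
```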
- 3.6 [RL] tiny typo, chapter 5, “have are of constant”
- 3.7 [RL] page 95 “lwgeom is also used, but does not need to be loaded” technically should say “does not need to be attached” (this is used throughout the book, and technically “loading” is the minimal case, “attaching” is the full case - pkg::fun will load, library(pkg) will load and attach)
- 3.8 [RL] “interact with a modify”
- 3.9 [RL] In 7.4 you might add that "as the crow flies" is another parochial synonym for "desire line", like "bee line".
- 3.10 [RL] Chapter 7 is a little verbose, but that’s understandable given the domain-specificity.
- 3.11 [RL] 7.7 typo “loosing the geographic attributes.”
- 3.12 [RL] 7.7 typo “This is can be done using the sum_network_routes()”
- 3.13 [RL] This important sentence should be said at the start of Chapter 7: As shown later in section 7.8 … “Bike shops may benefit from new cycling infrastructure, demonstrating an important feature of transport systems: they are closely linked to broader social, economic and land-use patterns.”
- u.1 Last (command) line on page 104: please add a line calling plot(world_laea2). The result is much prettier than what is printed in the book.
- u. Besides: when I run str(con_raster_ea), I get different results. [JN: I CANNOT CONFIRM THIS.]
- u.2 The last line on page 111 ends with a colon but there is nothing to follow on the next page. [JN: I WAS UNABLE TO LOCATE IT.]
- u.3 The two code lines on top of page 162 require the lwgeom package. In chapter 5, this was properly noted but here in chapter 7, it is missing. [JN: I AM UNABLE TO LOCATE USE OF LWGEOM IN CH7.]
- f.1 Not an error but an inconsistency: the switch from the assignment operator <- to the equals sign from section 2.3 onward - see https://github.com/Robinlovelace/geocompr/commit/bdd0606a7c141ec931c6d20aece3e9e85ee426c7 and https://github.com/Robinlovelace/geocompr/commit/cb4116e99288a0a5e18adaa85a4b34b110b5bc3b
- f.2 Page 131, Figure 5.24 – The figure does not seem to match the code, maybe this is intentional. The following simple code is an alternative where a nice figure is obtained and the code is not complex: 'plot(dem, axes = FALSE); contour(dem, add = TRUE)'
- f.3 Page 44, 3rd paragraph – "Earth’s service" should be "Earth’s surface"?