note: 1. privileged to talk, though I will be overshadowed by my predecessor! 2. first visit; I would like to join future talks 3. a general introduction to the parts of my work that might be of interest to you, then open up and focus the discussion on specific topics 4. learn from you! I have already received substantial support from IT. 5. comments on best practices welcome
note: self-taught coder; I know enough to know I'm quite rubbish as a coder.
- Postgres + PostGIS
- ArcSDE (with pg_geometry) note: reasons for this setup: get the best of both worlds - Esri editing tools and library, and SQL as the standard way of storing and manipulating data. The Esri way of doing things vs the way people outside GIS do things
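A minimal sketch of the "SQL way" of querying the same spatial data directly in PostGIS via psycopg2; the connection details and the wh_sites/countries tables are hypothetical placeholders, not the actual schema.
import psycopg2  # PostgreSQL driver

# placeholder connection details
conn = psycopg2.connect(dbname='gisdb', user='gis', host='localhost')
cur = conn.cursor()

# count site polygons intersecting each country (hypothetical tables/columns)
cur.execute("""
    SELECT c.name, count(*)
    FROM countries c
    JOIN wh_sites w ON ST_Intersects(c.geom, w.geom)
    GROUP BY c.name
    ORDER BY count(*) DESC;
""")
for name, n in cur.fetchall():
    print(name, n)

cur.close()
conn.close()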
- pick the right coordinate system
- overlay analysis
- calculate stats -> st_intersects
- macro: feature class or layer -> arcpy.Intersect_analysis()
- micro: feature -> arcpy.da.SearchCursor(), arcpy.Geometry() (see the sketch after these notes)
- esri spatial analyst
- convert raster to a numpy array -> numpy, scipy, pandas and matplotlib
note: spatial analysis is expensive; design the analysis at a finer grain so that future analyses involving non-spatial data run faster
note: currently one core only; GIS has been slow to catch up - massive under-utilisation of memory and computing resources
note: easy to debug and keep it simple
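A rough arcpy sketch of the macro vs micro approaches and the raster-to-NumPy hand-off noted above; the workspace, feature classes and field names are hypothetical, and the calls assume a standard ArcGIS 10.x arcpy install.
import arcpy
import numpy

arcpy.env.workspace = r'C:\data\parks.gdb'  # hypothetical workspace

# macro: whole feature classes in, one feature class out
arcpy.Intersect_analysis(['wdpa', 'countries'], 'wdpa_countries')

# micro: iterate over individual features and work with geometries directly
total_area = 0
with arcpy.da.SearchCursor('wdpa', ['SHAPE@', 'wdpaid']) as cursor:
    for shape, wdpaid in cursor:
        total_area += shape.area

# hand a raster over to the scientific Python stack (numpy/scipy/pandas/matplotlib)
array = arcpy.RasterToNumPyArray('landcover_2010')
print(array.shape, numpy.unique(array))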
import os
import arcpy

# mapdocument, wdpalist, exportfolder, layer_index, df, reso and
# GetFieldValueByID_mk2 are defined elsewhere in the script
mxd = arcpy.mapping.MapDocument(mapdocument)
for each in wdpalist:
    # output path for this site's map
    exportjpg = os.path.join(exportfolder, str(each) + '.jpg')
    # show only the current site in the index layer
    query = '"wdpaid" = ' + str(each)
    layer_index.definitionQuery = query
    # set dataframe coordinate system from the site's UTM zone
    sr = arcpy.SpatialReference()
    sr_string = GetFieldValueByID_mk2(layer_index, each, value_field='utm')
    sr.loadFromString(sr_string)
    df.spatialReference = sr
    arcpy.mapping.ExportToJPEG(mxd, exportjpg, "PAGE_LAYOUT",
                               resolution=reso, jpeg_quality=60)
note: map batcher for 7,000 maps. Originally VBA (buggy); re-wrote it in Python.
note: flexible library
note: seen the power; cannot agree more; embrace it or risk losing my job
note: For the past two years I have been putting a lot of thought into this, and I keep asking myself what it is that we want to achieve at the end of the day: I want to create the best possible analyses/reports, I want people to use them and influence policy and change behaviour, ultimately to better conserve the planet's most outstanding places.
note: I have had the privilege of working on various projects contributing to IUCN's e, m, upstream
the reality: a) we painstakingly compile a lot of biodiversity observation data, sometimes modelled, abstruse data; b) we turn these data into even more abstruse scientific analyses and thick papers; c) our audience find them hard to understand and use -> we didn't achieve the impact:
a) is the work not good enough? b) or is it just not packaged well enough - are we making their life difficult?
between generation and uptake of knowledge products
note: a) KL is what I hope will be the initiative, the vehicle, to carry that thinking and address these problems
b) to bridge the gap between the production of knowledge products and their consumption, or at least better facilitate it - in order to be more impactful.
c) I want to shift the delivery of such analytical work towards being 1) easy to access and understand 2) more public-facing and communication-oriented
in communicating analytics and digital tools
note:
a) to better reach the audience by tapping into the opportunities it offers: 1. direct and ubiquitous access via the internet 2. interactive and engaging 3. dynamic and agile, i.e., quick to modify and improve
b) reports are less appealing: if I, as the author, can't be bothered to read my own 70-page work, I find it hard to convince other people to read it.
c) five-minute attention span. Fundamentally there is too much information and too little time - people are busy, swamped by the information around them, spoiled by the influx of a wealth of information.
for all digital products in the making
note: 1) physical manifestation.
2) single entry point for all products, making it easier to navigate
3) initially all MAVA deliverables under activity 1.2, but it could become a common place for any work-in-progress digital products in the future
for improvement and future development note: a step at a time. Incremental improvement (as opposed to disruptive innovation) needs direction. A platform to gather feedback and solicit good ideas.
note: design principles; for KL itself but also for each knowledge product in the lab
(and do less!)
note: 1. it is not rocket science, nor even research, that we do - we risk losing touch - no need to be complicated and full of jargon.
2. it is about what we do, but also about what we don't do -> be conscious of resources. a) Don't reinvent the wheel. b) Do less, but do it very well.
3. be conscious of limited resources; one thing at a time; set ambitious goals but with achievable, modest aims
link, provide services, and extend beyond WH
note: with future extensibility in mind.
- intrinsically not in isolation - connections to existing, more established KPs are imperative
- easily extensible with additional functionality: modern architecture, web services
- WH is trailblazing, but the approach could equally be scaled up to other protected areas
Data, methodology and result note: open data, open technology and open accessibility. Source code and analytics are reproducible. Empower others.
- certainly external pressure to be open, because many others are open
- my personal view is that data etc. should not be held back.
- benefits of transparency: 1) allow comments and healthy debate that lead to better ideas, solutions and outcomes 2) empower others
any device note:
- maximise accessibility, no matter what terminal device is used, to make it easy for users
- mobile first design.
- Knowledge Laboratory note: link knowledge lab
powered by species climate change vulnerability assessments
note:
- based on the work by Foden 2012, GSP.
- reuse their findings and make them relevant for WH
note:
- the concept: is the species sensitive to climate change, are its traits adaptable, and will it be exposed?
- only when a species scores high on all three is it considered climate change vulnerable
- scores are relative, thus they can't be compared across taxa
note:
- infer species within WH sites using the RL
- aggregate all species CCV results within each WH site (see the sketch after this note)
- are the most vulnerable species outside WH sites or inside?
- to what extent do WH sites provide refugia; a high number of CCV species; management responses
- monitoring work: which sites are CCV? In those sites, which species are CCV? What traits lead to their CCV status? What could the management responses be? Delineation based on the future extent of such species?
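A rough pandas sketch of that aggregation step, assuming hypothetical per-species scores joined to WH sites; the column names, the 0.75 cut-off and the "high on all three dimensions" rule are illustrative only (the real scores are relative within each taxon), not the published method.
import pandas as pd

# hypothetical input: one row per species per WH site
# columns: wdpaid, species, sensitivity, low_adaptive_capacity, exposure
df = pd.read_csv('species_ccv_by_site.csv')

# a species is flagged CCV only when it scores high on all three dimensions
high = df[['sensitivity', 'low_adaptive_capacity', 'exposure']] >= 0.75
df['ccv'] = high.all(axis=1)

# aggregate per site: number and proportion of CCV species
per_site = df.groupby('wdpaid')['ccv'].agg(['sum', 'mean', 'count'])
per_site.columns = ['n_ccv_species', 'prop_ccv_species', 'n_species']
print(per_site.sort_values('n_ccv_species', ascending=False).head())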
- Brief report
- Reproducible methodology, analysis and findings
- Result for each natural and mixed World Heritage site note: versioned, reproducible, communication-oriented. First time a data analytics product is version-controlled, open, accessible and scientific. New thinking on delivering and communicating knowledge products through means other than lengthy text
GlobeLand30 - 30 meter resolution
note: first 30-metre global land cover datasets; two time epochs produced with the same methodology, so change can be estimated
From 2000 to 2010
note: what we did - calculate pixel-by-pixel change within each WH site (see the sketch after these notes)
note: validation required. It tells you what changed but not why - another source of potential threat information.
- example: WHO assessment. One could refer to the LCC for any substantial change; if there is one, it signals an alert and, if otherwise unexplained, a possibly damaging event. Forest loss, water body change, amongst others.
- Land Cover change note: first time a comprehensive, systematic land class mapping exercise; first time the dynamics of change were investigated; first time the web was used as a medium to deliver the findings
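A rough NumPy sketch of the pixel-by-pixel change calculation, assuming the two GlobeLand30 epochs have already been clipped to one WH site and exported as aligned arrays (e.g. via arcpy.RasterToNumPyArray); the file names and class codes are placeholders.
import numpy as np

lc2000 = np.load('site_lc_2000.npy')  # land cover class codes, epoch 1
lc2010 = np.load('site_lc_2010.npy')  # epoch 2, same shape and alignment

classes = np.union1d(np.unique(lc2000), np.unique(lc2010))

# cross-tabulate: change[i, j] = pixels moving from class i (2000) to class j (2010)
change = np.zeros((classes.size, classes.size), dtype=np.int64)
for i, c_from in enumerate(classes):
    for j, c_to in enumerate(classes):
        change[i, j] = np.count_nonzero((lc2000 == c_from) & (lc2010 == c_to))

# overall proportion of pixels that changed class
changed = np.count_nonzero(lc2000 != lc2010)
print('changed pixels:', changed, 'of', lc2000.size)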
note: sensing from a distance, on board aircraft and satellites. Valuable, frequent, direct observations of features on the ground, such as..., from a distance. E.g. looking at the infrared spectrum tells a lot about vegetation.
note: archiving is no longer relevant -> no need to host data in house when it is ubiquitous and easily accessible
note: moving away from the original goal frees up time for more interesting work
- time series data; find the best (cloud-free) image in any given past time range; composite the spectra.
- start from an idea, or from information that is reported, and check whether there is any supporting evidence (photographic proof) from remote sensing
note: as it stands, visuals only - immense opportunity and potential not only to visualise better but also to analyse in the cloud.
- change of NDVI (vegetation index) over time (see the sketch after these notes)
- land cover classification on the fly, dynamically (GEE) - telling you what the classes are and how they change
- Landsat 8 for natural World Heritage note: first web-service based product. Dynamic in that as long as new data come in, the maps are automatically updated. Little or no maintenance cost.
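A minimal Google Earth Engine Python API sketch of the NDVI-over-time idea; the site footprint, date ranges, cloud threshold and the Landsat 8 Collection 2 asset ID are placeholders for illustration, not the product's actual implementation.
import ee

ee.Initialize()

# hypothetical site footprint; a real site would use its boundary polygon
site = ee.Geometry.Point([12.35, 41.8]).buffer(10000)

def mean_ndvi(start, end):
    col = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
           .filterBounds(site)
           .filterDate(start, end)
           .filter(ee.Filter.lt('CLOUD_COVER', 20)))
    # median composite of surface reflectance, rescaled to reflectance values
    sr = col.median().select(['SR_B5', 'SR_B4']).multiply(0.0000275).add(-0.2)
    ndvi = sr.normalizedDifference(['SR_B5', 'SR_B4'])
    return ndvi.reduceRegion(ee.Reducer.mean(), site, scale=30).get('nd')

# compare mean NDVI over the site between two periods
print('2014:', mean_ndvi('2014-01-01', '2014-12-31').getInfo())
print('2020:', mean_ndvi('2020-01-01', '2020-12-31').getInfo())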
(Prototype)
note: from a data point of view, identify where the broad gaps are; if a hypothetical site were to be submitted, how would it compare to existing sites?
for now
note: to replicate the desktop system and make it accessible -> enable the wider public to undertake a first screening of their intended sites
replicating the full functionality of the spatial comparative analysis
note: prototype done; communication delayed due to UNESCO; full specification done and the next step is fundraising
- proper spatial analysis
- complete datasets
- improved user experience
- Spatial comparative analysis prototype note: web GIS for the first time; a complete system that takes input from the frontend interface, passes it to an underlying GIS database for analysis, and then returns the result to the web (see the sketch below)
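A minimal sketch of that frontend -> GIS database -> web round trip, using a small Flask endpoint in front of the PostGIS database; the route, connection details and the wh_sites table are illustrative placeholders, not the prototype's actual code.
from flask import Flask, jsonify, request
import psycopg2

app = Flask(__name__)

@app.route('/api/intersecting_sites')
def intersecting_sites():
    # the frontend sends a point; return WH sites whose geometry intersects it
    lon = float(request.args.get('lon'))
    lat = float(request.args.get('lat'))
    conn = psycopg2.connect(dbname='gisdb', user='gis')  # placeholder connection
    cur = conn.cursor()
    cur.execute("""
        SELECT name FROM wh_sites
        WHERE ST_Intersects(geom, ST_SetSRID(ST_MakePoint(%s, %s), 4326));
    """, (lon, lat))
    names = [row[0] for row in cur.fetchall()]
    cur.close()
    conn.close()
    return jsonify(sites=names)

if __name__ == '__main__':
    app.run(debug=True)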
- static site powered by python-pelican (a minimal config sketch below)
- like a blog, but not a blog
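A minimal Pelican configuration sketch for a static, blog-like-but-not-a-blog site; the values are placeholders, and the build command at the bottom is standard Pelican usage.
# pelicanconf.py - minimal settings
AUTHOR = 'Yichuan Shi'
SITENAME = 'Knowledge Lab'      # placeholder name
SITEURL = ''                    # set to the deployed URL for publishing
PATH = 'content'                # Markdown / reST sources live here
TIMEZONE = 'Europe/London'
DEFAULT_LANG = 'en'
DEFAULT_PAGINATION = False      # no paginated blog index

# build the static HTML into ./output:
#   pelican content -o output -s pelicanconf.py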
Yichuan Shi