Sample And Index
On this page we document how to:
- Sample occurrences against layers in biocache (add sampling data to Cassandra).
- Index biocache sampling data in a new biocache SOLR core (to be swapped in later).
We'll sample occurrence data against some previously added layers and then index the results.
- Check that the biocache-store configuration points to the sampling URL:
  - `cd /data/biocache/config`
  - `more biocache-config.properties`
  - Look for `spatial.layers.url=http://spatial.l-a.site/ws/fields`
  - Alternatively, connect with ssh to the livingatlas-demo server you are using and issue the command `biocache config | grep -e ".*layers.*url"`
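As a quick sanity check, grepping the properties file directly (assuming the default `/data/biocache/config` location used above) should print the property; the expected output is shown on the second line:

```
$ grep "spatial.layers.url" /data/biocache/config/biocache-config.properties
spatial.layers.url=http://spatial.l-a.site/ws/fields
```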
- Load a DwCA into the collectory. If you are just testing, choose a small dataset (<50k records) for speed, preferably Mammals (as this affects a later taxonomy step in this documentation).
- For IPT users:
  - Start here: http://collections.l-a.site/admin/
  - Create a data provider and point it at your IPT instance by setting the website URL to the IPT URL, e.g. https://ipt.gbif.es
  - Click the “Update data resources” button
  - Note: check the unique fields. Typical values are `catalogNumber` or `occurrenceID`. The default is `catalogNumber`.
  - Find a UID, e.g. `dr123`, to load
- For Non-IPT users:
  - Start here: http://collections.l-a.site/admin/
  - Create a data resource
  - Upload your DwCA
  - Note: check the unique fields. Typical values are `catalogNumber` or `occurrenceID`. The default is `catalogNumber`.
- For IPT users:
  - Load the DwCA into the biocache using the command line tool
    - Use the command `biocache load dr123`
    - Validate that the data has been loaded using the Cassandra command line tool `cqlsh` (a minimal session is sketched below):
      - Connect to the `occ` keyspace using `use occ;`
      - Run `select * from occ;`
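A minimal `cqlsh` session for this check might look like the following; the `limit 10` clause is an optional addition here, just to keep the output short:

```
$ cqlsh
cqlsh> use occ;
cqlsh:occ> select * from occ limit 10;
```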
- Process the data resource: use the command `biocache process -dr dr123`
- Sample the data resource: use the command `biocache sample -dr dr123`
- Index the data resource: use the command `biocache index -dr dr123`
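Taken together, the load/process/sample/index sequence for a single data resource can be run as one shell session; `dr123` is a placeholder UID, so substitute the UID of the data resource you registered in the collectory:

```
# Placeholder UID: replace dr123 with your own data resource UID
biocache load dr123
biocache process -dr dr123
biocache sample -dr dr123
biocache index -dr dr123
```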
- Test that the indexing was successful by:
  - Viewing the SOLR admin console: http://index.l-a.site:8983 (see the solr admin interface page for tips on accessing this)
  - Viewing the results in the biocache services
  - Testing with an Area Report in the Spatial Portal:
    - Search for a Gazetteer Polygon, e.g. “Queensland”
    - Tools > Area Report, then follow the wizard

Successful sampling/indexing depends upon specific details being properly applied in sequence. It is easy to miss or bungle a step, and it can be hard to tell which one caused a problem. To avoid those headaches, testing the outcome of each step helps with troubleshooting. Some background information about system configuration also provides an overview of the process and will hopefully help to debug issues as they arise. To that end, a synopsis:
- Sampling takes each occurrence having geospatial data (lat, lng) in biocache (Cassandra) and references it against each properly-configured layer. The outcome of successful sampling for a single occurrence is a populated value in biocache (Cassandra) for the column `cl_p` in the table `occ`. The value of `cl_p` will look something like this (from a Cassandra query):
  `cl_p | {"cl10001":"England", "cl10002":"North Yorkshire", "cl10003":"Beast Cliff", "cl10004":"OV 0000"}`
  where JSON keys like "cl10001" are layers' field IDs, which you identified when you configured layers in the spatial portal. You can view field IDs directly with e.g. https://spatial.l-a.site/ws/manageLayers/field/cl10001
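To spot-check that sampling has populated `cl_p`, a minimal `cqlsh` query along these lines should return rows like the example above (the `limit 5` is only there to keep the output small):

```
cqlsh> use occ;
cqlsh:occ> select cl_p from occ limit 5;
```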
- Indexing (of biocache) does more than just the indexing of sampled layers, but for the purposes of this discussion of sampled layers, indexing duplicates the biocache (Cassandra) sampling data in a biocache (Solr) index. The outcome of successful indexing (of biocache) for sampled layers will look something like this (from a Solr query):
  ..., "cl10001":"England", "cl10002":"North Yorkshire", "cl10003":"Beast Cliff", "cl10004":"OV 0000", ...,
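One way to confirm this from the command line is a direct Solr query against the biocache core; the host and port come from the admin console above, but the core name `biocache` is an assumption, so check the actual core name in the Solr admin console first:

```
# Return a few indexed documents together with their sampled layer fields
curl "http://index.l-a.site:8983/solr/biocache/select?q=cl10001:*&fl=id,cl10001,cl10002,cl10003,cl10004&rows=5&wt=json"
```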