Update availability of HC-LE data #75
Hi @p-a-s-c-a-l, the status:
- We noticed that about 20 cities had missing layers, so we fixed the scripts and reloaded them all into a new DDBB.
- The DDBB with the new data is already loaded on our server; we will update the GeoServer today by EOB.
- We also created a script which loads all the cities that meet the data requirements for being calculated. This step is needed for the following steps.
- We are preparing another VM to parallelise the process and load the cities faster. We also prepared a new, empty database containing only the data needed to run all the scripts on different VMs at the same time.
- AIT offered a VM for running the scripts as well, so we put a copy of our VM on the Atos FTP server. Once that VM is up and running we will have three different VMs loading data at the same time, which will speed up the loading process considerably.
I have asked for the server at AIT. Please:
1) Remind me of the specs for the server.
2) Explain how to start the scripts once we have a server.
thx
Denis
Hi @DenoBeno, the server requirements are the following: in order to run, you will need to install the Xeon virtual management software. As @maesbri said in the mail, it would be nice if we could have access to the server to run the scripts and do the proper setup. If that is not possible, I will send you a detailed guide by mail on how to run all the processes.
Did you update the GeoServer DB, @DanielRodera? This is the current status. It doesn't look different from this one:
Shall I (or @therter) now run the republish script on DEV and check how many cities will be available there?
Yes, you can start the script, so that we can check whether emikat can do the calculations for the new cities.
I tried to execute the script on DEV, but it fails with an error. The script on DEV seems to be the same as the most recent version in the Git repo (2 months old). Is there maybe a newer version of the script somewhere, or do I need to run anything else before running the republish script?
This is the newest version. It seems that shp2pgsql is not installed within the postgis docker container since the last image update. I will take a look at it.
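A quick way to confirm that, assuming the container is simply named `postgis` (the name is an assumption):

```bash
# Check whether shp2pgsql is available inside the postgis container.
docker exec postgis bash -c 'command -v shp2pgsql || echo "shp2pgsql is not installed"'
```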
The new cities are now in the dev instance. There were two problems after the last image update.

First, the Drupal database was not updated and tried to use the postgis-2.5.so library. Normally, this can be fixed with the following command (see here):
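Presumably this refers to the upgrade helper bundled with the official postgis image; a sketch, with the container name `postgis` assumed:

```bash
# Run the image's bundled upgrade helper, which upgrades the PostGIS extensions
# (ALTER EXTENSION ... UPDATE) in the cluster's databases.
# The container name "postgis" is an assumption.
docker exec postgis update-postgis.sh
```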
But in this case, the update-postgis.sh script also tried to use the old library, so within the postgis container, a symbolic link to the existing postgis-3.so library had to be created.
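Roughly along these lines; the PostgreSQL major version and the library paths inside the container are assumptions:

```bash
# Make the old library name resolve to the installed postgis-3 library.
# Paths assume a Debian-based image with PostgreSQL 12; adjust to the actual layout.
docker exec postgis bash -c \
  'ln -s /usr/lib/postgresql/12/lib/postgis-3.so /usr/lib/postgresql/12/lib/postgis-2.5.so'
```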
Second, the tool shp2pgsql was not installed. The new Docker image does not install the PostGIS client tools, so I had to install them with the following commands:
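On a Debian-based postgis image this boils down to installing the `postgis` package, which ships shp2pgsql; the container name is again an assumption:

```bash
# Install the PostGIS client utilities (shp2pgsql, pgsql2shp, ...) inside the container.
docker exec postgis bash -c 'apt-get update && apt-get install -y postgis'
```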
After this, it was possible to execute the republish script.
@therter When we recreate the container, your customisations are lost. Therefore we have to create a custom
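If a custom Docker image is what is meant here, a minimal, hypothetical sketch could look like this (the base tag and package list are assumptions, not the actual setup):

```dockerfile
# Hypothetical: bake the PostGIS client tools into the image so the fix
# survives container recreation. Pin the base tag to whatever DEV actually runs.
FROM postgis/postgis:12-3.0
RUN apt-get update \
 && apt-get install -y --no-install-recommends postgis \
 && rm -rf /var/lib/apt/lists/*
```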
@DenoBeno Please keep in mind that once the calculation is finished, somebody has to update the respective open data sets we've uploaded to Zenodo (see Publish Hazard Local Effects Input Layers as OpenData). I tried to make an intermediate update of the datasets today, but I was not able to download the SHP files. Furthermore, the Local Effects Datasets GeoServer URIs have to be updated in CKAN.
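For the CKAN part, the GeoServer URIs of the affected resources could be patched via CKAN's action API; the host, API key, and ids below are placeholders, not real values:

```bash
# Update the URL of one dataset resource in CKAN (repeat per affected resource).
curl -X POST "https://ckan.example.org/api/3/action/resource_patch" \
  -H "Authorization: $CKAN_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"id": "<resource-id>", "url": "<new-geoserver-uri>"}'
```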
We've recently synchronised DEV -> PROD CSIS and updated the list of supported cities from the information retrieved from geoserver.myclimateservice.eu. This is the current situation regarding the availability of cities' HC-LE input data for screening:
As @DenoBeno mentioned, we have to urgently update the information on data availability for the heat wave local effect and the pluvial flooding local effect ("cities layer").
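For cross-checking, the layers actually published on geoserver.myclimateservice.eu can be listed via the GeoServer REST API; the context path and credentials below are assumptions:

```bash
# List all published layers to see which cities' HC-LE layers are really available.
curl -s -u "$GEOSERVER_USER:$GEOSERVER_PASS" \
  "https://geoserver.myclimateservice.eu/geoserver/rest/layers.json"
```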
Moreover, we must not forget to publish the Hazard Local Effects Input Layers as OpenData, as announced in the Data Management Plan. But this can only be done once all layers have been loaded. Can you give an estimate of when this is expected to be completed, @DanielRodera?