********************************************************************************
* Copyright Notice
* This code is (C) 2006-2024
* Vincent E. Larson and Brian M. Griffin
*
* The distribution of this code and derived works thereof
* should include this notice.
*
* Portions of this code derived from other sources (Hugh Morrison,
* ACM TOMS, Numerical Recipes, et cetera) are the intellectual
* property of their respective authors as noted and are also
* subject to copyright.
********************************************************************************
********************************************************************************
* Overview of the Cloud Layers Unified By Binormals (CLUBB) code
********************************************************************************
For a detailed description of the model code see:
Larson, V. E., 2017: CLUBB-SILHS: A parameterization of subgrid variability
in the atmosphere. https://arxiv.org/pdf/1711.03675.pdf.
Golaz, J.-C., V. E. Larson, and W. R. Cotton, 2002: A PDF-Based Model for
Boundary Layer Clouds. Part I: Method and Model Description. J. Atmos. Sci.,
59, 3540--3551.
Golaz, J.-C., V. E. Larson, and W. R. Cotton, 2002: A PDF-Based Model for
Boundary Layer Clouds. Part II: Model Results. J. Atmos. Sci., 59, 3552--3571.
Larson, V. E. and J.-C. Golaz, 2005: Using Probability Density Functions to
Derive Consistent Closure Relationships among Higher-Order Moments. Mon. Wea.
Rev., 133, 1023--1042.
See also the ./doc/CLUBBeqns.pdf file in the git repository for the CLUBB
model equations and finer details on how the discretization was done.
The single column model executable ("clubb_standalone") runs a particular
case (e.g. BOMEX shallow nonprecipitating cumulus case)
and outputs statistical data in either Network Common Data Form (netCDF)
format or Grid Analysis and Display System (GrADS) format
(if the --grads flag is used with the run script run_scm.bash).
GrADS is both a data file format and a plotting program. See
<http://www.iges.org/grads/> for a description of GrADS. NetCDF is a self
describing data file format that can be read by the GrADS plotting program,
MATLAB, and other scientific programs. See
<http://www.unidata.ucar.edu/software/netcdf/> for a description of netCDF.
A list of software packages that can be used to display netCDF data can be
found at <http://www.unidata.ucar.edu/software/netcdf/software.html>.
The tuner code tunes certain parameters in a one-dimensional boundary layer
cloud parameterization (``CLUBB''), to best fit large-eddy simulation output.
It is not needed to run the single-column model or to use CLUBB in a host model.
The parameterization is called as a subroutine ( run_clubb() ) with
parameter values as input.
The tuner code is highly flexible. One can vary the cases (bomex, fire, arm,
or atex) to match; the variables to match (cloud fraction, liquid water, third
moment of vertical velocity, etc.); the altitude and times over which to match
these variables; and the parameters to tune (C1, beta, etc.).
For more information about the tuner, see section 3.3 below and input_misc/tuner/README.
***********************************************************************
* CLUBB Quick Start Guide *
***********************************************************************
CLUBB is a single-column atmospheric model written in ISO Fortran 2003, and
executed using scripts written in the GNU Bash scripting language.
A script to plot CLUBB output, pyplotgen, has been implemented in Python,
and a script used in compiling CLUBB, mkmf, has been implemented in Perl.
This quick start guide has instructions for checking CLUBB out from the UWM
repositories, compiling CLUBB using the gfortran compiler, running one of the
benchmark test cases, and creating plots.
-------------------------------------------------------
- STEP 1: OBTAINING THE CODE FROM THE REPOSITORY
-------------------------------------------------------
Checking CLUBB out requires the user to be a collaborator on the clubb_release
GitHub repository. Signing up requires a GitHub username. After the required
information is entered on the registration page, an invitation e-mail will be
sent to the address associated with the GitHub account entered.
Please note that each user only needs to sign up once.
With the valid permissions as a collaborator, the CLUBB code can be cloned from its
repository using git. If you have not already downloaded CLUBB,
then at a bash prompt, type in the following command from the directory where
you want the clubb folder to be placed, all on one line:
user@computer ~$ git clone https://github.com/larson-group/clubb_release.git
After entering this command, git will prompt for a username and password.
Use your GitHub credentials. After the credentials are accepted,
git will download the CLUBB source code into a directory named after the
repository (clubb_release/ in the example above), or into the directory given
as the last argument of the git command if one is supplied. The rest of this
guide refers to this directory as the clubb directory. On some computers,
when entering the password, the cursor may remain in the same spot
and the screen may remain blank. Don't be confused; hit return, and
your password will be entered.
-------------------------------------------------------
- STEP 2: COMPILING THE CLUBB SOURCE CODE
-------------------------------------------------------
Before CLUBB can be run, the source code must first be compiled. CLUBB is known
to compile properly with various Fortran compilers, including gfortran,
which is free. The CLUBB code relies on a script called mkmf
which requires your system to have Perl installed.
To compile, change directories to the CLUBB directory (e.g. "clubb"),
and then to the compile directory within the CLUBB directory:
user@computer ~$ cd clubb/compile
Once in the compile directory, run the compile.bash script, using the config
file corresponding to the target platform to run CLUBB on.
For instance, on a Linux machine, to compile with gfortran, use the following
command:
user@computer compile$ ./compile.bash -c config/linux_x86_64_gfortran.bash
To compile on a Mac OS X machine, use the following command:
user@computer compile$ ./compile.bash -c config/macosx_x86_64_gfortran.bash
Note: If CLUBB did not compile correctly (e.g. it compiled with the wrong config)
and you are using a Mac, try installing gnu-getopt instead of bsd-getopt (the
default on Macs); compile.bash expects gnu-getopt.
Another workaround is to change the default compile configuration in compile.bash.
To do this, comment out the config for Linux gfortran and uncomment the config
for MacOS gfortran in compile.bash.
The code should look like this:
# Set using the default config flags
# CONFIG=./config/linux_x86_64_gfortran.bash # Linux (Redhat Enterprise 5 / GNU)
# CONFIG=./config/linux_x86_64_g95_optimize.bash # Linux (Redhat Enterprise 5 g95)
CONFIG=./config/macosx_x86_64_gfortran.bash # MacOS X / GNU
# CONFIG=./config/aix_powerpc_xlf90_bluefire.bash # IBM AIX on Bluefire / XL Fortran
# CONFIG=./config/solaris_generic_oracle.bash # Oracle/Sun Solaris / Oracle/Sun Fortran
(If you want a primer on using the command line shell on Macs, see
https://developer.apple.com/library/mac/documentation/OpenSource/Conceptual/ShellScripting/CommandLInePrimer/CommandLine.html .)
The command line will output information relating to the compilation of CLUBB.
If you change something in the configuration and need to remove the old
executables and object files you can type:
user@computer compile$ ./clean_all.bash
Then recompile the code.
-------------------------------------------------------
- STEP 3: RUNNING A BENCHMARK TEST CASE
-------------------------------------------------------
Once CLUBB is compiled, the various benchmark test cases can be run.
To do this, change directories to the run_scripts directory inside the CLUBB
checkout.
Once in the run_scripts directory, CLUBB is generally run using the
run_scm.bash script. This script will run the CLUBB model for a single case,
which can be specified on the command line. For example, to run the BOMEX case:
user@computer run_scripts$ ./run_scm.bash bomex
To list various run-time options --- such as settings for grid spacing,
time step, stats output, and grads format --- type
user@computer run_scripts$ ./run_scm.bash --help
Other cases that you may run are listed in run_scripts/RUN_CASES. The command
line will output information relating to the case as it runs. In addition,
output from the case will be placed in the output directory inside the CLUBB
checkout.
-------------------------------------------------------
- STEP 4: VIEWING THE RESULTS WITH PYPLOTGEN
-------------------------------------------------------
After a case has been run, the resulting output in netCDF format can be used
to create graphs. One way is to use an included python3 script called pyplotgen.py.
Pyplotgen produces plots that allow CLUBB's output to be easily visualized.
Other options for plotting the output can be found at
http://www.unidata.ucar.edu/software/netcdf/software.html.
**************************************************************************
NOTE:
Required packages to run pyplotgen --- e.g. pillow, netcdf4, and matplotlib ---
are listed in postprocessing/pyplotgen/requirements.txt.
**************************************************************************
The pyplotgen.py script is located in the postprocessing/pyplotgen directory
inside the CLUBB checkout. Pyplotgen is run by calling the pyplotgen.py script
from that directory, after telling it where the CLUBB output directory is and
where the resulting plots should be placed. For detailed instructions and
options, please read postprocessing/pyplotgen/README.md in your CLUBB checkout.
Then modify the ALL_CASES variable in pyplotgen/config/Case_definitions.py to
include the CLUBB case(s) that you'd like to plot (and only those cases).
To generate a webpage containing plots, type (if your output
is in ~/clubb/output and you want the plots to go in directory ~/plots):
user@computer$ cd ~/clubb/postprocessing/pyplotgen
user@computer$ python3 ./pyplotgen.py -c ~/clubb/output -o ~/plots
(Here the tilde, ~, tells the OS to look in the user's home directory.)
The above line will use pyplotgen to create plots from the simulation output
in ~/clubb/output, and place the resulting plots in the ~/plots directory.
pyplotgen will create the ~/plots directory if it does not exist.
If it does exist, instead use
user@computer$ python3 ./pyplotgen.py -r -c ~/clubb/output -o ~/plots
in order to overwrite what's in the ~/plots directory.
If you want to overplot two simulations, move each set of output files to
its own directory, e.g. ~/clubb/output/sim1 and ~/clubb/output/sim2, and
type the command
user@computer$ cd ~/clubb/postprocessing/pyplotgen
user@computer pyplotgen$ python3 ./pyplotgen.py -c ~/clubb/output/sim1 ~/clubb/output/sim2 -o ~/plots
To view the plots, use a web browser to view index.html in the plots directory.
user@computer ~$ firefox ~/plots/index.html
Alternatively, one can use the command "pwd" to find the directory path to
the index.html file, then go to a web browser's URL bar and type "file://",
followed by the path and "/index.html".
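For example (the path below is hypothetical; use whatever pwd prints on your
system):
  user@computer ~$ cd ~/plots && pwd
  /home/user/plots
In this case, the address to type into the browser would be
file:///home/user/plots/index.html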
-------------------------------------------------------
- SUMMARY FOR LINUX
-------------------------------------------------------
To check out, compile, and run CLUBB, and use pyplotgen to view the results,
follow these steps:
- Become a CLUBB collaborator
- Sign up at: https://carson.math.uwm.edu/larson-group/clubb_site/signup
- This only needs to be performed once.
- You must have a valid GitHub account.
- Check out the CLUBB code with git
- git clone https://github.com/larson-group/clubb_release.git
- Compile the CLUBB source code with a Fortran compiler (e.g. gfortran)
- ./compile.bash config/linux_x86_64_gfortran.bash
(in the clubb/compile directory)
(Replace config/linux_x86_64_gfortran.bash to match the desired platform
and compiler. You may need to edit these files for your particular computer
setup if your paths are not the same as ours.)
- Run a benchmark case (e.g. BOMEX)
- ./run_scm.bash bomex
(in the clubb/run_scripts directory)
- Run pyplotgen to create plots of the output from the benchmark case simulation
- python3 ./pyplotgen.py -c ~/clubb/output -o ~/plots
(in the clubb/postprocessing/pyplotgen directory, assuming CLUBB is checked
out to ~/clubb)
- View the pyplotgen plots using a web browser
- firefox ~/plots/index.html
***********************************************************************
* Using the CLUBB Model *
***********************************************************************
%%%%%%%%%%%%%%%%%%%%%%%%%%
%
% CHAPTER 1: COMPILING
%
%%%%%%%%%%%%%%%%%%%%%%%%%%
-----------------------------------------------------------------------
- (1.1) Building (i.e. compiling) everything:
-----------------------------------------------------------------------
The CLUBB code is written in ISO Fortran 2003 and executed by scripts written in
the GNU Bash scripting language.
The mkmf Makefile generating script and some other optional code checking
scripts are written in the Perl scripting language.
On the Microsoft Windows platform the CLUBB parameterization could be configured
and compiled using MSYS or Cygwin with G95, but we have not tested this sort
of configuration.
When compiling CLUBB on a new platform, portability difficulties sometimes
arise, usually from differences between Fortran compilers. When possible,
it may be helpful to look at similar platforms in the compile/config
directory. Another useful troubleshooting technique is to examine the
configuration of other Fortran projects (e.g. WRF) on the same platform. If you
manage to compile CLUBB on a new configuration and would like to share it with
other users, feel free to send us an email with your configuration attached.
We mainly use the GNU Fortran compiler on Intel and AMD x64 processors running
CentOS 6.5. GNU Fortran <http://gcc.gnu.org/fortran/> has been tested
on x64 & x86 GNU/Linux and MacOS X systems. Other configurations that have been
used to compile CLUBB can be found in the compile/config directory.
Older versions of the GNU Fortran compiler (GCC) may not work. The default
version on Redhat Enterprise 5 does not. Install and use gfortran44 instead.
The versions that come with Fedora Core 11 and Ubuntu 8 LTS and later should
be able to compile CLUBB properly.
It is important to note that all these compilers use *incompatible* module
formats for the .mod files! If you want to use different compilers on the
same system, you will need to build a different set of netCDF mod files for
each compiler and use -M or -I flags to specify their location.
In order to get similar results on differing architectures, platforms, and
compilers, initially try a conservative optimization and enable
IEEE-754 standard style floating-point math. On x86 compatible computers
enabling SSE or SSE2 with a compiler flag is usually the best way to do this.
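As an illustration only (this excerpt is not copied from any particular file in
compile/config, and the variable names in your <PLATFORM>.bash may differ), a
gfortran-based config might request conservative optimization and SSE2
floating-point math with flags along these lines:
  # Hypothetical excerpt of a compile/config/<PLATFORM>.bash file
  FFLAGS="-O2 -msse2 -mfpmath=sse"    # conservative optimization, SSE2 floating-point math
  CPPDEFS="-DNETCDF"                  # preprocessor defines (remove -DNETCDF if netCDF is not used)
  LDFLAGS="-llapack -lblas -lnetcdf"  # link LAPACK, BLAS, and netCDF
If the netCDF .mod files for your compiler live in a non-standard location, an
-I<path> flag can be appended to the compiler flags so they can be found.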
Requirements for compiling and running CLUBB:
A. A Fortran 2003 compiler with a complete implementation of the standard.
The compile/config directory contains several scripts for
configurations we have tested.
B. GNU make (we use v3.81).
C. Perl 5 to run the mkmf script (called by compile.bash to create the makefiles).
D. LAPACK & BLAS. These provide the tri and band diagonal matrix solver
subroutines needed by CLUBB. Many vendors provide optimized versions of
these routines, which are typically much faster than the reference BLAS.
To be safe, we recommend you use a LAPACK and BLAS library compiled with the
same compiler you compile the CLUBB code with (e.g. don't compile LAPACK
with GNU Fortran and CLUBB with Portland Group's Fortran).
E. GNU bash, or an equivalent POSIX compliant shell to use the run scripts.
See <http://www.gnu.org/software/bash/>.
Optionally:
F. GrADS for viewing the GrADS output data.
G. netCDF >= v3.5.1; We have not tested our code with anything older.
If you do not use netCDF, you must remove -DNETCDF from the preprocessor
flags, found in the compile/config/<PLATFORM>.bash file, and
remove -lnetcdf from the LDFLAGS.
H. MATLAB, GNU Octave or NCAR graphics for viewing the netCDF output data.
To compile the code, perform the following three steps:
1. $ cd <CLUBB BASE DIRECTORY>/compile
(<CLUBB BASE DIRECTORY> is the directory to which you checked out CLUBB.
Usually it is called "clubb" or some variant.)
2. Edit a ./config/<PLATFORM>.bash file and uncomment the corresponding
line in the file compile.bash. Depending on your platform you may need
to create a new file based on the existing configurations, and add a new
line to compile.bash. Add or uncomment the "source" statement
for your <PLATFORM>.bash in the file ./compile.bash, and comment
out the other "source" statements with a # character.
Alternatively, compile.bash can take in an argument specifying the
config file to use. In this case, the ./config/<PLATFORM>.bash file
would still need to be edited/created according to your platform, but
./compile.bash would not need to be edited at all.
Note that the variables libdir and bindir determine where
your libraries and executables will end up, so make sure you set them
to the correct locations (the default is one directory up).
3. $ ./compile.bash
if you edited compile.bash, or
$ ./compile.bash ./config/<PLATFORM>.bash
if you wish to pass in the config file as a parameter
*** Important Note ***
If you have a linking error related to a symbol with a name like _sisnan or
_disnan you have an older version of LAPACK. You will either need to
upgrade LAPACK, or add -DNO_LAPACK_ISNAN to the CPPDEFS variable in your
./config/<PLATFORM>.bash; note however that we haven't tested this
alternate code with many compilers using high levels of optimization. Certain
assumptions made by high levels of optimization may disable checks in CLUBB
that test for NaN/Undefined variables after solving a matrix if the flag
-DNO_LAPACK_ISNAN is added.
The executables and Makefile will appear in <CLUBB BASE DIRECTORY>/bin
and libraries in <CLUBB BASE DIRECTORY>/lib. The object (.o) and
module (.mod) files will appear in <CLUBB BASE DIRECTORY>/obj.
If you're using GNU make and have a fast parallel machine, parallel builds
should work as well. E.g. for 3 threads, append gmake="gmake -j 3" to the
file source'd from compile.bash.
The mkmf script may or may not generate files that are compatible with
non-GNU versions of make.
If you add a new source file to CLUBB, then in order for mkmf to be
able to find it, you will need to add the filename
to one of the file lists in directory compile/file_list.
When compiling for a tuning run, make sure to include the command-line option
-t/--tuner along with any other option:
$ ./compile.bash -t [other options]
-----------------------------------------------------------------------
- (1.1.1) Promoting real to double precision at compile time
-----------------------------------------------------------------------
Several compilers allows for promotion of real variables without a "kind="
statement to double precision at compile time through the use of a compiler
flag. This will not currently work with the numerical recipes subroutines
referenced by the tuner because they use operator overloading (therefore you can
only use tune_type = 2 when all real variables are promoted).
It also fails to work with Morrison microphysics (clubb:ticket:585#comment:13).
To do this:
1. Do a clean_all.bash if you have object files and programs from a prior compile.
2. Edit compile.bash so that l_double_precision=true.
3. ./compile.bash
To set the precision of CLUBB's internal variables, in the
compile/config .bash file you wish to use, set -DCLUBB_REAL_TYPE=8
for double precision or -DCLUBB_REAL_TYPE=4 for single precision.
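For instance, a hypothetical excerpt of a compile/config/<PLATFORM>.bash file
requesting double precision might read (keep whatever other defines your config
file already sets):
  CPPDEFS="-DNETCDF -DCLUBB_REAL_TYPE=8"   # 8 = double precision; use 4 for single precision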
-----------------------------------------------------------------------
- (1.2) Building (i.e. compiling) for use in a host model:
-----------------------------------------------------------------------
You do not need to build all the components if you have implemented CLUBB
in a large-scale weather or climate model and want to run the combined
model, rather than running CLUBB in standalone (single-column) mode
as described above.
Requirements:
A., B., C., D., & E. as above.
Build:
There are two basic ways of doing this:
-----------------------------------------------------------------------
- Method 1: Build libclubb_param.a and link it to the host model
-----------------------------------------------------------------------
Do 1, 2, & 3 as above. Important Note: The host model, CLUBB, and ancillary
programs such as netCDF and MPI need to be compiled using the same version
of Fortran and with the same compiler flags. Not using the same compiler and
the same flags may cause errors and/or spurious results.
Optionally, you can safely remove everything but libclubb_param.a from the
"all" section of the compile.bash script if you only want to use CLUBB in
a host model.
Then, do
$ ./compile.bash
This will build just the static library and the f90 modules.
The static library will be in <CLUBB BASE DIRECTORY>/lib, while the modules will be
in the <CLUBB BASE DIRECTORY>/obj directory. You will need at least the
clubb_core.mod and constants.mod file to interface with CLUBB.
Addition by Brian:
In addition to the above, you will have to make a reference to the CLUBB
library from the configuration file of the host program. Since CLUBB now uses
the LAPACK libraries, you will also have to make reference to those. Currently,
we do not include the LAPACK libraries with the CLUBB download. You will have
to find them and then download them onto your own computer if they are not
included with your operating system or compiler. Once you have done this, you
can reference them in a line such as the following:
-L/home/griffinb/clubb/lib -lclubb_param -llapack -lblas
If the LAPACK and BLAS libraries were compiled with GNU Fortran, you may
need to link to the runtime libs for that with -lg2c or -lgfortran as well.
Don't forget that you will also have to make reference
to the CLUBB modules. You can reference that with a line
such as the following:
-I/home/griffinb/clubb/obj
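Putting these pieces together, a compile-and-link command for a simple host
program might look roughly like the following sketch (the file name and paths
are illustrative only and must be adjusted to your own checkout and compiler):
  $ gfortran host_model.f90 -I/home/griffinb/clubb/obj \
        -L/home/griffinb/clubb/lib -lclubb_param -llapack -lblas -o host_model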
-----------------------------------------------------------------------
- Method 2: Use a host model's make to compile CLUBB
-----------------------------------------------------------------------
We've used git subtrees to put the CLUBB_core directory into the SAM host
model, and compile CLUBB using the SAM's Build script. Other host models
that utilize B. Eaton's mkDepends and mkSrcFiles could be similarly
configured. The basic method is to put the src/CLUBB_core directory into
the search path used by mkSrcfiles and mkDepends. These will create the
list of files to be used and then CLUBB should compile without a problem.
This tends to be the less complicated solution and allows you to make
changes to the CLUBB parameterization with just one compile step.
There are 3 key caveats with this method, however:
1. The netCDF library still needs to be linked in if -DNETCDF is defined
for compiling CLUBB's source files. If the host model itself uses the
netCDF library, this shouldn't require any modifications to the linking
(this is usually in the LDFLAGS variable for the host model).
2. The LAPACK and BLAS libraries still need to be linked into the host
model application.
3. The CLUBB model has files with a .F90 extension in addition to .f90;
You may need to modify mkSrcfiles if it's not searching for those.
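As a sketch of the basic method described above (the file name "Filepath" is an
assumption based on B. Eaton's build scripts as used in models such as CESM;
your host model's build system may organize its source search path differently),
adding CLUBB_core to the search path can be as simple as appending its directory
to the host model's source path list:
  $ echo "$HOME/clubb/src/CLUBB_core" >> Filepath   # hypothetical path and file name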
-----------------------------------------------------------------------
- (1.2.1) Performance in a host model:
-----------------------------------------------------------------------
There are several key points to reducing the portion of runtime spent
by CLUBB in a host model. These include:
1. Using a fast compiler with flags that work well for your computer.
On UWM's machines, the Intel Fortran and Sun Studio compilers performed
much better than g95. Consult the documentation provided by the compiler
vendor, and try to use options that optimize to your particular processor
and cache. Generally, we recommend against using options that reduce
the precision of calculations, since they may negatively impact the
accuracy of your results.
2. Choose a fast implementation of the Basic Linear Algebra Subroutines
(abbreviated BLAS). This is more crucial for CLUBB than for other models
because CLUBB uses large matrix inversions.
For the LAPACK and BLAS libraries it is best to use the AMD Core Math Library,
Intel Math Kernel Library, or ATLAS BLAS rather than the "reference" BLAS
that typically comes with GNU/Linux systems, because the former will greatly
improve the CLUBB code's runtime.
Typically we've seen better performance with the AMD Core Math Library
using AMD processors and the Intel MKL using Intel processors.
The ATLAS version of BLAS works well with both, since it is tuned to your
specific setup at compile time.
It is probably best to avoid the current Oracle Performance Library for
production simulations, since it appears to have a high OpenMP overhead
when solving the matrices in CLUBB.
On IBM AIX the included library ESSL lacks the band diagonal solvers that
CLUBB uses, so regular LAPACK must be used.
3. If the host model's timestep is less than a minute, then a time-saving option
is to sub-cycle the CLUBB code so that CLUBB is being called at a 60 or 120
second timestep rather than the host model timestep. The CLUBB code
uses a semi-implicit discretization, and should not require a small timestep.
4. In some implementations of CLUBB in a host model, we have enabled the CLUBB
statistics code for a particular horizontal grid column (e.g. output the
first column of the third row on the domain). This is meant to be used
for diagnostic purposes and should be disabled (l_stats = .false.) when
the model is used for production runs. Alternatively, a small number of
CLUBB's variables could be output rather than all of them. That way, not as
many variables would need to be written to the disk.
5. Host models should call set_clubb_debug_level at initialization (see the
sketch after this list). For production simulations, use an argument of 0
rather than 1 or 2. This disables warning messages and associated
diagnostics, and will help speed up the model.
6. When determining the runtime occupied by CLUBB, keep in mind that the
percentage of time spent in the CLUBB code depends on the other processes
that are computed within the host model. For example, in SAM-CLUBB the
percentage of runtime spent in the CLUBB code will be far less if there is a
large number of microphysical fields or tracers, since the host model will
need to advect, diffuse, and apply other processes to each of them. The
total time in CLUBB should be the same regardless of these other processes.
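Regarding point 5 above, a minimal sketch of the call from a host model's
initialization code (assuming, as described above, that set_clubb_debug_level
takes a single integer argument) is:
  ! hypothetical host-model initialization code
  call set_clubb_debug_level( 0 )  ! 0 disables CLUBB warning messages for production runs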
-----------------------------------------------------------------------
- (1.3) Making clean (for re-compiling from scratch)
-----------------------------------------------------------------------
Occasionally, one needs to erase old executables or libraries and re-compile
the code starting with nothing. For instance, this may be required when
a library or compiler is updated.
To delete old object files (*.o), and mod (*.mod) files,
go to <CLUBB BASE DIRECTORY>/bin (where Makefile resides) and type
$ make clean
If this doesn't help, then to additionally delete everything in the binary
and library directories, go to <CLUBB BASE DIRECTORY>/bin and type
$ make distclean
To save time we have a script clean_all.bash in the compile directory that will
also do make distclean without the need to change directories.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%
% CHAPTER 2: EXECUTING BASIC SIMULATIONS
%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
-----------------------------------------------------------------------
- (2.1) Executing a single-column (standalone) run:
-----------------------------------------------------------------------
Before you can execute CLUBB, you must compile it (see Build section
above.)
<CLUBB BASE DIRECTORY> refers to the directory where CLUBB is installed.
<CASE NAME> refers to the cloud case, e.g. arm, atex, bomex, etc.
1. cd <CLUBB BASE DIRECTORY>/input/case_setups
2. Edit <CASE NAME>_model.in for each case you wish to run, or just leave
them as is. This file contains inputs such as model timestep, vertical
grid spacing, options for microphysics and radiation schemes, and so forth.
See the KK_microphys code for a description of the Khairoutdinov and Kogan
drizzle parameterization.
See the BUGSrad description below for a description of the interactive
radiation scheme.
Enabling radiation or microphysics parameterizations may increase runtime
considerably.
3. cd <CLUBB BASE DIRECTORY>/input/stats
Edit a stats file you would like to use, if you would like to output to
disk a variable that is not currently output. A complete list of all
computable statistics is found in all_stats.in. Note that CLUBB now
supports GrADS or netCDF output, but the clubb_tuner can only be used with
GrADS output, due to some issues with buffered I/O.
4. $ cd <CLUBB BASE DIRECTORY>/input/tunable_parameters
Edit tunable_parameters.in if you are an expert and wish to try to optimize
the solution accuracy. The file tunable_parameters.in contains values of the
physical parameters that are embedded in parameterized terms in CLUBB's
equation set. They are neither constants of nature, nor are they configuration
specifications such as paths to input files. They are specified constants
found in the model equation set (for example, C5, C8, etc.; see the CLUBB
model equation set in ./doc/CLUBBeqns.pdf for more detail). The default
parameter values in tunable_parameters.in have been tested extensively and
will work with all the current cases.
5. Within folder input/tunable_parameters, edit configurable_model_flags.in
if you are an expert and you want to try different options within CLUBB.
6. $ cd <CLUBB BASE DIRECTORY>/run_scripts
$ ./run_scm.bash <CASE NAME> or
$ ./run_scm.bash <CASE NAME> -p <PARAMETER FILE> -s <STATS FILE>
The parameter file and stats file are optional arguments. The defaults
are tunable_parameters.in and standard_stats.in.
For a complete list of options type:
$ ./run_scm.bash --help
The resulting data will be written in the directory clubb/output.
-----------------------------------------------------------------------
- (2.2) Explanation of CLUBB's output and input files
-----------------------------------------------------------------------
Nota bene: Our numerical output is usually in GrADS format
(http://www.iges.org/grads/). Each output has a header, or control file
(.ctl), and a data file (.dat). The .ctl file is a text file that
describes the file format, which variables are output in which order, etc.
CLUBB also can output in netCDF (.nc) format.
Output:
------
Generated CLUBB GrADS files (in clubb/output):
bomex_zt.dat, fire_zt.dat, arm_zt.dat, atex_zt.dat, dycoms_zt.dat,
wangara_zt.dat, <case>_zm, <case>_sfc ...
These are the output files generated by CLUBB. Every time CLUBB is run,
these are overwritten, so if you want to prevent them
from being erased be sure to either copy the .ctl and .dat
files to another directory or rename them.
LES GrADS files (available only to larson group members):
les_data/bomex_coamps_sw.ctl, les_data/wangara_rams.ctl
FIRE, BOMEX, ARM & ATEX are some basic benchmark ``datasets'',
simulated by COAMPS, that we compare to CLUBB output. BOMEX is trade-wind
cumulus; FIRE is marine stratocumulus; ARM is continental cumulus; and
ATEX is cumulus under stratocumulus. BOMEX, FIRE, and ATEX are statistically
steady-state; ARM varies over the course of a day.
Input:
-----
Input values of parameters and flags can be set in various namelists, described below.
NOTE: DEFAULT VALUES OF PARAMETERS AND FLAGS SET IN SOURCE CODE MAY BE OVERWRITTEN
BY NAMELIST VALUES!!!
Note that at the beginning of a run the run scripts combine all the namelist
files into one file, named clubb.in, which is then fed into the CLUBB main code.
The namelist files:
input/case_setups/<CASE NAME>_model.in
These files specify the standard CLUBB model parameters. Usually these
do not need to be modified unless a new case is being set up.
input/stats/all_stats.in, nobudgets_stats.in, etc.
These files specify statistics output for each simulation. See
all_stats.in for a complete list of all the variables that can be output.
The surface files:
input/case_setups/<CASE NAME>_surface.in
These files contain the time-dependent surface fluxes and conditions for a case.
The variables that can be included here are Time[s], thlm[K], rt[kg/kg],
latent_ht[W/m^2], sens_ht[W/m^2], CO2[umol/m^2/s], upwp_sfc[(m/s)^2],
vpwp_sfc[(m/s)^2], T_sfc[K], wpthlp_sfc[mK/s], and wpqtp_sfc[(kg/kg)m/s].
A surface.in file must contain the Time variable, but all others are
optional.
Not all cases have a _surface.in file. Some cases compute surface
variables from an equation provided in the case specification. For
these cases, the surface variables are computed within
src/Benchmark_cases/<CASE NAME>.F90, subroutine <CASE_NAME>_sfclyr().
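As an illustration only (the numbers below are placeholders, and the exact
column layout should be copied from an existing case's _surface.in file), such a
file consists of a header row naming the variables present, followed by one row
of values per time:
  Time[s]     latent_ht[W/m^2]   sens_ht[W/m^2]   T_sfc[K]
  0.0         130.0              10.0             299.8
  21600.0     150.0              12.0             300.2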
The forcings files:
input/case_setups/<CASE NAME>_forcings.in
These files contain the time-dependent forcing data for a case.
Not all cases have a _forcings.in file. Some cases compute forcings
variables from an equation provided in the case specification. For
these cases, the forcings variables are computed within
src/Benchmark_cases/<CASE NAME>.F90, subroutine <CASE NAME>_tndcy().
The sounding files:
input/case_setups/<CASE NAME>_sounding.in, <CASE NAME>_sclr_sounding.in,
<CASE NAME>_edsclr_sounding.in, <CASE NAME>_ozone_sounding.in
These files contain the sounding data by level for a case.
All cases need a _sounding.in file. The _sclr_sounding.in, _edsclr_sounding.in,
and _ozone_sounding.in files are optional.
Some of the best-tested cases:
arm
atex
bomex
dycoms2_rf01
dycoms2_rf02_do
dycoms2_rf02_ds
dycoms2_rf02_nd
dycoms2_rf02_so
fire
gabls2
gabls3
mpace_a
mpace_b
rico
wangara
How to create a new case:
The easiest way to create a new case is to copy and then modify one of the
"current" cases shown above. The typical files that need to be modified are the
files mentioned above: namelist file, surface file, forcing file, and
sounding file, as well as the <CASE NAME>.F90 file located in
src/Benchmark_cases.
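For example, a new case (here given the hypothetical name "newcase" and based
on bomex) might be started by copying the existing files and then editing them
for the new specification; the exact set of files to copy depends on the case
you start from:
  $ cd <CLUBB BASE DIRECTORY>
  $ cp input/case_setups/bomex_model.in    input/case_setups/newcase_model.in
  $ cp input/case_setups/bomex_sounding.in input/case_setups/newcase_sounding.in
  $ cp src/Benchmark_cases/bomex.F90       src/Benchmark_cases/newcase.F90
Remember that any new .F90 file must also be added to one of the file lists in
compile/file_list (see Sec. 1.1) so that mkmf can find it.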
The randomization files (only needed for the tuner described below):
run_scripts/generate_seed.bash, input_misc/tuner/rand_seed.dat, bin/int2txt
The script uses intrinsic functionality in the Linux kernel to generate
a pseudo random seed (the .dat) used by the tuner for randomizing initial
parameters. This works on any operating system with a Linux style
/dev/urandom (Solaris, Tru64, etc.) as well. The seed file is now plain text
and can be edited by hand.
-----------------------------------------------------------------------
- (2.3) Plotting output from a single-column run:
-----------------------------------------------------------------------
A plotting script, pyplotgen.py, is contained in the directory postprocessing/pyplotgen.
See postprocessing/pyplotgen/README.md for more information on pyplotgen.py.
Otherwise, you can view the raw CLUBB output files in GrADS or netCDF
format using a plotting program such as GrADS (http://www.iges.org/grads/).
To open a GrADS session, type "grads" at a Linux prompt.
--------------------------------------------------------------------------
- (2.4) Determining whether two CLUBB simulations produce different output:
--------------------------------------------------------------------------
There are 2 major ways to determine whether output files are different; each
will be covered briefly in this section.
1) run_bindiff_all.py script: tests for bit-by-bit equivalence
2) plotting the outputs using pyplotgen: tests for visual agreement
- Comparison using run_bindiff_all.py
This script takes two output directories as arguments and will compare the
contents of those directories to determine if the binary outputs differ.
Note that
run_scm_all.bash must be run prior to this script to create the output to be
compared. The script will compare the .nc files for each
case specified in run_scm_all.bash. This is the most exact way to detect
differences between two output directories.
This script can be found in "clubb/run_scripts".
Usage: ./run_bindiff_all.py output_directory output_directory_to_compare
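For example, to compare a saved set of reference output against a fresh run
(the directory names here are hypothetical):
  user@computer run_scripts$ ./run_bindiff_all.py ~/clubb/output_saved ~/clubb/output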
- Comparison using pyplotgen
This method requires python as the plots are generated through python.
For information on pyplotgen, including detailed usage and options see:
postprocessing/pyplotgen/README.md in your CLUBB checkout.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%
% CHAPTER 3: FANCIER TYPES OF SIMULATIONS
%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
-----------------------------------------------------------------------
- (3.0) Multicolumn Runs:
-----------------------------------------------------------------------
CLUBB has the capability to run with multiple columns, where each
column uses a different parameter set. To perform one of these runs,
you need a file containing a list of the parameters you want to use
for each column, along with an additional namelist containing the
number of columns to run with (ngrdcol) and a logical (l_output_multi_col)
that tells CLUBB whether or not you want the multiple-column data to be saved
to disk.
There is a tool in run_scripts used to generate a multiple-column
parameter set along with this extra namelist; here is an example use:
`create_multi_col_params.py -n 32 -param_file input/tunable_parameters/tunable_parameters.in`
- This will create a new param file with the default name "clubb_params_multi_col.in"
- This file will contain 32 copies of the parameter set passed into it
- It will also contain the namelist setting ngrdcol=32 and l_output_multi_col=.true. (true by default)
- There are other options this script takes, including different parameter generation schemes;
see the script for more details.
Once you have the multicolumn parameter file, you can run clubb with the '-p'
flag to pass in the new parameter set, like so:
`./run_scm.bash -p clubb_params_multi_col.in arm`
- The multiple column data will be saved to output/{casename}_multi_col_zm/zt.nc
- This data is output if `l_output_multi_col` is true, regardless of the
l_stats setting, so one can run clubb with "-e" as well to turn off
stats and still get the multicolumn output, e.g.
`./run_scm.bash -p clubb_params_multi_col.in -e arm`
-----------------------------------------------------------------------
- (3.1) Executing a restart run:
-----------------------------------------------------------------------
After a long simulation has been performed, it is sometimes convenient to
perform a new simulation that starts at some time in the middle or at the end of the original
simulation, rather than wasting time by starting again from the initial time.
The new simulation is then called a "restart" simulation. The restart
simulation is initialized using data saved to disk by the original
simulation at the restart time.
1. Perform the original simulation of case <CASE NAME> and save the GrADS
or netCDF output files in the <CLUBB BASE DIRECTORY>/output directory.
These data files will be accessed to restart the simulation. In order
to reproduce the original simulation exactly, one must save
non-time-averaged output from the original simulation. That is,
in the _model.in file, one must set stats_tsamp to the same value
as stats_tout for the original simulation.
2. Create a subdirectory in the <CLUBB BASE DIRECTORY> called "restart" and
move the GrADS output files to that subdirectory.
3. Edit the following three variables at the end of the &model_setting section of
the model.in file:
l_restart = .true.
restart_path_case = restart/<CASE NAME>
time_restart = initial time of restart run in seconds
Compute time_restart as (time_initial + n_out * stats_tout), where n_out
is the number of output intervals before the restart time (a worked example
follows this list).
4. Execute the run as usual from /run_scripts using
./run_scm.bash <CASE NAME>
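As a worked (hypothetical) example for a bomex run that wrote non-time-averaged
output every stats_tout = 60.0 s starting at time_initial = 0.0 s: to restart
after 60 output intervals, time_restart = 0.0 + 60 * 60.0 = 3600.0 s, and the
entries in the &model_setting section would read roughly as follows (quote the
path the same way as the existing entries in your _model.in file):
  l_restart = .true.
  restart_path_case = "restart/bomex"
  time_restart = 3600.0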
-----------------------------------------------------------------------
- (3.2) Executing a single-column run with fields input from LES:
-----------------------------------------------------------------------
One supported mode of running clubb is to use GrADS or netCDF data from either a
prior CLUBB run or a horizontally averaged set of data from an LES to
set some of the prognosed variables to the data set's values at each timestep.
E.g. If desired, the horizontal winds (variables um and vm in the code)
could be fixed to the COAMPS-LES value at each timestep, while the other
fields will evolve as in the standard single-column run.
Currently, we have only tested the code with data from COAMPS and SAM-LES. Data
from CLUBB also works, but should produce results similar to those obtained
without using input_fields, and so is less useful.
The relevant namelist files are in the input/case_setups/<Model Case>_model.in files.
To execute an input fields run, you first need to set the variable l_input_fields to
.true., and then configure a separate namelist called &setfields, which controls the
data and variables that are read in. First, set the 'datafile' variable in the
&setfields namelist to the location of the data files, and set 'input_type' to either
"clubb", "coamps_les", "sam", or "rams_les" depending on the type of data you want to
read in. Finally, set 'l_input_<varname>' to .true. for those fields for which you
want to use a fixed value from the LES dataset at the beginning of each timestep.
Then, change your directory to run_scripts and execute the run_scm.bash
as you would usually.
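For reference, a minimal sketch of such a &setfields namelist (the values shown
are hypothetical; check an existing <Model Case>_model.in file for the exact
syntax and the full list of l_input_<varname> flags) might look like:
  &setfields
  datafile = "../les_data/bomex_coamps_sw.ctl"  ! location of the LES data (path hypothetical)
  input_type = "coamps_les"                     ! "clubb", "coamps_les", "sam", or "rams_les"
  l_input_um = .true.                           ! fix um to the LES value each timestep
  l_input_vm = .true.                           ! fix vm to the LES value each timestep
  /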
Nota bene: The GrADS data files cannot have a time increment less than 1mn.
Therefore, when a file is output by CLUBB with a stats_tout of less than
60 seconds, the code will simply round up, and the resulting GrADS data file
cannot be used for an input fields simulation. Consequently, when
l_input_fields is true, always use GrADS data output at 1mn increments or
greater.
Note the input fields code will also work for tuning runs, if the tuner is
configured to use a <Case Name>_model.in with the input fields options
enabled.
-----------------------------------------------------------------------
- (3.3) Executing a tuning run:
-----------------------------------------------------------------------
The "tuner" code is used to optimize CLUBB's parameters in order to better match
output from a 3D large-eddy simulation (LES) model. The default optimization
technique is the downhill-simplex method of Nelder and Mead, as implemented in
Numerical Recipes In Fortran 90, 3rd Ed. (amoeba.f90). You will either need special
access to the CLUBB repository, or your own license from Numerical Recipes to
use the default algorithm. In the latter case, the files that need to be placed in
the directory src/Numerical_recipes are:
amebsa.f90 nr.f90 nrutil.f90 ran1_v.f90
amoeba.f90 nrtype.f90 ran1_s.f90 ran_state.f90
You may need to separate out any USE statements delimited by a semicolon in
order to make the files work properly with the mkmf script.
For a tuning run, CLUBB must be compiled as described in Chapter 1 with one exception:
it is advised to use the command-line option -t/--tuner when calling the compile.bash script
in order to avoid crashes while tuning:
$ ./compile.bash -t [other options]
Using this option will set a compiler flag that enables additional checks needed to avoid
floating-point issues when running CLUBB with unusual parameter sets.
In general, this change will likely not be bit-changing since it will only affect the output
in extreme cases.
Do steps 1, 2, & 3 as outlined in the description of a standalone run (Sec 2.1).
4. Edit input_misc/tuner/error_<CASE NAME>.in or select an existing one. Note that
there are two tuning subroutines, specified by tune_type in the
error_<CASE NAME>.in /stats/ namelist.
If tune_type = 0, then the amoeba subroutine, which implements the downhill
simplex algorithm, will be used. If tune_type is 1, then amebsa,
a variant of amoeba which uses simulated annealing instead, is used. A complete
explanation of these minimization algorithms can be found
in "Numerical Recipes" by Press et al.
The current default is tune_type = iesa = 2 and l_esa_siarry = .false.,
which runs a variant of simulated annealing developed by Gunther Huebler.
The parameters whose values are adjusted by the tuner
are the non-zero entries in namelist initmax.
Sometimes the names of the optimized variables in the CLUBB and LES output
will differ. Note that when tuning against netCDF data, the file will
need to have a .nc extension for the clubb_tuner to correctly identify
the file as being in netCDF format.
For more information on the tuning parameters go to input_misc/tuner/README.
5. You may also wish to set the debug_level to 0 in the file
input/case_setups/<CASE_NAME>_model.in to speed up tuning.
Doing this will also help make the output match the nightly tuning run.
You may need to set stats_tout to 60.0 in order to match the
time interval of LES output data. The tuner also supports
setting l_input_fields to .true. for the purpose of `fixing' variables such
as um or vm in order to isolate model errors.
6. Edit run_scripts/run_tuner.bash to use your namelists. Note that run_tuner.bash uses
a customized stats file for tuning and a (possibly different) stats file
to run CLUBB with the optimized parameters. The stats file used while
tuning is input/stats/tuning_stats.in and contains only the names of the
variables being tuned for; this speeds up the tuning process. Therefore,
you must also edit tuning_stats.in to match the variables being tuned for
in the error_<CASE_NAME>.in file.
7. Ensure that the directory /home/pub/les_and_clubb_benchmark_runs exists on the
machine you are using. (All group computers at UWM should have this directory.)
This directory should contain the benchmark LES output that CLUBB is trying
to match. Using this output, along with setting debug_level to 0,
will produce identical output to the nightly tests tuning run.
8. Go to the run_scripts directory and execute ./run_tuner.bash
NOTE: In previous versions the run_tuner.bash script executed pre-tuning CLUBB standalone runs.
Those runs are now disabled by default and can be re-enabled by adding the
-i/--initial-output option when executing run_tuner.bash.
-----------------------------------------------------------------------
- (3.3.1) Creating a RAM disk (optional)
-----------------------------------------------------------------------
One means of speeding up tuning runs is reducing the time spent writing
to the hard disk. Most operating systems support a virtual device called
a ram disk, which is main memory that has been allocated to act as an emulated
file system. Note that you will need system privileges to make the ram disk,
and files copied to the ram disk are not preserved when the computer is
powered off.
Generally:
1. Create and mount RAM disk on "output"
2. Run tuner
Linux Example
Note that you will need ram disk support compiled into your kernel, which is
typically the default on most systems. Linux appears to be less flexible
about when you are allowed to change the ramdisk size.
1. In grub.conf
Append to 'kernel' line:
kernel /vmlinuz-2.4.21-40.EL ro root=LABEL=/ ramdisk_size=262144
This sets ram disks to be 256 megabytes in size. Note that your own system may
already have other options besides the ramdisk_size on this line.
2. $ mkfs.ext2 /dev/ram0
3. $ mount /dev/ram0 /home/dschanen/clubb/output
4. $ cd run_scripts
(Run your job)
Solaris Example
Note that these instructions only apply to Solaris 9 & 10
1. $ ramdiskadm -a clubb 256m
Creates a virtual disk clubb that is 256 megabytes in size.
2. $ newfs /dev/ramdisk/clubb