Hello!
I am encountering segmentation faults every time I attempt to run the Simple Test Case provided by NCEP (https://ftp.emc.ncep.noaa.gov/EIB/UFS/simple-test-case.tar.gz), and I am unsure how to troubleshoot the issue. I have reconfigured the job several times from the original sbatch (SLURM) submission script to rule out out-of-memory problems, but the crash persists and always occurs at the same point early in the run.
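For reference, this is the general shape of the submission script I have been adjusting between runs (the job name, node counts, and executable path below are placeholders, not my exact values):

```shell
#!/bin/bash
#SBATCH --job-name=ufs-simple-test
#SBATCH --nodes=4                  # varied between attempts
#SBATCH --ntasks-per-node=24       # varied between attempts
#SBATCH --mem=0                    # request all memory on each node, to rule out OOM
#SBATCH --time=01:00:00
#SBATCH --output=ufs_test.%j.out
#SBATCH --error=ufs_test.%j.err

# placeholder executable name
srun ./ufs_weather_model
```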
For background, I built all dependencies and the UFS weather model itself from source using the UFS v1.1.0 GitHub releases. This is the first test run I have attempted since compiling everything. Building from source was necessary because I am running the model on our university-owned supercomputer, which runs CentOS 8. All dependencies were built with GCC 10.1.0 from the source code in the GitHub repos, with the exception of CMake (3.17.3) and Open MPI (4.0.4), which were already installed on the system.
All components (NCEPLIBS-external, NCEPLIBS, ufs-weather-model) compiled with GCC 10.1.0 without errors, aside from routine compiler warnings and notes. I also verified several times that each component was linking against the libraries I compiled from source, not other versions that may already be on the system, to make sure the same compilers and libraries were used throughout the build.
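In case the method matters, this is roughly how I checked the linkage (the library names I grep for and the install prefix are illustrative, not my exact paths):

```shell
# list the shared libraries the executable resolves at load time, and check
# that the matches point into my local install prefix rather than system copies
ldd ./ufs_weather_model | grep -i -E 'netcdf|hdf5|esmf'

# also confirm the job environment resolves the same libraries under SLURM
echo "$LD_LIBRARY_PATH"
```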
Attached are the console output, console error, and the UFS-generated logfile from the simple test case run showing the segmentation fault. The backtrace seems to indicate that the problem may be occurring inside ESMF? Also, at the start of the run there is this line: "Error: coll_hcoll_module.c:301 - mca_coll_hcoll_comm_query() Hcol library init failed". A web search suggests this is related to the UCX init failure also listed, but I'm unfamiliar with UCX and unsure whether it is connected to the segmentation fault.
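To try to narrow this down, my next idea was to re-run with HCOLL and UCX disabled and see whether the segfault goes away. My understanding (please correct me if I'm wrong) is that Open MPI exposes MCA parameters for this, settable as environment variables in the job script:

```shell
# disable the hcoll collectives component
export OMPI_MCA_coll_hcoll_enable=0

# fall back to the ob1 point-to-point layer instead of UCX
export OMPI_MCA_pml=ob1

# placeholder executable name, as above
srun ./ufs_weather_model
```

Is that a reasonable way to rule hcoll/UCX in or out, or is the init failure likely just noise?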
Any help is appreciated, and thank you in advance.
Tim