Authors: (not listed)
Owners: This resource does not have an owner who is an active HydroShare user. Contact CUAHSI (help@cuahsi.org) for information on this resource.
Type: Resource
Storage: 1.9 MB
Created: Aug 31, 2022 at 3:43 p.m.
Last updated: Nov 08, 2023 at 8:41 p.m.
Citation: See how to cite this resource
Sharing Status: Public
Views: 1472
Downloads: 76
Abstract
This notebook demonstrates the setup of a typical WRF-Hydro model on HydroShare, leveraging different tools and services throughout the entire end-to-end modeling workflow. The notebook is designed so that the user/modeler can retrieve only the datasets relevant to a user-defined spatial domain (for example, a watershed of interest) and time domain, using a graphical user interface (GUI) linked to HPC. To help users submit a job on HPC to run the model, the GUI abstracts away the details and complexities of HPC use, such as authorization, authentication, job monitoring and scheduling, data and job management, and transferring data back and forth. Users interact with this GUI to perform their modeling work. The GUI allows the user/modeler to 1) select the remote server where the HPC job will run, 2) upload the simulation directory containing the configuration files, 3) specify the HPC job parameters the user is allowed to utilize, 4) set parameters related to model compilation, 5) follow up on the status of the submitted job, and 6) retrieve the model output files back to the local workspace. Once the model execution is complete, users can easily access the model outputs on HPC and retrieve them to the local workspace for visualization and analysis.
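The six-step submission workflow described above can be pictured roughly as follows. This is a minimal illustrative sketch only: the `HPCJobClient` class, its methods, and the server name are hypothetical stand-ins for whatever submission library and HPC endpoint the notebook actually uses.

```python
# Illustrative sketch of the six-step HPC submission workflow from the abstract.
# HPCJobClient and its methods are hypothetical placeholders, not the notebook's API.
from dataclasses import dataclass, field
from pathlib import Path


@dataclass
class HPCJobClient:
    """Hypothetical client that hides HPC authentication, scheduling, and transfer."""
    server: str                                   # 1) remote server where the HPC job will run
    job_params: dict = field(default_factory=dict)
    compile_params: dict = field(default_factory=dict)

    def upload(self, sim_dir: Path) -> None:
        # 2) upload the simulation directory containing the configuration files
        print(f"uploading {sim_dir} to {self.server}")

    def submit(self) -> str:
        # 3) and 4) submit the job with the requested job and compilation parameters
        print(f"submitting job with {self.job_params} and {self.compile_params}")
        return "job-0001"

    def status(self, job_id: str) -> str:
        # 5) follow up on the status of the submitted job
        return "COMPLETED"

    def download_outputs(self, job_id: str, dest: Path) -> None:
        # 6) retrieve the model output files back to the local workspace
        print(f"copying outputs of {job_id} into {dest}")


client = HPCJobClient(
    server="hpc.example.edu",                     # hypothetical HPC endpoint
    job_params={"nodes": 2, "walltime": "02:00:00"},
    compile_params={"compiler": "gfortran"},
)
client.upload(Path("wrf_hydro_simulation"))
job_id = client.submit()
if client.status(job_id) == "COMPLETED":
    client.download_outputs(job_id, Path("outputs"))
```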
Subject Keywords
Content
How to Cite
This resource is shared under the Creative Commons Attribution CC BY 4.0 license.
http://creativecommons.org/licenses/by/4.0/