Marenostrum
go back to [[Main Page]], [[Computational Resources]], [[Clusters]], [[External Resources]]

Early in 2004 the Ministry of Education and Science (Spanish Government), the Generalitat de Catalunya (Catalan Government) and the Technical University of Catalonia (UPC) took the initiative of creating a National Supercomputing Center in Barcelona. BSC-CNS (Barcelona Supercomputing Center – Centro Nacional de Supercomputación) is the national supercomputing facility in Spain and was officially constituted in April 2005. BSC-CNS manages MareNostrum, one of the most powerful supercomputers in Europe ([http://www.top500.org]), located in the Torre Girona chapel. The mission of BSC-CNS is to investigate, develop and manage information technology in order to facilitate scientific progress, with special dedication to areas such as Computational Sciences, Life Sciences and Earth Sciences.

[http://10.0.7.240/wiki/images/files/Marenostrum/BSC.pdf User's Guide Manual]

== Connecting and managing jobs ==

How to connect to your BSC account:
 ssh -X {username}@mn4.bsc.es
(mn1.bsc.es, mn2.bsc.es and mn3.bsc.es are also available as login nodes.) Usually the username is "iciqNNNNN", where the N digits are different for each user.

* To kill a job: '''mncancel <jobid>'''
* To check the status of your jobs: '''mnq'''
* To check a job: '''checkjob <job_id>''' obtains detailed information about a specific job, including the assigned nodes and the possible reasons preventing the job from running.
* To know the estimated time until a job starts executing: '''mnstart <job_id>'''
* To block (hold) a job: '''mnhold -j <job_id>'''
* To release a job, run the same command with the '''-r''' option.
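The job-control commands above can be combined into small helper scripts. The sketch below is illustrative, not an official BSC tool: it assumes `mnq` prints a header line followed by one job per line, with the numeric job ID in the first column. Verify the actual output format on your own account before using anything like this.

```shell
#!/bin/sh
# Illustrative helper: cancel every job currently listed by mnq.
# ASSUMPTION (not verified here): mnq prints a header line, then one
# job per line with the numeric job ID in the first column.

list_job_ids() {
    # Read queue output on stdin, skip the header line,
    # print only first-column fields that look like numeric job IDs.
    awk 'NR > 1 && $1 ~ /^[0-9]+$/ { print $1 }'
}

cancel_all() {
    # Cancel each listed job with mncancel, echoing as we go.
    for id in $(mnq | list_job_ids); do
        echo "Cancelling job $id"
        mncancel "$id"
    done
}
```

The parsing is kept in a separate `list_job_ids` function so it can be checked against sample `mnq` output before `mncancel` is ever invoked.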
* To see your disk-space usage: '''quota -v'''

== ADF ==

* For the '''Carles Bo''' group:

How to check the status of your jobs:
 llme
How to check the status of the entire group's jobs:
 llgr

On the first connection (and only then), remember to set up your environment:
 cat /gpfs/projects/iciq38/BASHRC >> .bashrc

How to submit jobs:
 qs -n <Numbers_Proc> <Input_Name> <time_in_hours>
(for example: qs -n 16 test.in 36h). It seems better to use a multiple of 4 for the number of processors. 36h is the upper limit for now, given the low priority we have.

How to transfer an input file from your local computer:
 scp <input-file> {username}@mn1.bsc.es:'/home/iciq38/{username}/DEST_MN_directory'

WATCH OUT! Unlike on kimik, one should use the "real" ADF input:
 #!/bin/sh
 $ADFBIN/adf << eor
 Here you put your input like you would do on Kimik
 eor
If you need to keep the TAPE21 file, copy it back at the end of the job, namely after "eor" (for example: cp TAPE21 $HOME/example.t21).

== NWChem ==

To submit [[NWCHEM]] calculations on MareNostrum you will need to submit a .cmd file. You can automatically create the cmd file and submit the job by using the script [[Send_mn.sh]]. Copy it into your own bin and call it by typing '''send_mn.sh namefile n XX:XX:XX''', where '''n''' is the number of nodes you would like to use and '''XX:XX:XX''' is the time you will allow for the job to run. Be aware that, apart from the options that appear in send.sh, more options can be set in the cmd file, such as the type of job. NWChem is quite slow, so it is recommended to use at least 64 nodes.

== DLPOLY ==

== GAMESS ==

Last update: March 18, 2009

1 - Available versions:
 24 MAR 2007 (R6): /gpfs/apps/GAMESS/2008-03-19/gamess.00.x
 12 JAN 2009 (R1): /gpfs/apps/GAMESS/2009-03-09/gamess.00.x

2 - Submission script example: paste the following script into a .cmd file and submit it with the usual command:
 mnsubmit gamess_file.cmd

 #!/bin/bash
 #
 # @ initialdir = .
 # @ output = gamess_file.out
 # @ error = gamess_file.err
 # @ total_tasks = <NbProc>
 # @ wall_clock_limit = hh:mm:ss
 
 EXE=/gpfs/apps/GAMESS/2009-03-09/bin/rungms
 INPUT=gamess_file
 
 sl_get_machine_list > node_list$SLURM_JOBID
 rm -f node_list-myri$SLURM_JOBID
 cat node_list$SLURM_JOBID | while read node; do
     echo $node-myrinet1 >> node_list-myri$SLURM_JOBID
 done
 
 ${EXE} ${INPUT} 00 $SLURM_NPROCS node_list-myri$SLURM_JOBID
 
 rm node_list$SLURM_JOBID
 rm node_list-myri$SLURM_JOBID

rungms will run the gamess.00.x executable. If no scratch directory is specified, you can recover your .dat file in the following directory:
 /gpfs/scratch/usergroup/username/tmp/

3 - Manual: the GAMESS manual can be found at http://www.msg.ameslab.gov/GAMESS/documentation.html

Careful: it is possible that some keywords differ between the two versions listed above.

== VASP ==

[[Useful scripts]]

== Molden ==

You can use Molden 4.8 by including /gpfs/apps/MOLDEN/4.8/ in your path (for example: PATH="${PATH}":/home/iciq26/iciq26280/bin/:/gpfs/apps/MOLDEN/4.8/).

== Contact ==

'''BSC Support'''
 e-mail: support@bsc.es
 Fax: 934137721

'''User Support''' David Vicente
 e-mail: david.vicente@bsc.es
 Phone: 934054226

== Links ==

[http://www.bsc.es/ Barcelona Supercomputing Center]
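The GAMESS submission script above turns the plain node list from sl_get_machine_list into a list of Myrinet interface names by appending "-myrinet1" to each hostname. That transformation can be factored into a small function, shown here as a sketch so it can be tested outside the batch system (the "-myrinet1" suffix and file layout are taken from the script above; hostnames in the example are made up):

```shell
#!/bin/sh
# make_myri_list: read plain hostnames on stdin and write one
# "<host>-myrinet1" line per host on stdout, mirroring the loop in
# the GAMESS submission script above.
make_myri_list() {
    while read -r node; do
        echo "${node}-myrinet1"
    done
}

# Intended use inside the submission script (hypothetical file names):
#   sl_get_machine_list | make_myri_list > node_list-myri$SLURM_JOBID
```

Writing it this way avoids the temporary node_list$SLURM_JOBID file entirely, since the hostnames are piped straight through.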