Tuesday, January 18, 2022

Running a C++-based Rivet Monte Carlo routine on a cluster

I am running a C++ routine that writes a text file containing the four-momentum of a physics particle. On my local machine it works perfectly and writes the text file into the current directory. When I run the routine on the cluster, however, I do not get the text file as output. After the run I only find files in two directories: yoda (containing the information needed to plot histograms) and logs (holding the submit.err and submit.out files). I specify these two directories in a submit.sh file as follows:

#! /bin/bash

export JO=$1
export DATASET=$2
export WORKDIR=/beegfs/user/standalone/

# Quote "$3" so the test does not break when the argument is absent
if [ -z "$3" ]; then
export RUN=`echo $DATASET | awk -F. '{print $2}'`
else
export RUN=`echo $DATASET | awk -F. '{print $1}'`
fi

sbatch -o ${WORKDIR}/logs/submit_${RUN}.out -e ${WORKDIR}/logs/submit_${RUN}.err \
       -p normal --job-name Rivet_${RUN} \
       --export=WORKDIR,RUN,JO,DATASET run_rivet_onnode.sh

My question is: how can I configure this submit.sh so that the file I create and write at runtime with ofstream appears in the logs directory after the run?
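One likely explanation, sketched below under assumptions (the routine's file name and the wrapper script's contents are not shown in the post): std::ofstream with a relative path writes into the batch job's working directory, which on the cluster is generally not ${WORKDIR}/logs. A minimal fix is to have the wrapper script change into the logs directory before launching the routine. The snippet simulates this with a temporary directory standing in for /beegfs/user/standalone, and a hypothetical file name four_momentum.txt standing in for whatever the routine actually writes:

```shell
#! /bin/bash
set -e
# Stand-in for /beegfs/user/standalone on the cluster (assumption: any
# writable directory behaves the same for this purpose).
WORKDIR=$(mktemp -d)
mkdir -p "${WORKDIR}/logs"

# run_rivet_onnode.sh could do this cd before invoking the C++ routine,
# so that relative paths resolve inside logs/.
cd "${WORKDIR}/logs"

# This echo plays the role of std::ofstream("four_momentum.txt") in the
# routine: a relative path, resolved against the current directory.
echo "1.0 2.0 3.0 4.0" > four_momentum.txt

# The file now sits where the question wants it: in ${WORKDIR}/logs.
ls -l "${WORKDIR}/logs/four_momentum.txt"
```

Alternatively, sbatch accepts -D/--chdir=<dir> to set the job's working directory, so adding `-D ${WORKDIR}/logs` to the sbatch line in submit.sh should achieve the same without touching the C++ code.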
