MadGraph + MadSpin: generate two processes in one file

Asked by Marzieh Bahmani

Hello,

I want to generate pp > mu n1 [QCD] with MadGraph and
decay it with MadSpin such that the MadSpin decay is selected based on the sign of the lepton generated by MadGraph in the previous step.

for example:

import model SM_HeavyN_Meson_NLO
        define p = u c d s u~ c~ d~ s~ g a
        define j = p
        generate p p > n1 mu aS=0 aEW=4 [QCD]

then, using MadSpin, if mu == mu+:
      decay n1 > pi+ mu-
else:
      decay n1 > pi- mu+

Is it possible to do so?
This way I would have all lepton-number-conserving events in one sample.
many thanks

Cheers,

Marzieh

Question information

Language: English
Status: Answered
For: MadGraph5_aMC@NLO
Assignee: No assignee
Olivier Mattelaer (olivier-mattelaer) said :
#1

Did you try the experimental feature of MG5aMC for 3.5.0:
https://cp3.irmp.ucl.ac.be/projects/madgraph/wiki/MadSpin

I think I have only tested this for LO processes, but this is a good opportunity to test whether it works with NLO events.
import model SM_HeavyN_Meson_NLO
        define p = u c d s u~ c~ d~ s~ g a
        define j = p
        generate p p > n1 mu+ QCD=0 [QCD] @0
        add process p p > n1 mu- QCD=0 [QCD] @1
output

and then in madspin_card
decay n1 > pi+ mu- @0
decay n1 > pi- mu+ @1

Cheers,

Olivier
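For reference, Olivier's suggestion put together as one session could look like the sketch below (the output directory name is hypothetical, and `set spinmode full` just makes the MadSpin default explicit; untested with this model):

```
# proc_card: one sample containing both lepton signs, labelled with @0 / @1
import model SM_HeavyN_Meson_NLO
define p = u c d s u~ c~ d~ s~ g a
define j = p
generate p p > n1 mu+ QCD=0 [QCD] @0
add process p p > n1 mu- QCD=0 [QCD] @1
output pp_n1mu_nlo        # hypothetical directory name

# madspin_card: decays matched to the production process by the @ label
set spinmode full         # the default mode, written out for clarity
decay n1 > pi+ mu- @0
decay n1 > pi- mu+ @1
```

The `@N` labels are what tie each `decay` line to the corresponding production subprocess, which is the experimental 3.5.0 feature referenced above.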

Marzieh Bahmani (marziehbahmani) said :
#2

Hi Olivier

I tried this syntax :

generate p p > mu+ n1 [QCD] @0
 add process p p > mu- n1 [QCD] @1
....
set spinmode onshell
...
decay n1 > mu- pi+ @0
decay n1 > mu+ pi- @1

it fails with:

InvalidCmd : No particle @1 in model

cheers,

Marzieh

Olivier Mattelaer (olivier-mattelaer) said :
#3

Ok, I will take a look this week.

Because I'm curious: why do you use the onshell mode in this case? (I suspect that this is the issue here.)

Olivier

Marzieh Bahmani (marziehbahmani) said :
#4

Hi Olivier

Because we need the n1 decay to be exactly on-shell, and this ensures full spin correlation of its decay products.

Cheers,

Marzieh

Richard Ruiz (rruiz) said :
#5

Hi Marzieh,

Olivier reminded me just now that since this is a 1 > 2 body decay, madspin=on can be used. madspin=onshell is *not* necessary.

This is my fault. I told you the incorrect thing previously.

best,
richard
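For concreteness, the toggle Richard mentions is set at launch time rather than in the madspin_card (a sketch; the exact prompt text varies between versions):

```
launch pp_n1mu_nlo        # hypothetical output directory name
  madspin=ON              # default spin-correlation mode, suitable for a 1 > 2 decay
  # MadSpin then reads the decay lines from Cards/madspin_card.dat
```

With `madspin=ON` no `set spinmode` line is needed in the madspin_card; `onshell` is only required for configurations the default mode cannot handle.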

Marzieh Bahmani (marziehbahmani) said :
#6

Hi

I tried madspin=on with version 3.3.1; it didn't work:
InvalidCmd : spinmode can only take one of those 3 value: full/onshell/none
So if I don't want to use this feature, https://cp3.irmp.ucl.ac.be/projects/madgraph/wiki/MadSpin
then is using madspin=onshell OK?

Anyway, for this feature I used version 3.5.0 with madspin=on.

At some point the job got stuck in a loop.
This is part of the output:

03:31:17 INFO: Compiling directories...
03:31:17 INFO: Compiling on 1 cores
03:31:17 INFO: Compiling P0_udx_n1mup...
03:33:06 INFO: P0_udx_n1mup done.
03:33:06 INFO: Compiling P0_usx_n1mup...
03:34:42 INFO: P0_usx_n1mup done.
03:34:42 INFO: Compiling P0_cdx_n1mup...
03:36:45 INFO: P0_cdx_n1mup done.
03:36:45 INFO: Compiling P0_csx_n1mup...
03:40:10 INFO: P0_csx_n1mup done.
03:40:10 INFO: Compiling P0_dxu_n1mup...
03:42:59 INFO: P0_dxu_n1mup done.
03:42:59 INFO: Compiling P0_dxc_n1mup...
03:46:11 INFO: P0_dxc_n1mup done.
03:46:11 INFO: Compiling P0_sxu_n1mup...
03:49:42 INFO: P0_sxu_n1mup done.
03:49:42 INFO: Compiling P0_sxc_n1mup...
03:52:57 INFO: P0_sxc_n1mup done.
03:52:57 INFO: Compiling P1_dux_n1mum...
03:57:01 INFO: P1_dux_n1mum done.
03:57:01 INFO: Compiling P1_dcx_n1mum...
04:01:17 INFO: P1_dcx_n1mum done.
04:01:17 INFO: Compiling P1_sux_n1mum...
04:05:35 INFO: P1_sux_n1mum done.
04:05:35 INFO: Compiling P1_scx_n1mum...
04:09:08 INFO: P1_scx_n1mum done.
04:09:08 INFO: Compiling P1_uxd_n1mum...
04:12:32 INFO: P1_uxd_n1mum done.
04:12:32 INFO: Compiling P1_uxs_n1mum...
04:15:57 INFO: P1_uxs_n1mum done.
04:15:57 INFO: Compiling P1_cxd_n1mum...
04:18:55 INFO: P1_cxd_n1mum done.
04:18:55 INFO: Compiling P1_cxs_n1mum...
04:21:29 INFO: P1_cxs_n1mum done.
04:21:29 INFO: Checking test output:
04:21:29 INFO: P0_udx_n1mup
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P0_usx_n1mup
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P0_cdx_n1mup
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P0_csx_n1mup
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P0_dxu_n1mup
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P0_dxc_n1mup
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P0_sxu_n1mup
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P0_sxc_n1mup
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P1_dux_n1mum
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P1_dcx_n1mum
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P1_sux_n1mum
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P1_scx_n1mum
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P1_uxd_n1mum
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P1_uxs_n1mum
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P1_cxd_n1mum
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: P1_cxs_n1mum
04:21:29 INFO: Result for test_ME:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for test_MC:
04:21:29 INFO: Passed.
04:21:29 INFO: Result for check_poles:
04:21:29 INFO: Poles successfully cancel for 20 points over 20 (tolerance=1.0e-05)
04:21:29 INFO: Starting run
04:21:29 INFO: Cleaning previous results
04:21:29 INFO: Generating events without running the shower.
04:21:29 INFO: Setting up grids
04:21:37 INFO: Idle: 14, Running: 1, Completed: 1
04:21:45 INFO: Idle: 13, Running: 1, Completed: 2
04:21:54 INFO: Idle: 12, Running: 1, Completed: 3
04:22:02 INFO: Idle: 11, Running: 1, Completed: 4
04:22:15 INFO: Idle: 10, Running: 1, Completed: 5
04:22:22 INFO: Idle: 9, Running: 1, Completed: 6
04:22:33 INFO: Idle: 8, Running: 1, Completed: 7

and it stayed like this for 5 hours!
Then the job failed with this message: [pilot:1150] "Looping job killed by pilot"

best,

Marzieh
