      - Recast the analysis for another new physics model and compare the results.
      - Go to point one and choose a more complicated analysis...

AB: would be interesting to see how Delphes performance looks without analysis-specific cards, since a lot of people (outside the "big" recasting groups) are using it that way.
  
  * How to validate the analyses (see the toy cutflow sketch after the list below).
        * ?K HepMC events with MG5_aMC LO, masses: gluino 1100, N1 700 --> Olivier+Nishita
            * **Results:** [[2017:groups:tools:recasting:results_gluino_1100_N1_700|here]]
        * LHADA implementation: [[https://github.com/lhada-hep/lhada/tree/master/analyses/ATLASSUSY1605.03814]]
    - **arxiv:1704.03848** - Monophoton - ATLAS - 13 TeV. Cutflow: https://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/EXOT-2016-32/. hepmc files: /eos/user/p/pgras/Houches2017Recast/DM_monophoton/hepmc.1/, accessible from [[https://cernbox.cern.ch/index.php/s/8EZVNwJbSlovEBF]]. Asked Philippe Gras for direct access permissions to the eos directory.
        * LHADA implementation: [[https://github.com/lhada-hep/lhada/tree/master/analyses/ATLASEXOT1704.03848]]
    - **CMS-SUS-16-039** (now superseded by the paper: http://cms-results.web.cern.ch/cms-results/public-results/publications/SUS-16-039/index.html) - 3 leptons + MET - CMS - 13 TeV (BDT with ~15 inputs; eff. 20-90%). Cutflows: http://cms-results.web.cern.ch/cms-results/public-results/preliminary-results/SUS-16-039/index.html Efficiencies: https://twiki.cern.ch/twiki/bin/view/CMSPublic/SUSMoriond2017ObjectsEfficiency
    - **arxiv:1706.04402** - 1 lepton + MET + Jets (>=1b) - CMS - 13 TeV (topness variable?)
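
On the validation point above, a toy sketch of one simple starting point (all numbers and cut names are invented, not tied to any of the analyses listed): compare the recast cutflow to the published one, cut by cut, and flag large relative deviations.

<code python>
# Toy cutflow comparison; all numbers and cut names are invented.
published = {"trigger": 10000, "MET > 200": 4200, ">= 2 jets": 3100}
recast    = {"trigger": 10000, "MET > 200": 4350, ">= 2 jets": 2700}

for cut, n_pub in published.items():
    n_rec = recast[cut]
    dev = (n_rec - n_pub) / n_pub
    flag = "  <-- check" if abs(dev) > 0.10 else ""
    print(f"{cut:12s} published={n_pub:6d} recast={n_rec:6d} dev={dev:+.1%}{flag}")
</code>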
  
== References ==
  * [[2017:groups:tools:Contur|Contur]]
    
== Simplified likelihood framework ==
--> Andy, Sylvain
  
CMS formalism:
https://cds.cern.ch/record/2242860/files/NOTE2017_001.pdf
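
Schematically (our notation, not copied verbatim from the note), the simplified likelihood multiplies one Poisson term per signal region by a multivariate Gaussian constraint on the correlated background nuisances:

<code latex>
L(\mu,\vec{\theta}) = \prod_{i=1}^{N_{\rm SR}} \mathrm{Pois}\!\left(n_i \mid \mu s_i + b_i + \theta_i\right) \times \mathcal{N}\!\left(\vec{\theta} \mid 0, \Sigma\right)
</code>

with n_i the observed count, s_i and b_i the expected signal and background in signal region i, and Sigma the background covariance matrix.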
  
AB implementations in GAMBIT and SciPy, marginalising over correlated background uncertainties (by unitary transformation + integral, and by MC sampling respectively). MadAnalysis: (Benj: I would like to do it, but time is my main problem. Anyone to help here? AB: Maybe my Python code, when finished?)
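
As an illustration of the MC-sampling route, a minimal Python sketch with toy numbers (this is not the GAMBIT or SciPy implementation referred to above): draw the correlated background shifts from a multivariate Gaussian and average the Poisson product over the samples.

<code python>
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(42)

def marginalised_likelihood(mu, n_obs, s, b, cov, n_samples=100000):
    """L(mu) = < prod_i Pois(n_i | mu*s_i + b_i + theta_i) >_theta,
    theta ~ N(0, cov); rates are clipped to stay positive."""
    theta = rng.multivariate_normal(np.zeros(len(b)), cov, size=n_samples)
    rates = np.clip(mu * s + b + theta, 1e-9, None)  # (n_samples, n_SR)
    return poisson.pmf(n_obs, rates).prod(axis=1).mean()

# Toy inputs for two correlated signal regions (illustrative only)
n_obs = np.array([12, 5])
s     = np.array([3.0, 1.5])
b     = np.array([9.0, 4.0])
cov   = np.array([[4.0, 1.2],
                  [1.2, 1.0]])
print(marginalised_likelihood(1.0, n_obs, s, b, cov))
</code>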

AB: reporting of the SR n & b arrays and the covariance matrix (matrices?) is currently ad hoc / non-standardised. It would be //really// good to establish a standard, ideally in HepData.
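
For concreteness, a purely hypothetical sketch of what such a standardised record could contain (all field names invented; this is not an existing HepData format):

<code python>
# Hypothetical, invented field names -- not an existing HepData schema.
simplified_likelihood_record = {
    "signal_regions": ["SR1", "SR2"],
    "n_observed":     [12, 5],        # observed counts per SR
    "b_expected":     [9.0, 4.0],     # expected background per SR
    "covariance":     [[4.0, 1.2],    # background covariance matrix
                       [1.2, 1.0]],
}
</code>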

Canonical example: the CMS 0-lepton search with 174 SRs and a covariance matrix:
http://cms-results.web.cern.ch/cms-results/public-results/publications/SUS-16-033/index.html

Improvements to the basic CMS proposal:
https://arxiv.org/abs/1603.03061
  
  * Use of exponential nuisance parameters to avoid negative rates (see the sketch after this list).

  * Implement a covariance matrix that depends on the parameters of interest. This happens for example if there are uncertainties on both signal and background. It depends on the availability of the elementary sources of uncertainty; if these are released as event weights, it will open up possibilities.
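
On the first bullet, a minimal sketch of one possible exponential (log-normal-like) parametrisation; this is an assumption for illustration, not necessarily the choice made in the reference above:

<code python>
import numpy as np

s, b  = np.array([3.0, 1.5]), np.array([9.0, 4.0])
theta = np.array([-15.0, 0.5])   # a large downward fluctuation in SR1

additive    = 1.0 * s + b + theta                # can go negative
exponential = (1.0 * s + b) * np.exp(theta / b)  # strictly positive

print(additive)     # first entry negative -> ill-defined Poisson rate
print(exponential)  # always > 0
</code>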

SF: Simplified likelihoods as an alternative to unfolding; a comparison between the two methods could be done in a specific example.

== LHADA ==
  
Examples of analysis descriptions in LHADA format:
  
  * https://github.com/lhada-hep/lhada/blob/master/lhada2rivet.d/CMS-PAS-SUS-16-015.lhada
  
A first version of arxiv:1605.03814 is written. It will be added/linked here after some cleanup.
  
  