**ML: Generative model projects**

Discussed the topics we would like to study:

  * Interpolation and extrapolation properties of generative models, and transfer learning

Aims: Define toy cases where the density d(x) is known and sample training data from it. Train a generative model on the training data and use it to forecast d(x). Derive the pull distribution for the integral of the density in selected phase-space regions, to check whether, and by how much, this scales better than sqrt(N), where N is the number of training data points in that phase-space region. Also derive uncertainties via an envelope of networks (e.g. via MC dropout, https://arxiv.org/abs/1506.02142).
Find the relevant literature.
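
A minimal sketch of this toy study is below; all concrete choices in it are illustrative assumptions rather than decisions from the discussion: d(x) is a 1D standard normal, the phase-space region is [1, 2], and a Gaussian kernel density estimate stands in for the trained generative model. The pull is taken with respect to the sqrt(N) counting error, so a pull width below one would indicate scaling better than sqrt(N).

<code python>
# Toy pull study: known density d(x), surrogate generative model, integral over a region.
# Illustrative assumptions: d(x) is a 1D standard normal; a Gaussian KDE stands in
# for the generative model (swap in a GAN / VAE / normalising flow later).
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(42)
a, b = 1.0, 2.0                            # phase-space region A = [a, b]
true_integral = norm.cdf(b) - norm.cdf(a)  # exact integral of d(x) over A

def model_integral(n_train):
    """Train the surrogate on n_train points and integrate its density over A."""
    train = rng.standard_normal(n_train)
    return gaussian_kde(train).integrate_box_1d(a, b)

n_replicas = 300
for n_train in (200, 2000, 20000):
    estimates = np.array([model_integral(n_train) for _ in range(n_replicas)])
    # sqrt(N)-type reference: binomial error of simply counting training points in A
    sigma_count = np.sqrt(true_integral * (1.0 - true_integral) / n_train)
    pull = (estimates - true_integral) / sigma_count
    print(f"N={n_train:6d}  pull mean = {pull.mean():+.2f}  "
          f"pull width = {pull.std():.2f}  (width < 1: better than sqrt(N) counting)")
</code>

The same closure test can be reused once the KDE in model_integral is replaced by the density (or a histogram of samples) from an actually trained GAN, VAE or flow.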

- Datasets: toy datasets built on Gaussians, etc., ... MadGraph (matrix-element^2, a simple process with MadGraph)
- Generative models (GANs, VAEs, ...)
- Try transfer learning (Z→ee and Z→μμ), etc.; a fine-tuning sketch follows below.
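
A possible shape for the transfer-learning item above, sketched under toy assumptions (samples, model size and freezing strategy are all placeholders, not agreed choices): the "Zee" and "Zmm" samples are 2D Gaussians with slightly shifted means, the generative model is a minimal VAE, and the transfer step freezes the pretrained encoder and fine-tunes the rest on the small target sample with a reduced learning rate.

<code python>
# Transfer-learning sketch (illustrative, PyTorch): pretrain a small VAE on a large
# "Zee-like" toy sample, then fine-tune part of it on a small "Zmm-like" sample.
# The toy samples are 2D Gaussians with different means -- placeholders for real
# Z->ee / Z->mumu event features.
import torch
from torch import nn

torch.manual_seed(0)

def toy_sample(n, shift):
    return torch.randn(n, 2) + torch.tensor([shift, 0.0])

class VAE(nn.Module):
    def __init__(self, d=2, h=32, z=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(d, h), nn.ReLU())
        self.mu, self.logvar = nn.Linear(h, z), nn.Linear(h, z)
        self.dec = nn.Sequential(nn.Linear(z, h), nn.ReLU(), nn.Linear(h, d))

    def forward(self, x):
        hid = self.enc(x)
        mu, logvar = self.mu(hid), self.logvar(hid)
        latent = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(latent), mu, logvar

def loss_fn(x, recon, mu, logvar):
    rec = ((x - recon) ** 2).sum(dim=1).mean()
    kld = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1).mean()
    return rec + kld

def train(model, data, epochs, lr, params=None):
    opt = torch.optim.Adam(params if params is not None else model.parameters(), lr=lr)
    for _ in range(epochs):
        recon, mu, logvar = model(data)
        loss = loss_fn(data, recon, mu, logvar)
        opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

vae = VAE()
# 1) Pretrain on a large "Zee" toy sample.
print("pretrain loss:", train(vae, toy_sample(20000, 0.0), epochs=300, lr=1e-3))
# 2) Fine-tune on a small "Zmm" toy sample: freeze the encoder, retrain the rest
#    with a smaller learning rate.
for p in vae.enc.parameters():
    p.requires_grad_(False)
trainable = [p for p in vae.parameters() if p.requires_grad]
print("fine-tune loss:", train(vae, toy_sample(500, 0.5), epochs=200, lr=3e-4,
                               params=trainable))
</code>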

  * Simulator and showers

Step 1:
Event-wise: we have data samples for various processes (ttbar, etc.); learn the detector simulation (a sketch follows below the two steps).

Step 2:
Object-wise: shower development etc. (before and after parton showering).
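
For step 1, a hedged sketch of what "learning the detector simulation" event-wise could mean in practice; everything concrete here is an assumption for illustration. The full simulation is replaced by a toy pT-dependent smearing, and the surrogate is a conditional Gaussian whose mean and width are predicted by a small network, so sampling from it replaces the detector simulation; a conditional GAN, VAE or flow trained on the actual ttbar samples would play the same role.

<code python>
# Detector-simulation surrogate (toy): learn p(reco - truth | truth) as a Gaussian
# whose mean and log-width come from a small network, then sample from it.
import torch
from torch import nn

torch.manual_seed(0)

def toy_detector(truth_pt):
    """Toy 'full simulation': Gaussian smearing whose resolution grows with pT."""
    sigma = 0.05 * truth_pt + 2.0
    return truth_pt + sigma * torch.randn_like(truth_pt)

truth = 20.0 + 180.0 * torch.rand(50000, 1)   # truth-level pT in [20, 200] GeV
reco = toy_detector(truth)
residual = reco - truth                       # what the surrogate has to model

net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 2))  # -> (mean, log sigma)
opt = torch.optim.Adam(net.parameters(), lr=3e-3)

for step in range(4000):
    idx = torch.randint(0, truth.shape[0], (512,))
    mean, log_sigma = net(truth[idx] / 100.0).split(1, dim=1)
    # Gaussian negative log-likelihood of the reco-truth residual given truth
    nll = (log_sigma + 0.5 * ((residual[idx] - mean) / log_sigma.exp()) ** 2).mean()
    opt.zero_grad(); nll.backward(); opt.step()

# Use the trained surrogate as a generator: sample reco-level pT for new truth values.
with torch.no_grad():
    new_truth = torch.tensor([[50.0], [150.0]])
    mean, log_sigma = net(new_truth / 100.0).split(1, dim=1)
    reco_sample = new_truth + mean + log_sigma.exp() * torch.randn_like(mean)
    print("truth pT:", new_truth.squeeze().tolist(),
          "-> sampled reco pT:", reco_sample.squeeze().tolist())
</code>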

Information about the projects:

Mailing list: