I have found these notebooks very useful in two ways besides presentations: as a final exploratory-data-analysis front end that loads data from a larger modeling and data-reduction system, and as a playground for maturing workflows into utilities or modules that are later integrated into a back-end reduction or analysis system.
The models run on a small cluster and/or a supercomputer, and the data reductions of those model runs are done in Python code that dumps files of metrics (roughly a GBs-to-MBs reduction). The notebook sits at the very tail end of the pipeline, letting me make ad hoc graphics to interpret the results.
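The tail-end notebook step can be sketched roughly like this. Everything here is a hypothetical illustration, not my actual code: the CSV layout, the file-name pattern, and the column names (`step`, `rmse`) are all assumptions standing in for whatever the reduction stage dumps.

```python
# Sketch of the ad hoc plotting step at the tail of the pipeline:
# load per-run metric files (hypothetical CSV layout) and compare
# a chosen metric across model runs.
import glob

import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend so this also runs outside a notebook
import matplotlib.pyplot as plt


def load_metrics(pattern="metrics_run*.csv"):
    """Concatenate the reduced per-run metric files into one DataFrame.

    The pattern and file format are illustrative assumptions.
    """
    frames = [pd.read_csv(path).assign(run=path)
              for path in sorted(glob.glob(pattern))]
    return pd.concat(frames, ignore_index=True)


def plot_metric(df, metric="rmse"):
    """Ad hoc graphic: one line per model run for a chosen metric column."""
    fig, ax = plt.subplots()
    for run, grp in df.groupby("run"):
        ax.plot(grp["step"], grp[metric], label=str(run))
    ax.set_xlabel("step")
    ax.set_ylabel(metric)
    ax.legend()
    return fig
```

In a notebook cell this would just be `plot_metric(load_metrics())`, tweaked and re-run interactively until the graphic answers the question at hand.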