...

Science and research code is currently a hot topic. We see ongoing discussions around citations and credit for writing science software, sustainability and transparency of such software, and reproducibility of scientific results, which generally requires runnable code.

One aspect of those discussions involves evaluating science code against recommended practices and understanding project maturity as projects progress from prototypes to operational systems. The ESIP Products & Services committee <http://wiki.esipfed.org/index.php/Products_and_Services> has begun this discussion through the recently initiated, NASA-supported AIST TRL Evaluation project, which aims to determine independent criteria for evaluating the technical readiness of a project, including its software.

With that as a starting point, and leveraging the work of the Software Sustainability Institute <http://www.software.ac.uk/>, this meeting will be not just a discussion but a brainstorming session on science software evaluation in the BESSIG community. The session will be led by Soren Scott. We especially want active science/research software developers to participate. If you develop scientific software, please come. Possible topics include:

...