August 15, 2018 | Written by: IBM Research Editorial Staff
The workshop Fragile Earth: Theory Guided Data Science to Enhance Scientific Discovery (FEED 2018) will take place August 20 as part of the Knowledge Discovery and Data Mining conference (KDD 2018) in London. The FEED 2018 co-chairs and co-organizers all welcome you to join us! FEED 2018 will bring together the research, industry, and policy communities to tackle the challenging problem of enhancing data analytics and scientific discovery for the earth and environment through the integration of data, theory, and computation. Whether your primary interest lies in food security, water scarcity, energy use, climate models, or the incorporation of theory-guided approaches into data-driven frameworks of scientific discovery, we invite you to attend the workshop and be part of this growing community.
FEED 2018 is the third in a series of workshops we have co-hosted on data science for food, energy, and water and related areas. Starting with the Data Science for Food, Energy and Water (DSFEW) workshop at KDD 2016 and continuing with Data Science for Intelligent Food, Energy, and Water (DSIFEW) in 2017, the theme and audience of these workshops have evolved around the central topic of food, energy, and water. This year, we have added a new focus on integrating domain theory (e.g., physics) with data-driven science, which is a critical technical requirement in modeling and discovery from environmental data.
One of the papers, Learning to Detect and Count Panicles in Sorghum Images (authored by P. Olsen, K. N. Ramamurthy, J. Ribera, Y. Chen, M. Tuinstra, A. Thompson, R. Luss and N. Abe), to be presented in the afternoon session (on “food”), represents the outcome of a joint collaboration between IBM Research and Purdue University within the DOE ARPA-E-funded project “TERRA”. The goal of this project is to develop advanced machine learning methods for automated phenotyping and genotype-phenotype association, with the aim of accelerating breeding for biofuel crops. In the paper, we report on our recent progress on one aspect of automated phenotyping: counting the number of panicles in drone-borne image data from the field. This allows us to accurately estimate one of the key phenotype inputs to a physical crop model, leading to a state-of-the-art physics-guided, data-driven phenotyping model.
Another paper, Teaching Machines to Understand Data Science Code by Semantic Enrichment of Dataflow Graphs (authored by E. Patterson, I. Baldini, S. Mojsilovic and K. R. Varshney), highlights recent progress toward the semantic analysis of code. This team of IBM researchers developed an automated method that enables computers to make sense of the execution traces, semantic representations, statistical models, and machine learning workflows implicit in source code. The technique could help data scientists analyze complex data sets more efficiently by identifying existing code from similar programs. Learn more about this work in a blog post from the research team.
At FEED 2018, we envision researchers and practitioners from both the KDD and domain communities coming together to collaboratively innovate and develop solutions to tackle the many problems of critical importance around sustainability and its various dimensions. Specific topics will include:
- Paradigms for enhancing scientific discovery through theory-guided data science
- Empirical investigations at the intersection of the earth sciences/sustainability and data science
- Data-informed food/energy/water/earth sciences policy discussions
- Frameworks for helping the scientific and KDD communities to work together
Please plan to join us in London! You can register online via the KDD 2018 website.
In addition, we are concurrently launching a special issue of the journal Frontiers in Big Data on the topic of big data for food, energy, and water. Information on the special issue and the call for papers can be found here. Submissions will be accepted until February 2019, so there is plenty of time to prepare your papers!