LAVA Receives NSF Grant to Translate Human Speech into Data Visualizations
Oct 29, 2020
In 2008, Jason Leigh, ICS Professor and director of LAVA, began groundbreaking research to develop Articulate, a system to teach a computer how to translate human speech into visualizations of data.
Recently, Leigh and his colleagues at the University of Illinois at Chicago received a $498,999 National Science Foundation award (2008986 & 2007257) to develop Articulate+.
Articulate+ moves the needle on the grand challenge of translating natural language queries into effective data visualizations, a problem whose solution has the potential to democratize and broaden the use of emerging data visualization and analytics approaches.

The need to learn a new user interface can be a considerable barrier to adopting a new analytics tool. Even scientists who routinely use data visualization and analytics tools in their work often stick with familiar tools, even when superior alternatives exist, because they are uncertain about investing the time to learn something new. Imagine if that burden could be lifted by a more naturalistic interface, combining natural language and gesture, that lets users interact as if they were speaking with a competent lab assistant who is savvy in the latest data visualization techniques.

This is the focus of Articulate+, which will attempt to create such an assistant. The approach is to create an interaction paradigm in which the computer behaves more like a collaborator with the human than just a tool. Articulate+ treats the imprecise, vague nature of natural language queries and gestures not as a problem that must be overcome, but as an opportunity to be exploited, allowing the researchers, and ultimately an intelligent software system, to learn more about users' underlying intent and to provide better sets of visualizations that help users generate insights.
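To make the core idea concrete, here is a minimal, rule-based sketch of translating a natural-language query into a declarative chart specification. This is purely illustrative and not the Articulate+ system itself; the keyword table and spec fields are assumptions chosen for the example.

```python
# Toy sketch: map a natural-language query to a minimal chart-spec dict.
# All keywords and spec fields are illustrative assumptions, not Articulate+.

AGGREGATES = {"average": "mean", "mean": "mean", "total": "sum", "sum": "sum"}

def query_to_spec(query: str) -> dict:
    """Translate a query like 'show average sales by region'
    into a minimal chart specification."""
    tokens = query.lower().split()
    spec = {"mark": "bar", "aggregate": None, "y": None, "x": None}
    for i, tok in enumerate(tokens):
        if tok in AGGREGATES and i + 1 < len(tokens):
            spec["aggregate"] = AGGREGATES[tok]
            spec["y"] = tokens[i + 1]      # the field being aggregated
        if tok == "by" and i + 1 < len(tokens):
            spec["x"] = tokens[i + 1]      # the grouping field
    # Temporal groupings are conventionally shown as line charts.
    if spec["x"] in ("year", "month", "day"):
        spec["mark"] = "line"
    return spec

print(query_to_spec("show average sales by region"))
```

A real system would replace these hand-written rules with natural-language understanding and would also exploit ambiguity, offering several candidate visualizations rather than committing to one, which is part of what distinguishes the Articulate+ approach.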