NlViS is an EPSRC-funded research project that aims to push the frontiers of exploratory data analysis through novel combinations of interactive visualisation and natural language understanding. The project aims to develop a fundamental understanding of how analysts can use natural language elements to perform visualisation-empowered data analysis, and to use that understanding to build a framework in which natural-language and visualisation-based interactions operate in harmony. The project then aims to demonstrate how such a multi-modal interaction scheme can radically transform analysts' experience, with the goal of achieving significant improvements in both the value and the volume of the actionable observations generated.
The project runs from September 2017 to February 2019 and is supported by an EPSRC First Grant awarded to Dr Cagatay Turkay of the giCentre at City, University of London. Redsift will be a collaboration partner in the project, sharing their technological know-how in conversational modelling. Further details about the grant are available on the EPSRC grant portfolio pages.
Image credits: Mock-up by Burcu Turkay, artwork by Jemis Mali & Maria Kislitsina from the Noun Project