PUBLICATIONS

Towards Effective Visualizations for Social Network Analysis: Empirical Study of Human Sensemaking with Network Visualizations

Jenkins, M.¹, Bisantz, A.², Llinas, J.², and Nagi, R.³

XXXV Sunbelt Conference of the International Network for Social Network Analysis (INSNA), Brighton, United Kingdom (June 2015)

Network visualizations are increasingly used across a range of domains to support visual analysis of complex datasets. However, little empirical research exists on which types of tasks they effectively support or how to design them to do so. Instead, network visualizations are often designed with the goal of maximizing visible information in an aesthetically pleasing form, with the expectation that the information will transfer to viewers in a straightforward manner. This is largely due to a lack of empirical evidence justifying their use and providing concrete guidance to visualization designers. To begin addressing these challenges, we present summarized results of an empirical investigation evaluating how effectively visualizations support five tasks required for identifying patterns and exceptions in represented data, enabling the generation and refinement of hypotheses to discover unknown phenomena (i.e., human sensemaking). A basic visualization design was compared to a tabular display and to two visualizations that used additional graphical variables to encode data attributes in the primary visualization view. To summarize performance benefits across more than 100 dependent task performance measures, mean performance rankings were computed across displays. Results showed that the basic visualization often failed to offer performance benefits over the non-visualization display for both foraging and knowledge-based sensemaking tasks, frequently resulting in decreased performance, whereas the graphically enhanced network visualizations provided significant performance benefits. This finding provides empirical evidence that network visualizations can actually hinder performance when they are not designed to meet data, task, and viewer requirements, but can significantly improve sensemaking task performance when appropriately designed.
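For readers unfamiliar with this kind of aggregation, the short Python sketch below illustrates one way mean performance rankings could be computed across display conditions. It is not the study's analysis code; the data, display labels, number of measures, and the assumption that higher scores indicate better performance are all hypothetical.

    import numpy as np

    # Hypothetical data: rows are dependent task performance measures,
    # columns are four display conditions. Higher score = better performance.
    rng = np.random.default_rng(0)
    scores = rng.random((120, 4))  # 120 measures x 4 displays (illustrative only)

    # Rank displays within each measure (rank 1 = best performance for that
    # measure; ties are broken arbitrarily here), then average the ranks
    # for each display across all measures.
    rank_positions = scores.argsort(axis=1).argsort(axis=1)  # 0 = lowest score
    ranks = scores.shape[1] - rank_positions                  # 1 = highest score
    mean_ranks = ranks.mean(axis=0)

    displays = ["table", "basic network", "enhanced network 1", "enhanced network 2"]
    for name, r in zip(displays, mean_ranks):
        print(f"{name}: mean rank {r:.2f}")

A lower mean rank under this convention indicates a display that more consistently outperformed the alternatives across the set of measures.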


¹ Charles River Analytics
² University at Buffalo, The State University of New York
³ University of Illinois at Urbana-Champaign

For More Information

To learn more or request a copy of a paper (if available), contact Michael Jenkins.

(Please include your name, address, organization, and the paper reference. Requests without this information will not be honored.)