Comparing Networks and Visualization Tools

A growing number of online research tools are having a significant impact on the field of digital humanities. Easy to use and accessible, these tools are quite powerful in their ability to let researchers sift through large data sets and visualize network relationships. Over the past several assignments I was able to test and review three popular tools: Voyant, Kepler.gl, and Palladio. While each tool was unique in terms of its interface and original purpose, each provided a means to an end. This is an important distinction, especially for anyone starting a research project who needs to get a handle on whatever data is available.

For example, Voyant is a “text mining” application that lets the end user load a large text corpus and visualize it as word clouds, or tag clouds. This is a significant capability when one is not yet clear about what the text describes or includes. Kepler.gl offers a more specific feature set, allowing end users to enter geospatial data and generate a variety of maps. Palladio, on the other hand, is much more robust, providing several important features such as mapping, graphing, customizable lists, and a gallery view for images.
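To make “text mining” a little more concrete: Voyant does this work in the browser, but the core idea behind its word clouds can be sketched in a few lines of Python. This is a minimal illustration, not Voyant’s actual code; the corpus file narratives.txt and the stop-word list are hypothetical placeholders.

```python
from collections import Counter
import re

# Minimal sketch of the word-frequency counting behind a Voyant-style
# word cloud; "narratives.txt" is a hypothetical corpus file.
with open("narratives.txt", encoding="utf-8") as f:
    words = re.findall(r"[a-z']+", f.read().lower())

# Common stop words would normally be filtered out, as Voyant does.
stopwords = {"the", "and", "a", "to", "of", "in", "was", "i", "it", "that"}
counts = Counter(w for w in words if w not in stopwords)

# The most frequent terms become the largest words in the cloud.
for word, n in counts.most_common(15):
    print(f"{word}: {n}")
```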

While each application provides value on its own, the real lesson for any DH researcher is to be prepared to use a variety of tools to visualize and map data. This requires some effort to experiment with and test each application’s capabilities. Voyant offers the end user a rather straightforward approach to discovering word patterns or hidden terms. Kepler provides a relatively easy way to present physical location data over time. Finally, Palladio permits the end user to visualize patterns of relationships, an important factor when trying to define interdependencies in a humanistic study.

As part of my class assignments using the WPA’s Slave Narratives data, integrating all three tools would be beneficial in analyzing the 1930s research. Voyant could be used to identify text patterns in the questions asked and the subjects’ responses. Kepler could be used to demonstrate that there was a relationship between the physical location where the interviews were conducted and the location where the enslaved person was from, a pairing sketched below. Finally, Palladio, with its mapping, graphing, lists, and image gallery, could provide an accessible interface for exploring the final results.
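As a rough sketch of how that interview-versus-origin data might be prepared, the snippet below pairs each interview’s coordinates with the coordinates of the place of enslavement, in the source/target shape that Kepler.gl’s arc layer expects. The column names, file name, and sample rows are assumptions for illustration, not the actual WPA dataset schema.

```python
import pandas as pd

# Hypothetical extract of the Alabama narratives: one row per interview,
# with coordinates for where the interview took place and where the
# subject had been enslaved. The schema here is an assumption.
rows = [
    ("Interview A", 32.3668, -86.3000, 32.4072, -87.0211),  # Montgomery -> Selma
    ("Interview B", 33.5186, -86.8104, 32.4640, -86.4597),  # Birmingham -> Prattville
]
df = pd.DataFrame(rows, columns=[
    "subject",
    "interview_lat", "interview_lng",
    "origin_lat", "origin_lng",
])

# Kepler.gl's arc layer draws a curve from a source point to a target
# point, so a CSV in this shape can be loaded straight into the app.
df.to_csv("slave_narratives_arcs.csv", index=False)
```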

Mapping History

Kepler.gl is a powerful open-source geospatial analysis tool for large-scale data sets. In plain English, it lets you visualize data by mapping multiple location points and using both time and distance to tell a story. The system is designed for both technical and non-technical users. The key is to learn how to use the available filters to surface the insights that you want users to explore.

The Kepler.gl workflow is based on data layers that permit the creator to present a variety of visualizations, including “points”, “lines”, “arcs”, and even a “heatmap”. Kepler provides a variety of map styles, color palettes, and map settings. Like most well-thought-out applications, users only need to spend a little time getting familiar with the interface. It is highly recommended to start with a small project to become better acquainted with Kepler’s unique and very useful features.
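Kepler.gl is primarily used through the browser, but it also ships as a Python package (keplergl) for Jupyter notebooks. Here is a minimal sketch of the add-data-as-a-layer workflow under that assumption; the sample points and layer name are placeholders.

```python
import pandas as pd
from keplergl import KeplerGl  # pip install keplergl

# Hypothetical point data: Kepler auto-detects columns named
# "latitude"/"longitude" and renders them as a point layer.
df = pd.DataFrame({
    "latitude": [32.3668, 33.5186, 30.6954],
    "longitude": [-86.3000, -86.8104, -88.0399],
    "place": ["Montgomery", "Birmingham", "Mobile"],
})

# Create a map, add the data as a named layer, and export it
# so the interactive result can be shared or embedded.
wpa_map = KeplerGl(height=500)
wpa_map.add_data(data=df, name="interview_locations")
wpa_map.save_to_html(file_name="alabama_interviews.html")
```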

For my class assignment we used mapping data drawn from the 1930s Works Progress Administration (WPA) Slave Narrative Collection. From 1936 to 1938, the Federal Writers’ Project undertook a major initiative to compile the histories of formerly enslaved people living in seventeen states. I was able to use data gathered in the state of Alabama. The map displayed below shows the relationship between where the subject was interviewed and where they were originally enslaved.

Recent technologies like Kepler.gl will give researchers a whole new way of using maps to visualize large collections. The fact that I was able to bring to life dormant data that had sat in storage for over 70 years is impressive. But the key to a successful project will be obtaining consistent and accurate mapping data.
