Mapping the Great Lakes with data

Over the last couple of weeks, data crunching has been my primary focus. For my part in Dan Egan's project, I have been generating mapping graphics to show the history of the St. Lawrence Seaway as well as that of invasive species in the Great Lakes-Seaway region. To portray these histories, I have begun using two specific interactive tools.

StoryMap JS is a Knight Lab app from Northwestern University. It allows a combination of media types, such as images, text, dates, and locations, to be mapped as a set of events. Specifically, I am using this tool to map a timeline for the St. Lawrence Seaway.

TileMill is another mapping tool with more features. Built by Mapbox, it allows maps to be styled with CartoCSS, a CSS-like language, and includes other interactive elements as well. I have been working with this application to map the introduction of certain aquatic invasive species in both the Canadian and United States waters of the Great Lakes.
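As a rough illustration of what that styling looks like, here is a minimal CartoCSS sketch for a point layer. The layer name `ports` and the specific values are hypothetical, not taken from my actual project files:

```css
/* Hypothetical CartoCSS for a TileMill point layer named "ports" */
#ports {
  marker-width: 6;              /* size of each port dot */
  marker-fill: #2b83ba;         /* blue fill */
  marker-line-color: #ffffff;   /* white outline */
  marker-allow-overlap: true;   /* draw every point, even where they crowd */
}
```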

Port location data on the Great Lakes using Mapbox's TileMill application. 

Data sorting for both of these projects has proved challenging. To map invasive species onto the Great Lakes, I must first geocode the locations where each species was discovered to use as markers on my map. For example, zebra mussels – which remain one of the most damaging and widespread invasive species to this day – were introduced into western Lake Erie in June of 1988. To map this, I need an exact latitude and longitude point near Lake St. Clair and western Lake Erie that I can pinpoint as the area where they were first discovered.
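Once a sighting has a latitude and longitude, it can be written out as a GeoJSON point of the kind TileMill and Mapbox can ingest. The sketch below is a minimal, self-contained example of that step; the coordinates are approximate, illustrative values near Lake St. Clair, not the exact geocoded point from my data:

```python
import json

def sighting_to_feature(species, lat, lon, first_seen):
    """Build a GeoJSON Feature marking one invasive-species sighting."""
    return {
        "type": "Feature",
        "geometry": {
            "type": "Point",
            # GeoJSON coordinate order is [longitude, latitude]
            "coordinates": [lon, lat],
        },
        "properties": {"species": species, "first_seen": first_seen},
    }

# Approximate, illustrative coordinates near Lake St. Clair
features = [sighting_to_feature("zebra mussel", 42.43, -82.68, "1988-06")]
collection = {"type": "FeatureCollection", "features": features}
print(json.dumps(collection, indent=2))
```

A file of such features can then be loaded into TileMill as its own layer and styled like any other point data.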

Alongside my project with Egan and the Journal Sentinel, I have been looking into data on the economics of the Great Lakes-Seaway shipping industry. Although this is for another class, data on shipping traffic through the adjoining waterways could become another graphic and contribution in the coming months. As I will be continuing this project into the summer months as the JS O'Brien Fellowship Intern, I look forward to continuing this kind of data mining and integrating the information in an inventive way.