Tools:
- Jupyter notebooks: we will use an online environment set up within the Humanities Cluster network.
- UNIX command-line tools
- OpenRefine
- for data normalization and exploration
- also look at customized distributions and extensions, e.g. for LOD, RDF, NER, …
- Text editor
- Excel, LibreOffice Calc, Google Sheets or some other spreadsheet software
- for spreadsheet analysis
- pivot tables for linking and classifying data through grouping and co-occurrence (see the pandas sketch after this list)
- Voyant Tools (no installation needed)
- online text analysis tool; note that this requires uploading your data to the web
- for user-friendly exploration of texts
- Palladio (no installation needed)
- online visualisation tool for (historical) data with CSV/JSON input
- for maps (you need to provide coordinates in the CSV; see the data-prep sketch after this list)
- for networks and some other visualisations
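
The spreadsheet pivot-table step above can also be reproduced in Python. A minimal sketch, assuming a hypothetical `records.csv` with `place`, `year` and `occupation` columns:

```python
import pandas as pd

# Hypothetical input: one row per record, with 'place', 'year' and 'occupation' columns.
df = pd.read_csv("records.csv")

# Count co-occurrences of place and occupation, like a spreadsheet pivot table.
pivot = pd.pivot_table(
    df,
    index="place",         # rows
    columns="occupation",  # columns
    values="year",         # any column will do when we only count
    aggfunc="count",
    fill_value=0,
)
print(pivot)
```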
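
For Palladio's map view you need coordinates in the CSV you upload. A minimal data-preparation sketch in Python, assuming a hypothetical `gazetteer.csv` lookup table with `place`, `lat` and `lon` columns:

```python
import pandas as pd

# Hypothetical inputs: records.csv with a 'place' column,
# gazetteer.csv with 'place', 'lat' and 'lon' columns.
records = pd.read_csv("records.csv")
gazetteer = pd.read_csv("gazetteer.csv")

# Join the coordinates onto the records.
merged = records.merge(gazetteer, on="place", how="left")

# Palladio's map view reads coordinates as a single "latitude,longitude" column.
merged["coordinates"] = merged["lat"].astype(str) + "," + merged["lon"].astype(str)
merged.to_csv("records_with_coordinates.csv", index=False)
```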
Optional tools:
For those who want to work with more advanced tools:
- Anaconda (Python distribution with many packages and easy installer)
- for more complex filtering, linking and classifying (see the pandas sketch after this list)
- Jupyter notebook (interactive Python environment in the browser)
- Topic modelling:
- Mallet (command line) or Topic-Modelling-Tool (GUI); see the topic-modelling sketch after this list
- Natural Language Processing (see the NER sketch after this list)
- Structured data handling in Python (e.g. with pandas, as in the filtering and linking sketch after this list)
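
As an illustration of the filtering, linking and classifying (and structured data handling) that pandas makes easier, a minimal sketch with hypothetical file and column names:

```python
import pandas as pd

# Hypothetical inputs: two tables that share a 'person_id' column.
persons = pd.read_csv("persons.csv")  # person_id, name, birth_year
events = pd.read_csv("events.csv")    # person_id, event_type, year

# Filtering: keep only nineteenth-century events.
events_19c = events[(events["year"] >= 1800) & (events["year"] < 1900)]

# Linking: join the two tables on the shared identifier.
linked = events_19c.merge(persons, on="person_id", how="left")

# Classifying: count event types per person by grouping.
summary = linked.groupby(["name", "event_type"]).size().unstack(fill_value=0)
print(summary.head())
```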
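
The course tools for topic modelling are Mallet and the Topic-Modelling-Tool; as an illustration only, here is a minimal sketch using gensim (a different library) on a toy corpus:

```python
from gensim import corpora, models

# Toy corpus: each document is already tokenised and lower-cased.
texts = [
    ["ship", "harbour", "cargo", "trade"],
    ["harbour", "trade", "merchant", "spice"],
    ["letter", "archive", "scribe", "ink"],
    ["archive", "letter", "seal", "ink"],
]

dictionary = corpora.Dictionary(texts)
corpus = [dictionary.doc2bow(text) for text in texts]

# Train a small LDA model; the number of topics is a choice you make yourself.
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10, random_state=1)
for topic_id, words in lda.print_topics(num_words=4):
    print(topic_id, words)
```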
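
For Natural Language Processing, a minimal named-entity recognition sketch with spaCy (one option among several); it assumes the small English model has been installed with `python -m spacy download en_core_web_sm`:

```python
import spacy

# Load the small English pipeline (install it first with:
#   python -m spacy download en_core_web_sm).
nlp = spacy.load("en_core_web_sm")

text = "In 1602 the Dutch East India Company was founded in Amsterdam."
doc = nlp(text)

# Print every recognised entity with its label (PERSON, ORG, GPE, DATE, ...).
for ent in doc.ents:
    print(ent.text, ent.label_)
```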