General – THATCamp Modern Language Association Boston 2013
http://mla2013.thatcamp.org
The Humanities and Technology Camp at the Modern Language Association Convention in Boston, January 2013

Notes from Teaching Digital Archives session
http://mla2013.thatcamp.org/01/02/notes-from-teaching-digital-archives-session/
Wed, 02 Jan 2013 20:19:27 +0000

Here are the notes from the “Teaching Digital Archives” session proposed by Paul Jaussen: docs.google.com/document/d/1OvSbVBxXNqSGFiOfebiUNqTRppUZpY05bJ1Nz46YXzg/edit

Session notes

How do we teach archives? How do we add historical context to 19th-century poems while also doing close reading of them?

Emphasis point: work with librarians and archivists to develop the course and support the technology. Take students to an actual archive (if possible) and talk to actual archivists, especially about the process of digitizing.

Examples of assignments and tools:


Notes from network analysis
http://mla2013.thatcamp.org/01/02/notes-from-network-analysis/
Wed, 02 Jan 2013 19:07:28 +0000

NETWORK ANALYSIS 1-2:15 Shawn

Basic network analysis: nodes (characters, places, words, books) and edges (connections between those nodes: ideas that come up, books mentioned, etc.). Should edges also carry weight?

Software and resources that can be used to do this:
EEBO (Early English Books Online)
Gephi
A plugin for Gephi, Export to Earth
Republic of Letters
Google Refine (allows you to clean up big data sets)

Interested in seeing other people’s experiences in starting a network analysis project. One participant’s project is a text-based digital component added on to a dissertation, evaluated in the computer science department rather than the English department.

The challenge:
Very few texts of women’s writing are online. Some are available through EEBO, but the quality is poor. When you don’t have a digitized corpus online, you need to do the work yourself: the respondent took high-resolution photos of documents and encoded them in TEI, but it took six months to do one play out of twelve texts.

Question about social network analysis in times before there were social networks. The discussant is interested in intellectual network analysis: particular theories, letters, areas of travel, etc. The discussant works with Margaret Cavendish.

Issues with current tech:
A well-known tool is Republic of Letters, which can help map letters and show basic connections.

Cordell: think of each letter as a connection; the more letters, the stronger the connection. Who is central to the network, and who is on the periphery?
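The centrality question above can be made concrete with a small sketch in plain Python; the correspondents and letter counts below are made up purely for illustration, not drawn from any project discussed in the session:

```python
from collections import defaultdict

# Hypothetical letters: (sender, recipient) pairs; each letter adds weight.
letters = [
    ("Cavendish", "Hobbes"), ("Cavendish", "Hobbes"),
    ("Cavendish", "Descartes"), ("Hobbes", "Descartes"),
    ("Evelyn", "Cavendish"),
]

# Weighted degree: total letters sent or received per person.
degree = defaultdict(int)
for sender, recipient in letters:
    degree[sender] += 1
    degree[recipient] += 1

# Rank correspondents: the highest degree is the most central node;
# the lowest sits on the periphery of the network.
ranking = sorted(degree.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)
# [('Cavendish', 4), ('Hobbes', 3), ('Descartes', 2), ('Evelyn', 1)]
```

Degree is only the simplest centrality measure; tools like Gephi also offer betweenness and closeness, which answer the "who is on the periphery" question differently.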

What were the challenges with Gephi?
Response: it visualized people pretty well, so you could see names and locations, but adding in other data was harder and didn’t work well. The respondent then used Neatline to map locations and texts, and that didn’t necessarily line up with Gephi.

Can you get metadata from EEBO? Negotiate with ProQuest…

A way to use Excel: NodeXL allows you to create a network graph in the same way you create a pie chart in Excel, and you can tweak the various visualization options.

Plugins for Gephi: one, written by Dave Shepard at UCLA, takes a network graph and, if you have a field with XY coordinates, lays out the data so you can export it into Google Earth. It can illustrate why network graphs are different from maps: with regard to geography, a network graph obscures more than it illuminates. The plugin is called Export to Earth.
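The core idea of that export, turning XY (longitude/latitude) node fields into something Google Earth can open, can be sketched without the plugin using only Python's standard library. The place names and coordinates below are made up for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical nodes with longitude/latitude fields, as in a Gephi data table.
nodes = [
    {"label": "Boston", "lon": -71.06, "lat": 42.36},
    {"label": "London", "lon": -0.13, "lat": 51.51},
]

# Build a minimal KML document: one Placemark with a Point per node.
kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
doc = ET.SubElement(kml, "Document")
for n in nodes:
    placemark = ET.SubElement(doc, "Placemark")
    ET.SubElement(placemark, "name").text = n["label"]
    point = ET.SubElement(placemark, "Point")
    # KML coordinates are longitude,latitude,altitude.
    ET.SubElement(point, "coordinates").text = f'{n["lon"]},{n["lat"]},0'

kml_text = ET.tostring(kml, encoding="unicode")
```

Saving `kml_text` to a `.kml` file and opening it in Google Earth places each node at its coordinates; edges would need KML `LineString` elements, which is where a dedicated plugin earns its keep.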

What is the design method?

Intertextual and paratextual connections between an author and the works being published during the period.

What is that intellectual network?

How do we determine edges once the nodes are no longer made up?
Perhaps use literary theory to determine what is a node and what are edges.
The respondent notes that he used affect theory, but it got more and more complicated, and the edges turned out to be affects.
It turns out the edges are more interesting than the nodes, yet the nodes are totally privileged. What about developing edge-based mapping?

The ontology of network analysis: what counts as nodes and what as edges?
Human-centered. Animal-centered. Object-centered. Etc.

What about novels or authors being edges, then connected based on who is talking about those things?

Trying to get this to a point to open it up as a project that people can critique.

Other theories that might include network analysis:
Actor-network theory: distributed agency models among or across people and commodities. Mapping networks of commodity circulation.
Anderson’s Imagined Communities and Warner’s counterpublics vs. publics (communities outside of direct social interaction that are nonetheless conceived of…)
The conceptual community rather than the actual community…

Tutorial
All data is about reprinted texts from newspaper archives.
Nodes are individual publications.
Connections between them are shared texts, and the more texts two publications share, the heavier the edge between them. If two publications reprinted fifty of the same texts in a decade, then that is perhaps a strong connection. This was an age before copyright law, so we can speculate about who was reading whom and which publications were in conversation. HOWEVER, the whole graph can be inverted, with texts as the nodes and publication reprints as the connections between them.
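The weighting scheme described in the tutorial is a bipartite projection: reprint records link publications to texts, and projecting onto publications yields edges weighted by shared texts. A minimal sketch in plain Python, with invented publication and text names (not data from the actual project):

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical reprint records: (publication, text) pairs.
reprints = [
    ("Gazette", "Poem A"), ("Gazette", "Story B"), ("Gazette", "Essay C"),
    ("Courier", "Poem A"), ("Courier", "Story B"),
    ("Herald",  "Essay C"), ("Herald",  "Poem A"),
]

# Index: which publications carried each text.
carriers = defaultdict(set)
for publication, text in reprints:
    carriers[text].add(publication)

# Project onto publications: each shared text adds 1 to the edge weight
# between every pair of publications that reprinted it.
edge_weight = defaultdict(int)
for text, publications in carriers.items():
    for a, b in combinations(sorted(publications), 2):
        edge_weight[(a, b)] += 1

print(dict(edge_weight))
# {('Courier', 'Gazette'): 2, ('Courier', 'Herald'): 1, ('Gazette', 'Herald'): 2}
```

Inverting the graph, texts as nodes connected when the same publication reprinted them both, is the same projection run over a `carriers`-style index keyed by publication instead of by text.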

Some questions:
What are the best sources for encoding data for network analysis?
How do we determine nodes and edges?
How do we conduct rigorous selection of nodes and edges in network analysis?
How can we collaborate on encoding non-digitized materials, and create collaboratives to save time?
Is it ethical to put “ghost entries” (letters that have been burned or lost) into data analysis?

Notes from Tools for Literary Text Analysis
http://mla2013.thatcamp.org/01/02/notes-from-tools-for-literary-text-analysis/
Wed, 02 Jan 2013 16:57:35 +0000

Here is a link to the notes for the session that largely focused on Voyant and similar tools: Session Proceedings.


Digitized Scholarly Editions
http://mla2013.thatcamp.org/01/02/digitized-scholarly-editions/
Wed, 02 Jan 2013 09:49:46 +0000

How might digitization enhance scholarly (or critical) editions of literary and historical texts? I’m thinking in terms of the incorporation of multimedia, the accessibility of bibliographic resources, and the integration of reference and/or pedagogical material (such as glossaries for foreign-language texts). What would the “dream” scholarly edition look like (from the vantage points of students as well as more advanced scholars), and what obstacles would need to be overcome to implement it?

Student Storage & Processes of Multimodal Compositions
http://mla2013.thatcamp.org/01/02/student-storage-processes-of-multimodal-compositions/
Wed, 02 Jan 2013 01:12:04 +0000

I would like to address both the technology (hardware) involved with storing student works and their processes of multimodal composing.

I’d be interested in discussing any platforms or systems you use to 1) protect the digital works of our students; 2) protect them in terms of copyrighted materials; 3) use their works as resources/shareable content. At my own university and others I’ve been at, we’ve experimented with Dropbox, Chalk & Wire, Google Docs, and Blackboard (the university’s system), but I’d love to hear if there are any other programs out there, the amount of data that can be stored, and how user-friendly they are. I’d also like to discuss ethical issues and/or fair use practices if we have time.

I’m also very interested in how students create multimodal compositions. I’d be happy to share findings from a case study I conducted in fall 2012, particularly on students’ processes of juxtaposing images and texts, remixing videos and sound, and creating their own works with a variety of programs. And I’d love to hear your approaches to teaching these works, or the theoretical works you use to integrate them into the classroom.

Ride Available?
http://mla2013.thatcamp.org/01/01/ride-available/
Tue, 01 Jan 2013 15:31:43 +0000

Just on a whim: Is anyone coming through (or departing) the New Haven area in the wee hours of Wednesday morning who could give me a ride? I’m trying to make my transit more sensible, but have to work it out so I’m up to Boston and back on Wednesday.

Topic Modeling?
http://mla2013.thatcamp.org/12/31/topic-modeling/
Mon, 31 Dec 2012 18:54:08 +0000

Can anyone talk about topic modeling in the humanities? I certainly cannot, but would love to hear others talk about their experiences, tools, methods, and projects involving topic modeling. Anyone?

Building DH Community, Competency, Capacity
http://mla2013.thatcamp.org/12/31/building-dh-community-competency-capacity/
Mon, 31 Dec 2012 18:34:21 +0000

Building on bernierr1‘s proposal, Designing DH Projects, I would like to explore ways of building local community around DH to generate ideas, build competencies, and determine what infrastructure is needed to support DH work in research and classrooms. We are working now on building such a community at my institution and, while we’ve had a great start with a surprising amount of enthusiasm, we’re still working out what our next steps should be. For example, how do we keep the momentum going? What do we need to get humanists working together (and learning together)? What are good “beginner” projects and methods to get folks interested and serve as examples? How do we determine what tools/methods should be supported to meet the needs of the majority? What are some solutions to hosting/customization problems? How do we maintain the focus on scholarship in DH projects? Lots of questions — let’s get together and discuss!

Omeka Neatline and spatial-temporal visualization, anyone?
http://mla2013.thatcamp.org/12/31/omeka-neatline-and-spatial-temporal-visualization-anyone/
Mon, 31 Dec 2012 15:17:09 +0000

All:

This workshop suggestion focuses on a tool, the Neatline map tool, that in some ways follows up on the discussion about the use of Scripto for Omeka projects. It follows first because of an underlying interest in Omeka, and second because of an interest in visualizing archival data in interesting ways (which is what I take the text- and archive-based work in Scripto to be in the service of accomplishing). As I am sure a number of you know (Patrick in particular), Neatline has the benefit, or drawback, of being a UVA product, so it is not currently supported by the Omeka.net hosting service and its server-based plugin library. Bummer as that may be, the advantage, I think, is the increased potential flexibility of the tool.

My interest has been for some time now to marry textual/archival data with cartographic and spatial matter in order to create a richer and deeper data set. Think of this as an exploration of “big data” for cartographic or spatial humanists. One of the principal difficulties I see inherent in GIS platforms is the distinction (often under-represented) between powerful visualization and presentation platforms and powerful analytical platforms. We have begun to see strongly interpretive tools in corpus analysis and data mining applications; we have not, to my mind, seen the same evolution in the realm of humanist data visualization, in particular visualizations tied to time and space through GIS technologies like Neatline. So, I would open this up for a potential workshop that might address themes like: what are the analytical potentials of GIS-based technologies? How do we see archival inquiries and historical investigations productively in conversation with the cartographic imagination? What are the limitations of tools like Neatline, or perhaps Google Earth / Google Maps? I mentioned QGIS in an earlier post and it got a big thud of silence, so maybe this is a better approach, but if anyone wants to dig into technical GIS (as in ArcGIS or QGIS), I’d be open to that as well. Any takers?


Capturing Tweets and Twitter Networks at MLA 2013
http://mla2013.thatcamp.org/12/01/capturing-tweets-and-twitter-networks-at-mla-2013/
Sat, 01 Dec 2012 21:51:52 +0000

Meeting to discuss strategies for capturing and sorting the MLA 2013 Twitter feed, and tools (such as NodeXL) for mapping the MLA 2013 Twitter network. If possible, it would be great if we could work on this as a team during the MLA and list everyone as co-authors on what is produced. An experiment in collaboration, then.

THATCamp MLA / New England?
http://mla2013.thatcamp.org/04/02/mla2013/
Mon, 02 Apr 2012 22:32:36 +0000

Have you heard that there might be a THATCamp before next year’s Modern Language Convention in Boston? ‘Cause I’ve heard that. Interesting.
