Categories
Announcement DH

Now Hiring! Mellon Digital Humanities Fellow

The Washington & Lee University Library seeks to fill a two-year grant-funded position to facilitate digital humanities teaching and research that promotes the Library’s role as a center of campus-wide academic engagement. As a member of a small organization, this position requires teamwork, collaboration, and self-motivation. The library is deeply engaged in the university’s digital humanities initiatives and is committed to developing and expanding its participation in scholarly communications, digital research methodologies, digital archives, discovery tools, data science, and other critical information services needed for teaching, learning, and research. The Mellon Fellow will participate in developing the digital humanities curriculum and in teaching digital humanities research methods.

We seek a fast learner with the aptitude and passion to thrive in the technologically complex learning and research environment that is the modern academic library. Successful candidates will enjoy the freedom to develop the position in a way that aligns with their own personal and professional interests while serving the core needs of both the library and the digital humanities curriculum. The position carries with it support for research time and funding for professional development.

For more information, see the complete job listing.

Categories
DH Event on campus Speaker Series

DH Speaker Series: Alex Gil

Join us for a talk by Alex Gil, the Digital Scholarship Coordinator of the Humanities and History Division at Columbia University Libraries.

Monday March 13, 2017
12:15-1:15pm
Hillel 101
Lunch provided, please register


The Globe is Not a Circle: The New Life of Words and the Broken Scholarly Record

In this talk, Alex Gil follows the present and possible future of our scholarly production and its uneven flows around the world. Although “the scholarly record” as a concept does not translate well into other languages, and its outlines are difficult to define, its existence is not in question. At a time when our archives and libraries are in a period of transition to hybrid registers—both analog and digital—we see a shift in the divisions of labor and interpretive frameworks resulting from these changes in the production of this record. An opening for understanding these developments and designing sensible practices can be found in the idea of an *infrastructural critique* advanced by Liu and Verhoeven, and in a recasting of digital humanities as a hermeneutic praxis with material consequences. In particular, Gil will argue for a form of this infrastructural critique which he and others call minimal computing.

Alex Gil specializes in twentieth-century Caribbean literature and Digital Humanities, with an emphasis on textual studies. His recent research in Caribbean literature focuses on the works and legacy of Aimé Césaire, including work in Aimé Césaire: Poésie, théâtre, essais et discours published by Planète Libre in 2013. He has published in journals and collections of essays in Canada, France and the United States, while sustaining an open-access and robust online research presence. From 2010 to 2012 he was a fellow at the Scholars’ Lab and NINES at the University of Virginia. He is founder and vice chair of the Global Outlook::Digital Humanities initiative and the co-founder and co-director of the Group for Experimental Methods in the Humanities and the Studio@Butler at Columbia University. He serves as Co-editor for Small Axe: Archipelagos and Multilingual Editor for Digital Humanities Quarterly. Alex Gil is actively engaged in several digital humanities projects at Columbia and around the world, including Ed, a digital platform for minimal editions of literary texts; the Open Syllabus Project; the Translation Toolkit; and In The Same Boats, a visualization of transatlantic intersections of black intellectuals in the 20th century.

Categories
DH Event on campus Speaker Series

Report on “Digital Humanities, Data Analysis and Its Possibilities”

As part of the DH Speaker Series, I attended the talk by University of Richmond Assistant Professors Lauren Tilton and Taylor Arnold, in which they discussed data analysis and how they have used it in different ways in their digital humanities research. Lauren and Taylor’s presentation of the critical role of statistics and data analysis in DH was really interesting. They pointed out that statisticians often just put out data without critically analyzing the information or presenting their findings in an engaging manner to a larger audience. They posed the question: how do we communicate our results to a PUBLIC audience? I think that their project, titled Photogrammar, is an awesome website that effectively communicates statistical analysis in a really cool way.

Taylor and Lauren were interested in analyzing the Library of Congress’ archive of photography from the FSA era. The photographers hired by the FSA were tasked with documenting poverty, largely in the American South, during the Great Depression and the Dust Bowl. Taylor and Lauren worked closely with the Library of Congress staff in order to turn their photographic collection into a user-friendly database. They compiled the data and began to analyze the statistics. Lauren said their analysis caused a “fundamental change in our understanding of this collection” and opened up a whole new series of questions. For example, the data analysis showed that the number of FSA photos from the war era and the number from the New Deal era are actually quite similar. Many people associate the FSA photographers with the Great Depression and the New Deal, and may not even know that the FSA continued their photographic endeavor into World War II.

The database is super user-friendly and makes it much easier to find what you are really looking for than the search engine on the Library of Congress webpage for the collection. On Photogrammar, you can find images based on the county where they were photographed, or even based on color palette. In the most recent segment of the project, Taylor and Lauren used computer software to identify faces and certain images in a photograph, looking for repetition or patterns, in order to rebuild entire photo strips from a specific photographer’s camera. This feature is amazing because it allows the user to track the photographer’s line of vision, tying the visual images and the story of their production together.

I became really interested in photography after taking a History of Photography course during my sophomore year winter term. Taylor and Lauren’s discussion of their project was so helpful because it showed me how data analysis (a term that somewhat intimidates me) can help people better understand and engage with a topic in the humanities. Photogrammar answers so many research questions, just through the different features of its interactive map of the U.S. It lets me see the main regions that Walker Evans photographed in or the counties that were photographed the most during the Dust Bowl. As Lauren and Taylor stated in their talk, photographs can tell us a lot about the culture and background of an era. Their project provides a simpler yet more interesting way of understanding these photographs and the culture that surrounded them.

-Hayley Soutter, DH Undergraduate Fellow

This program is made possible by a grant from the Andrew W. Mellon Foundation.

 

Categories
Conference DH Event on campus

Report on UNRH Conference 2017

This year’s UNRH (Undergraduate Network for Research in the Humanities) Conference was hosted at W&L. For those who are not familiar with them, UNRH is a group of undergraduate students interested in learning about and experimenting with innovative research methods in the humanities. Two W&L students, Lenny Enkhbold (’17) and Lizzy Stanton (’17), were part of the founding group that started the conference in 2015. “Having worked on this project for over two years, it was very rewarding to have received so much support and being able to actually experience the results. I know Lizzy feels the same way as well,” Lenny said. “We listened to the feedback from last year and tried to make the adjustments on any category that the participants from last year thought we could improve on.”

The various sessions for this year’s conference were hosted in the new Center for Global Learning. Over the weekend of January 20-22, students from different colleges and universities across the country gathered to discuss their projects and to attend DH workshops.

Formal presentations began Saturday morning (check out the full schedule here). During the morning session, four different groups presented the cool projects they have been working on.

In the first presentation, titled “Digitizing a Church,” two students from Lake Forest College told us about their four-week endeavor of creating a virtual reality of a church near their campus. The most interesting aspect of their project was its interactive nature; you could simply click on the stained glass windows of the church and a pop-up window would detail their importance. The students demonstrated their belief that virtual realities can help change the education industry by allowing students to really engage with the material in a digital representation, and could even replace field trips in the future.

Students from the University of South Carolina presented their app called “Ward One,” which they created in a classroom setting. The students wanted to heighten awareness about Ward One, a historically African American community that has been destroyed by development. The app allows people to explore the community as it was and highlights historical monuments in the area. The students have received immense positive feedback from the city. During the presentation, a taped interview showed a woman who had lived in the neighborhood stating that the app made her feel like “finally someone cares.”

One group, who detailed their experience creating their online game titled “Chronicle of Swashbuckling Rubbish,” was asked why they created the project. In response, they replied, “We wanted to create something and so we did.” Although the two presenters are English and music education majors at Cornell College, they found a way to channel their different skills into a digital project.

The afternoon consisted of round robin sessions, which I was unable to attend. But Lenny, a host contact for this year’s conference, said that the afternoon was a great way to wrap-up the day. “It was nice to change up the presentation style and keep everyone fresh rather than having two more hours of sit-down formal presentations,” he said. (see photos from this year’s conference here)

Lenny, Lizzy and the rest of the leadership team seemed really excited about their progress and are already seeking volunteers for next year’s conference. I thought the conference was a really awesome event that allowed students to present their work to a wider audience of their peers from different schools, majors, backgrounds, etc.

Check out all the tweets from the conference here: https://storify.com/hsoutter/unrh-conference-2017

Categories
DH Event on campus Speaker Series

DH Event: Digital Humanities, Data Analysis and Its Possibilities

Visiting us from the University of Richmond, Assistant Professors Lauren Tilton and Taylor Arnold will give a talk on exploratory data analysis methods, which have received limited visibility in DH. In this talk, they will give an overview of the historical developments of exploratory data analysis and statistical computing. They will show, through examples from their work on visual culture, how both have the potential to shape digital humanities projects and pedagogy.

Thursday, February 2nd, 2017
12:15-1:15
IQ Center (Science Addition 202A)
Please register



Lauren Tilton is Visiting Assistant Professor of Digital Humanities at the University of Richmond and member of Richmond’s Digital Scholarship Lab. Her current book project focuses on participatory media in the 1960s and 1970s. She is the Co-PI of the project Participatory Media, which interactively engages with and presents participatory community media from the 1960s and 1970s. She is also a director of Photogrammar, a web-based platform for organizing, searching and visualizing the 170,000 photographs from 1935 to 1945 created by the United States Farm Security Administration and Office of War Information (FSA-OWI). She is the co-author of Humanities Data in R (Springer, 2015). She is co-chair of the American Studies Association’s Digital Humanities Caucus.


Taylor Arnold is Assistant Professor of Statistics at the University of Richmond. A recipient of grants from the NEH and ACLS, Arnold’s research focuses on computational statistics, text analysis, image processing, and applications within the humanities. His first book Humanities Data in R, co-authored with Lauren Tilton, explores four core analytical areas applicable to data analysis in the humanities: networks, text, geospatial data, and images. His second book, the forthcoming A Computational Approach to Statistical Learning (CRC Press 2018), explores connections between modern machine learning techniques and statistical estimation. Numerous journal articles expand on these ideas in the context of particular applications. Arnold has also released several open-source libraries in R, Python, Javascript and C. Visiting appointments have included Invited Professor at Université Paris Diderot and Senior Scientist at AT&T Labs.

Categories
DH Undergraduate Fellows

Intro to Command Line: Mac Edition

Hacking on a Mac

Command Line According to Abdur

Hello and welcome to your introduction to command line on a Mac! If you have no idea what command line is, you’ve come to the right place. To clarify, we’re not actually going to be “hacking” into anything. But when you’re being especially productive in the terminal and flying along, it’s like you’ve unlocked the inner workings of your computer, and it kinda feels like hacking. If you don’t know what a terminal is or what any of this even means, don’t worry! All will be revealed soon.

To start, command line is an interface that allows the user to control features of a computer with text commands. Most people have only used the Graphical User Interface (GUI) to make their computer do what they want. The command line allows you to bypass the GUI and do everything directly and, in most cases, instantly. For example, say you want to move a file from one folder to another. The GUI involves dragging and dropping the file using the trackpad or mouse, or using the keyboard to copy and paste the file. Here’s a demo of the difference in the processes:

With the GUI, I’ll drag and drop the “Earth.txt” file into the Milky Way folder on my desktop.

Desktop

Done. That was pretty simple.

Milky Way

With the Terminal, I can do the same thing with another file called “Mars.txt”. First, I start in my home folder on my computer. When working with command line, folders are called directories, which is important to remember because the computer will only recognize certain commands. For example, to move into the Desktop directory from my home directory, I have to use

$ cd Desktop

This command stands for “change directory”. We can’t do “cf” for “change folder”. This is because Macs run on the UNIX operating system, and the commands come from UNIX. The names may seem arbitrary, but since it’s a specific language for communicating with a computer, it’s the standard way to do this specific task. It’s just how it works. From there, I make sure I have Mars.txt in my Desktop directory with

$ ls

which lists the directory’s contents. Once I know it’s there, I use

$ mv Mars.txt ~/Desktop/"Milky Way"

Terminal

This command has three different parts. The mv is the actual command; it moves a file or directory. The two following parts are the arguments for the command. The first argument is the file and the second is the file’s desired location. Again, this is the standard on a Mac, and you should know the rules to move quickly through your directories.

So, that’s how you can accomplish one simple task with command line.

Milky Way
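The whole demo can be replayed as one short sequence. This is just a sketch using the same folder and file names as above; the quotes around "Milky Way" matter because the folder name contains a space:

```shell
cd ~/Desktop              # hop from the home directory into Desktop
mkdir -p "Milky Way"      # make sure the destination folder exists
touch Mars.txt            # create a stand-in file to move
ls                        # confirm Mars.txt is listed
mv Mars.txt "Milky Way"   # move it; the quotes handle the space in the name
ls "Milky Way"            # verify it arrived
```

(The `mkdir -p` and `touch` lines just set the stage so the sequence runs anywhere; on your own machine you'd already have the folder and file.)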

The app I’m using is Terminal, the default command line interface tool that comes with Macs. There are other command line apps out there, and they all come with different features. Whichever one you use depends mostly on personal preference, so I’m fine with Terminal for now. There’s a certain element of novelty when you’re using the Terminal to control your computer if you’ve only ever used the GUI to make your way through the folders and applications. On a Mac, I expected this process to be a bit difficult, but so far it’s been simple to install the various tools we’ve been using during the semester. Almost all the problems I had were because I made a mistake during the installation process.

However, there are a few reasons that computers these days are designed for GUI use. The main reason is user-friendliness. Command line can be clunky, especially if you’re new to it and are learning on your own. Commands aren’t always obvious and it takes some getting used to. It’s a powerful tool once you know what you’re doing, but first you have to learn. I was fortunate to have learned the basics of UNIX in my Intro to Programming class during my junior year. Coming armed with that knowledge of how to navigate directories and files meant I just had to relearn all the commands instead of memorizing them from scratch. Brandon made a very useful command line quiz that made the process even simpler. Of course, you don’t need to worry if some keywords fall through the cracks because it’s easy to look anything up, but command line is all about speed, and knowing everything in your head makes it that much easier. The commands are intuitive once you get to know a few of them, and most of the time if you forget one, you can guess what it is.
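For reference, here is the handful of commands that covers most day-to-day navigation. The file and folder names are just examples; each line's comment says what the command does:

```shell
pwd              # print which directory you are currently in
ls -la           # list everything here, including hidden files
mkdir notes      # create a new directory called notes
cd notes         # move into it
cd ..            # move back up one level
touch a.txt      # create an empty file
cp a.txt b.txt   # copy a file
mv b.txt notes   # move (or rename) a file
cat a.txt        # print a file's contents to the screen
rm a.txt         # delete a file -- there's no trash can here, so be careful
```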

Speaking of making it easier, one of the most interesting parts of command line (to me) is Vim. Vim is an incredibly basic text editor that’s included in Terminal. For many people, it’s faster to open a .txt file or a .md file in Sublime or Atom instead of using Vim. I usually work with Terminal and a Safari window in full screen together, so opening another application’s window means I can’t see my resources while I work on a file. For me it’s faster to open the file in Vim and edit it right there in Terminal rather than switch to a different text editing app. It took a while to learn because it is not intuitive at all, but once you get the hang of it, it’s a useful tool for basic files of code or text. You’ll probably hate it, and that’s fine! I know I’m in the minority when I say I like it, so I get it. All this is deeply customizable; you can change everything about your Terminal, or iTerm, or whichever app you use for command line tools. The beauty of command line is that because you’re not using the GUI, you aren’t restricted in what you can do on your computer, so you can also change everything about your computer if you want. To give you some insight, here’s a screenshot of my customized Terminal with Vim open on this blog post.

vim

I told you that command line was fun, but I’m sure all of this may not sound all that fun to you. That’s ok! The fun part comes now: Github. Aidan wrote a post about Github from his perspective which you should read to get an idea of how it works on a PC. There are some differences in getting it to work, but the process is the same. Github consists of repositories, mainly for code, but all sorts of data can be hosted on it. For example, the original code used to launch Apollo 11 is hosted in a repository on Github for anyone to look at. Github uses Git, a version control system, to track changes and move them between your computer and the repositories. After developing a long document of code or a blog post, you’ll have to push it to our Github repository. I had to look up how to do it the first few times, but the process is extremely satisfying. It’s fun to work on a project on your computer and instantly see it on the internet as soon as you’re done. I’ll be writing another post about Github soon, so don’t worry about the details just yet.
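The push itself is only a few commands once Git is set up. Here's a rough sketch of the cycle for a blog post; the filename and the branch and remote names are placeholders, and the final `git push` assumes a remote called origin has already been configured for the repository:

```shell
git status                    # see which files have changed
git add 2017-02-01-post.md    # stage the file(s) you want to include
git commit -m "Add new post"  # record a snapshot with a short message
git push origin master        # send the commit up to Github
```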

Hopefully this provided some insight on how command line and Terminal work on Macs. The resources linked in the post should help on your journey through learning all about command line tools, and if not, feel free to email/Slack me if you’d like. Happy hacking!

Categories
DH Event on campus

Open Office Hours in the DH Workspace


Happy New Year! We have something new on offer here at DH @ WLU. Starting in January, we’ll be holding open office hours in the DH Workspace (Leyburn 218). Every other week we will have two time slots open to students, faculty, and staff to drop by, ask questions, bring project work, or simply learn what others are working on. If you’d like to learn more about a particular DH method, just let us know and we’ll tap the Digital Humanities Action Team or our DH Undergraduate Fellows for a workshop. Potential workshop ideas include command line, Git and Github, HTML/CSS, Markdown, text analysis, TEI, or video editing. Bring your laptop and we’ll hack/yak away!


Meeting times

  • Tuesdays, 1:30-3pm
  • Thursdays, 3-4:30pm

Schedule

  • January 10 & 12
  • January 24 & 26
  • February 7 & 9
  • February 21 & 23 (Washington break)
  • March 7 & 9
  • March 21
  • March 28 & 30
  • April 4 & 6
  • April 11 & 13
  • April 18 & 20
Categories
Announcement DH Research Projects Tools

New Resource – Ripper Press Reports Dataset

[Crossposted on my personal blog.]

Update: since posting this, Laura McGrath reached out about finding an error in the CSV version of the data. The version linked to here should be cleaned up now. In addition, you will want to follow steps at the end of this post if using the CSV file in Excel. And thanks to Mackenzie Brooks for her advice on working with CSV files in Excel.

This semester I have been co-teaching a course on “Scandal, Crime, and Spectacle in the Nineteenth Century” with Professor Sarah Horowitz in the history department at W&L. We’ve been experimenting with ways to make the work we did for the course available for others beyond our students this term, which led to an open coursebook on text analysis that we used to teach some basic digital humanities methods.

I’m happy to make available today another resource that has grown out of the course. For their final projects, our students conducted analyses of a variety of historical materials. One of our student groups was particularly interested in Casebook: Jack the Ripper, a site that gathers transcriptions of primary and secondary materials related to the Whitechapel murders. The group drew on just a few of the site’s materials for their analysis, since they only had time to copy and paste a handful of things from the archive for use in Voyant. I found myself wishing that we could offer a version of the site’s materials better formatted for text analysis.

So we made one! With the permission of the editors at the Casebook, we have scraped and repackaged one portion of their site, the collection of press reports related to the murders, in a variety of forms for digital researchers. More details about the dataset are below, and we’ve drawn from the descriptive template for datasets used by Michigan State University while putting it together. Just write to us if you’re interested in using the dataset – we’ll be happy to give you access under the terms described below. And also feel free to get in touch if you have thoughts about how to make datasets like this more usable for this kind of work. We’re planning on using this dataset and others like it in future courses here at W&L, so stay tuned for more resources in the future.


Title

Jack the Ripper Press Reports Dataset

Download

The dataset can be downloaded here. Write walshb@wlu.edu if you have any problems accessing the dataset. This work falls under a CC BY-NC license. Anyone can use this data under these terms, but they must acknowledge, both in name and through hyperlink, Casebook: Jack the Ripper as the original source of the data.

Description

This dataset features the full texts of 2677 newspaper articles published between 1844 and 1988 that reference the Whitechapel murders by Jack the Ripper. While the bulk of the texts are, in fact, contemporary to the murders, a handful of them skew closer to the present as press reports for contemporary crimes look back to the infamous case. The wide variety of sources available here gives a sense of how the coverage of the case differed by region, date, and publication.

Preferred Citation

Jack the Ripper Press Reports Dataset, Washington and Lee University Library.

Background

The Jack the Ripper Press Reports Dataset was scraped from Casebook: Jack the Ripper and republished with the permission of their editorial team in November 2016. The Washington and Lee University Digital Humanities group repackaged the reports here so that the collected dataset may be more easily used by interested researchers for text analysis.

Format

The same dataset exists here organized in three formats: two folders, ‘by_journal’ and ‘index’, and a CSV file.

  • by_journal: organizes all the press reports by journal title.
  • index: all files in a single folder.
  • casebook.csv: a CSV file containing all the texts and metadata.

Each folder has related but slightly different file naming conventions:

  • by_journal:
    • journal_title/YearMonthDayPublished.txt
    • eg. augusta_chronicle/18890731.txt
  • index:
    • journal_title_YearMonthDayPublished.txt
    • eg. augusta_chronicle_18890731.txt

The CSV file is organized according to the following column conventions:

  • id of text, full filename from within the index folder, journal title, publication date, text of article
    • eg. 1, index/augusta_chronicle_18890731.txt, augusta_chronicle, 1889-07-31, “lorem ipsum…”
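Since the first four columns (id, filename, journal title, date) contain no commas themselves, you can pull quick summaries straight from the CSV with standard command-line tools. Here is a rough sketch, assuming casebook.csv sits in the current directory; it treats each line as one record, so if any article text contains embedded newlines the counts will be approximate, and a real CSV parser is the safer route:

```shell
# Tally how many press reports each journal contributed,
# reading the journal title from the third comma-separated column
awk -F',' '{ counts[$3]++ } END { for (j in counts) print counts[j], j }' casebook.csv | sort -rn | head
```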

Size

The zip file contains two smaller folders and a CSV file. Each of these contains the same dataset organized in slightly different ways.

  • by_journal – 24.9 MB
  • index of all articles – 24.8 MB
  • casebook.csv – 18.4 MB
  • Total: 68.1 MB uncompressed

Data Quality

The text quality here is high, as the Casebook contributors transcribed them by hand.

Acknowledgements

Data collected and prepared by Brandon Walsh. Original dataset scraped from Casebook: Jack the Ripper and republished with their permission.


If working with the CSV data in Excel, you have a few extra steps to import the data. Excel has character limits on cells and other configurations that will make things go sideways unless you take precautions. Here are the steps to import the CSV file:

  1. Open Excel.
  2. Make a blank spreadsheet.
  3. Go to the Data menu.
  4. Click “Get External Data”.
  5. Select “Import Text File”.
  6. Navigate to your CSV file and select it.
  7. Select “Delimited” and hit next.
  8. In the next section, uncheck “Tab” and check “Comma”, click next.
  9. In the next section, click on the fifth column (the column one to the right of the date column).
  10. At the top of the window, select “Text” as the column data format.
  11. It will take a little bit to process.
  12. Click ‘OK’ for any popups that come up.
  13. It will still take a bit to process.
  14. Your spreadsheet should now be populated with the Press Reports data.
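The precautions are needed because Excel caps a single cell at 32,767 characters, and some article transcriptions are long. If you'd like to check from the command line whether the file will hit that limit, here is a quick sketch; since the article text is the last field on each line, the longest line is a decent proxy for the longest cell:

```shell
# Print the length (in characters) of the longest line in the CSV;
# anything much over 32,767 will not fit into a single Excel cell
awk '{ if (length($0) > max) max = length($0) } END { print max }' casebook.csv
```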
Categories
Announcement DH Event on campus Pedagogy Speaker Series

Day of DH @ Winter Academy 2016

As the term wraps up, join us on Tuesday, December 13th for our “Day of DH” at W&L’s annual Winter Academy. You’ll have the chance to hear from both your colleagues and guests from the University of Virginia about digital projects and pedagogy. There will be lots to chew on, including lunch, so don’t forget to register!

10:30am-11:30am Mellon Summer Digital Humanities Research Grant
Come hear the inaugural awardees of the Mellon DH Summer Research Grants discuss the application process and their research. You will also learn about the benefits of the Mellon Summer DH Research Grant, as well as how to go about becoming a Mellon researcher. With the application deadline less than two months away, this is the perfect time to begin considering summer funding options for you and your students.
12:15pm-1:45pm Digital Humanities in a Liberal Arts Context

With support from ACS and the Mellon Foundation, W&L professors have invited UVA graduate students to facilitate workshops on digital humanities topics in their courses. On Tuesday, December 13, speakers from UVA will discuss digital humanities, pedagogy, and the collaboration. UVA graduate students, faculty, and staff will discuss their experiences working with W&L courses and also present on a variety of topics related to their research and experience teaching with digital humanities. We will have ample time for conversation, as we hope the event will seed future collaborations between people at both institutions. Lunch will be provided.
Speakers from the Scholars’ Lab at UVA include Jeremy Boggs, Nora Benedict, and Shane Lin.
Categories
Conference DH Undergraduate Fellows

#BUDSC16

The Icebreakers – the 2016 version of the annual Bucknell band photo.

It’s been just over a week since I went to my first academic conference (!) and I’m ready to share my experience. I went to BUDSC 16 over the weekend of October 28 – 30th with Mackenzie and Brandon.

The Bucknell University Digital Scholarship Conference is held at Bucknell University (duh) in Lewisburg, Pennsylvania every year. This year, Mackenzie and Brandon graciously inserted me into their presentation about writing in the context of digital scholarship and digital humanities. More on that later. Overall, the conference was an extremely interesting and enlightening experience. It was also a lot of fun and I’m thankful that my professors/supervisors are great people to go road tripping with. Besides the endless icebreakers and a wrong turn in Pennsylvania, there was a lot to take in during the days of the conference. Here’s a brief rundown of what I participated in during the conference:

  • Arrived on Friday night, too late for conference activities, so we got dinner and prepared for Saturday.
  • Saturday morning: I attended a panel about “Reframing Art History Through Digital Approaches.” To me, it was pretty classic DH work. The panel included presentations by Bucknell University Art History students, who explored collections in the Packwood House museum in Lewisburg and used Omeka to archive the most interesting ones, and a presentation about digital tools and how to use them for pedagogy and teaching resources, specifically for art history.
  • After a break, Mackenzie and I went to a panel called “Re-Envisioning and Reclaiming History”. This one started off with a great exhibition of a project from Middlebury College. You can watch it for yourself at collinwoodfire.org. It was an incredible multimedia website with an animated movie, newspaper articles, photographs, and other archival data to present the history of the Collinwood School fire in 1908. I highly recommend you check it out, especially the video. The latter half of the presentation was about using digital scholarship in the humanities to engage various audiences outside of academic environments.
  • The conference picked up for me at lunch. Safiya U. Noble, a professor at UCLA, gave the keynote address about “Power, Privilege, and the Imperative to Act in the Digital Age.” Her talk had to do with how digital applications and technology can perpetuate negative stereotypes and serve to reinforce oppressive ideology and history. Professor Noble brought up Google’s search results, the human and environmental impacts of the latest iPhone, and the sustainability of a programmer’s code. Everything was thought-provoking and made a lot of sense to me. It wasn’t necessarily enjoyable to listen to, but it was definitely something that many people needed to hear.
  • Next, Brandon and I went to a workshop that introduced visualizing data on social media such as Instagram and Twitter using websites such as Netlytics and Socioviz. Good to know if I intend to do any data mining for a future project.
  • Our presentation! The other two presentations before ours were also about the role of “Digital Scholarship in Higher Education.” Brandon and Mackenzie discussed and presented the uses of writing as a digital humanities method. Writing for the digital humanities, such as for blogs or to communicate in public (much like how I’m doing now) is very different from writing for an academic setting. I spoke for a bit about my perspective on how DH writing and blogging are different from writing that I’ve done for essays and/or literary analyses in my classes. I also discussed my honors thesis and how I’ll be incorporating DH writing into the notion of a traditional thesis.
  • Cocktail hour – students and professors presented about their various projects with posters and Powerpoints over hors d’oeuvres. I had the pleasure of meeting students from Lafayette College and their professor, and seeing their presentations. And that was it for Saturday.
  • Sunday morning was almost as eye-opening as Professor Noble’s keynote. The first presentation was a little different from the two after. Dr. Heil presented about the pedagogy of large-scale digital scholarship. The next two were more about using digital scholarship to break down borders (the theme of the conference) in ways that may not be obvious. Sandra Nelson discussed the inherent phallogocentricity of programming and coding languages, while Emily McGinn talked about the Western and English-dominated aspects of digital scholarship.

Overall, BUDSC 16 was a worthwhile trip and a great opportunity for me. I thought presenting would be the biggest part of the experience for me, but the rest of what I learned and observed was much more valuable than a few minutes of speaking time. There are a couple more conferences this year that I might have the pleasure of attending and presenting at, and I’m looking forward to them.