Just weeks before Easter, Dan Cohen's Syllabus Finder--dead since 2009--returned to the net. Sort of. The soul of the popular web tool for education research has transmigrated from its previous incarnation as a search service hosted by George Mason's Center for History and New Media into a new form: a downloadable database comprising some 1 million syllabi.
Cohen created the Syllabus Finder to work with the Google SOAP Search API back in 2002, and for seven years, the Syllabus Finder's interface put the CHNM web server in touch with the Google web server to answer users' custom syllabus queries by combing the online syllabi of hundreds of educational institutions. Then in 2009, Google deprecated the use of its original API, and the Syllabus Finder was no more--just a sad text box on the CHNM page saying that the Syllabus Finder hoped to return one day re-coded and thanking users, until then, for their patience.
Syllabus Finder, it seemed, had met a fate that is not uncommon among web services--an early forced retirement triggered by compatibility issues. While the web is constantly growing, ever larger and ever deeper, it is also constantly outgrowing formats, versions, and so forth. Thus a web service coded to run with Google's all-powerful search winds up with the shelf life of Google's Search API. For Syllabus Finder, that life turned out to be about seven years.
Then on March 30 of this year, an announcement on Cohen's blog made the old news of Syllabus Finder's death if not greatly exaggerated then at least no longer strictly relevant. Syllabus Finder was returning, reborn as the collected results of thousands and thousands of user queries made during the tool's live run. Cohen claims that the Finder actually processed some 1.3 million user requests in its day, so the amassed database is considerable, to say the least, and represents an invaluable data mining opportunity for education researchers interested in big picture pedagogical and bibliographic trends.
Cohen himself has already published an illustrative article using the data set to assess the books assigned in hundreds of American history survey courses, and more interesting studies on how particular subjects were taught in aggregate at the opening of the 21st century are no doubt forthcoming. Kudos to Cohen for recognizing that he was sitting on a mountain of data that was, in fact, a fascinating object of investigation and moreover, for making all of that data publicly available. Google may have left Syllabus Finder behind, but Cohen has given the old program a creative and exciting second life.
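For readers curious what a study along the lines of Cohen's might look like in miniature, here is a hypothetical sketch of counting how often particular books turn up across a directory of plain-text syllabi. The file layout, the hand-picked title list, and the function name are all invented for illustration; the actual structure of the released database may differ, and a real study would draw titles from a bibliographic authority file rather than a short list.

```python
from collections import Counter
from pathlib import Path

# Hypothetical titles to look for; a real analysis would match against
# a full bibliographic dataset, not a hand-picked list like this one.
TITLES = [
    "a people's history of the united states",
    "the souls of black folk",
    "democracy in america",
]

def count_assignments(syllabus_dir):
    """Count how many syllabi mention each title at least once."""
    counts = Counter()
    for path in Path(syllabus_dir).glob("*.txt"):
        text = path.read_text(errors="ignore").lower()
        for title in TITLES:
            if title in text:
                counts[title] += 1
    return counts
```

Even a naive frequency count like this begins to surface the kind of big-picture assignment trends Cohen's article explores; the interesting methodological work lies in normalizing titles and editions across a million noisy documents.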
Definitive answers may prove a long time in coming, but April 2011 will surely go down as a month that posed some large and essential collegiate questions--about the viability of large-scale, high-quality distance education, about the ability of the traditional university to adapt to a changing technological and economic context, and particularly, about how the answers to those first two questions figure into the future of the University of California system.
Flash back to just shy of a year ago: last May, UC leadership announced an ambitious plan for an online education pilot program that would feature for-credit/towards-degree courses for enrolled undergrads and would, in theory, increase both educational access and system tuition revenues. The cost to launch the program of 20 to 40 new courses over the next two years was estimated at $5-6 million, and to assuage concerns (naturally heightened by a context of public sector belt-tightening), university officials said they expected the program to be entirely funded by external sources. In October 2010, the California Legislative Analyst's Office released a report touting distance learning initiatives as the key means of improving access and "increasing efficiency", and UC vice-provost of academic planning and programs Daniel Greenstein (former director of the California Digital Library) endorsed the report as "educational public policy at its best."
Now jump ahead to the eventful first week of April 2011. At a major college trusteeship conference on the 3rd, UC president Mark Yudof asserted that rather than persisting in a false and ultimately self-defeating "myth" that UC can "cut [its] way to survival", the system would set an example for public universities by committing to alternate and innovative delivery models largely predicated on distance learning. Later that week, the UC Newsroom followed up with a triumphant announcement that the Gates- and Hewlett-backed Next Generation Learning Challenges program had pledged nearly $750,000 to support the UC distance learning pilot program.
But Vice-Provost Greenstein made another, different announcement the same day: the NGLC grant notwithstanding, securing the external funding that the program required had, in fact, proved difficult--perhaps prohibitively so. Instead, Greenstein explained, the university will be borrowing at least $2 million to get the project up and running by next January, and in order to repay the borrowed money (which could ultimately run several million dollars higher than the initial $2 million), the online education plan has now been expanded to include additional offerings for students not enrolled in the UC system. This reversal on the original external funding plan (even if made by necessity) represents, in effect, a doubling down on the initial ambitious commitment to the distance learning program, and the University of California is positioning itself as both a thought leader and the laboratory host to a high stakes experiment.
In the abstract, this is hardly a bad thing, and most commentators seem to agree that incorporating distance learning into education at an elite level is an idea whose time has come. But the devil is in the details, and the general consensus on need falters at the point of implementation. The week before Yudof made his pronouncements on e-learning at the Association of Governing Boards conference, online education experts from three public universities cautioned an audience at the Sloan Consortium's annual blended learning conference about the long and tough road to profitability that optimistic distance learning initiatives are likely to face. The Sloan speakers' warnings have echoed through the academic/educational blogosphere in recent weeks, and it was against that collective counterpoint that the UC leadership made its string of April announcements.
But careful commentators are noting that the arguments made at the blended learning conference (and elsewhere) weren't against implementing distance education programs; rather, they were against regarding those programs as any sort of quick cash silver bullet for budgetary woes. The argument was pushing the long view and a healthy dose of fiscal patience: all of the evidence on e-learning suggests that the initial investment in technical infrastructure and course development will significantly outweigh early revenues. If, as Yudof contended, the university can't hope to cut its way to survival, neither can it hope to tap an easy flowing river of tuition money coursing lazily through cyberspace.
The big questions coming out of the developments of April, then, hinge on implementation. These questions abound and range from the theoretical (What is lost/gained by leaving the brick and mortar classroom for the web?) to the technical (How will student engagement be facilitated?) and of course, the financial (How will instructor compensation for virtual courses compare with that of traditional teaching?).
With its significant borrowed outlay, UC is setting itself up to explore these questions as a pioneer in large-scale, high-level e-learning, but pioneering is a fraught and high-stakes business. No one wants to see the UC distance learning program follow the failures of NYUonline and U. of Illinois's Global Campus, but as yet, it's not clear how UC leadership plans to ensure the success and sustainability of the program where others have often underperformed and disappointed.
The one thing that is clear is that the University of California will have the stage and that the whole higher ed world will be watching.
In recent months, we’ve been following the progress of Open Access, as it creeps along in this decade-long process of developing from an experimental initiative at a couple of campuses—most notably, MIT—into a veritable movement.
The momentum of the growing movement is demonstrated by organizations like Right to Research, a coalition of student associations representing more than 5.5 million members worldwide. Its glossy new website makes it easy for students, professors, and librarians to get involved and voice their support for Open Access. The Scholarly Publishing and Academic Resources Coalition (SPARC) shares a similar commitment to outreach, and its popular Sparky Awards, which showcase student videos that call for Open Access, are now in their fourth year.
Three recent news items also bode well for the future of Open Access and underscore its widespread support. First: On January 20, the U.S. Department of Labor, in coordination with the U.S. Education Department, issued guidelines for a $2-billion grant program intended to help community colleges create, expand, or restructure career-training programs. The program is actively encouraging the development of course materials and learning environments openly available online. In an email to The Chronicle, U.S. Education Secretary Arne Duncan writes: “With $500-million available this year, this is easily one of the largest federal investments in open educational resources in history.”
Next: On January 12, MIT added a new section to its pioneering OpenCourseWare program: OCW Scholar, supported by a $2-million grant from the Stanton Foundation. The new initiative is specifically aimed to help “self-learners”—or perhaps community learners like those at P2PU—benefit from the course materials available online.
A third news item relates not to Open Access course materials, but to scholarly publications. A report published by Study of Open Access Publishing (SOAP), an international team funded by the European Commission, traces the path from widespread support of Open Access to surprisingly scarce Open Access publications--SOAP claims that the number of articles published in Open Access journals in 2009 represented only 8-10% of the estimated yearly global scientific output.
According to the SOAP report, almost 90% of more than 40,000 scholars across the natural sciences, humanities, and social sciences surveyed believe that Open Access journals are good for both the research community and the individual researcher. In a Chronicle post dedicated to the study, Josh Fischman notes that support for Open Access is highest amongst scholars in the humanities, with more than 90% responding favorably. Despite this widespread support, 29% of the scholars surveyed said they had not published in an Open Access journal. The two most popular reasons given were the high publishing fees, followed by the perception that Open Access journals are not of good quality.
While it may take some time for Open Access journals to achieve the prestige of their established print counterparts, or for established print journals to develop sustainable plans to make their content freely available online, there are other ways to ensure the free online distribution of scholarly research: institutional programs, like the Open Access resolution recently adopted by Oberlin College, which, rather than asking scholars to publish only in Open Access journals, ask them to assist in the creation of an institutional digital repository of work by affiliated scholars.
Last fall, the Townsend Humanities Lab published a series of posts in this space on issues relating to Open Access and academic e-publishing, and in November of 2010, the Lab hosted a lunch discussion forum on web-based alternatives to the traditional peer review system. Today, we proudly present a guest blog by Sebastiaan Faber (Oberlin Professor and Chair of Hispanic Studies), who provides insight into Oberlin College's decision to commit to Open Access.
In the fall of 2009, the faculty of Oberlin College unanimously adopted an Open Access resolution. This resolution included a double commitment. First, the faculty pledged to consistently report its scholarly publications to the institution. Second, it committed to make the full text of those publications, particularly peer-reviewed articles, openly accessible to the entire world, through a digital institutional repository.
With this resolution, Oberlin joined a growing number of institutions of higher education in committing itself to open access, following the lead of major universities such as Harvard, MIT, and Stanford. Although the principles of open access are straightforward and the Oberlin faculty agreed from the outset that they are indeed laudable, the unanimous faculty vote was the culmination of a year’s worth of debate and deliberation.
Most everyone agreed from the start that making our scholarly work as widely available as possible, and doing so for free, was a good idea—for three main reasons. First, it is good for scholarship to be openly accessible. The current, for-profit system of dissemination does not work well in that respect. It is also bleeding library budgets everywhere. Second, the internet makes open access easier and cheaper than ever. And third, Open Access will make it easier for Oberlin College to showcase to the outside world, but also to itself, its intellectual and scholarly engagement and productivity.
The challenge was to go from principle to practice. How do you translate a laudable goal into the messy and diverse world of scholarly publishing across fields, divisions, and generations? The discussion quickly ran into a host of complicating factors, of which I’ll mention three.
A first issue had to do with the way that faculty conceive of their relationship to their institution. As it turns out, we tend not to see ourselves as employees of a college or university; we are affiliated with it. We do our teaching for the institution, but we hold a much more diverse set of assumptions about the relationship between our institution and the scholarship that we produce. Sure, we tend to add our affiliation to our byline; but we don’t see our scholarship as originating in, much less belonging to, our college or university.
A second complicating factor was faculty’s relationship to journals, publishers, scholarly associations, and funding sources. Depending on their field, the way in which faculty conceive of their relationship to the other major players in the field of scholarly publication differs widely. Scholars may feel beholden to funders, to publishers, to scholarly associations that publish journals.
A third point of contention was related to career pressures: faculty fear of rocking the boat, missing out, disqualifying themselves. For most faculty, publishing is both absolutely vital and seen as somewhat precarious.
Oberlin’s Library Committee, which took on the challenge of shepherding Open Access to a faculty vote, decided on a gradual, step-by-step approach that would provide ample opportunity for questions, feedback, and discussion. This included, crucially, educating our colleagues about their rights as authors and the ease with which they generally give up many of their rights. We further organized workshops with invited speakers, and held separate discussions within the College, Conservatory, and academic divisions and departments. In addition to educating the faculty about the severe problems associated with the current system of scholarly publishing, these sessions served to dispel fears and misinformation. Some faculty were surprised to hear that a large number of journals do in fact allow for some form of author archiving. Others were relieved to hear that a mere inquiry from academic authors about a publisher’s policy on institutional repositories has never yet resulted in a penalization of the author. Of indispensable help in all these discussions was Ray English, Oberlin’s Director of Libraries, who has long been a national advocate for Open Access.
The discussions also served to make up our collective mind on the policy’s actual nature and scope. In the end, Oberlin’s Open Access Policy was largely modeled on Harvard’s. Like Harvard’s, ours is an opt-out policy: all faculty and staff make their scholarship accessible as a matter of course, while anyone can request an exemption at any time, for any publication, for any reason—provided they first report their publication’s metadata. In other words, even if the repository does not end up containing everyone’s work in full-text, it will still reflect the full range of Oberlin’s scholarly productivity. Unlike Harvard’s, moreover, Oberlin’s policy also covers chapters in collected volumes—an important publishing format for many humanists and social scientists.
Since the policy was adopted, library staff have been working on the set-up of the actual repository, modeling it on Harvard’s exemplary DASH (Digital Access to Scholarship at Harvard). Once the repository is in place, it will probably take a while for the faculty to get in the habit of reporting and submitting their publications; to help things along, the administration is devising a way to fold Open Access into the faculty’s regular reporting for purposes of promotion and salary review. The library has also committed to populating the repository with Oberlin scholarship published before the resolution was adopted. The office of Communications, meanwhile, is eagerly awaiting the moment when it can mine the repository for its purposes—showing the world that Oberlin is a vibrant intellectual community of engaging scholar-teachers who are not afraid to share their work with a global public.
We spend a lot of time in this space surveying the current landscape of the digital humanities and perhaps even more time peering into the future of the field and asking ourselves, "What's next?" On the whole, that's probably not a bad approach to an area of scholarship that seems both so full of potential and, at the same time, so aware of its potential and thus, rather permanently fixated on the "new" and the "next".
But in this week's blog, we will leave--for the moment--the bleeding edge and its discontents and address a group that seems to be increasingly left by the wayside as the digital humanities plunge ahead into the perpetual next. We want to take a moment to highlight a recently launched website that caters to the uninitiated and the curious, the new graduate students and the old traditionalists, the still-great majority of humanists typified by the scholar who, as one of the site's creators puts it, "doesn’t know what XML stands for, has only vaguely heard of Zotero, and is puzzled as to how Twitter would ever be useful for an historian."
Despite its tongue-in-cheek title, Stanford's Humanities 3.0: Tooling Up for Digital Humanities site is not a new techier-than-thou destination site for the digital humanities avant garde. On the contrary, the site was devised with an eye to inclusion and has been designed as "a starting place, an entryway for scholars interested in beginning to explore the possibilities for digital tools, programs, and methods to empower and enhance their scholarship in the humanities."
Tooling Up, produced through a collaboration between Stanford's Spatial History Project and Computer Graphics Lab under the direction of historian Jon Christensen, is organized as a series of discussion-enabled "chapters" that, taken together, are meant to serve as something of a primer on the digital humanist's methods and tools. The first two installments explore the creation and maintenance of a virtual identity and the essentials of working with digital archives. Absent the glitz and thin air of the excellent but not necessarily instructive Humanities 2.0 series currently running in the New York Times, the new Stanford initiative--if it continues to be well-executed--stands to address a real need that exists outside of the DH community while also promoting the flow of vital new blood into that community as it continues to march ahead.
There has been no shortage of posts in this space devoted to discussions of open access educational resources and the possible future(s) of web-based academic publishing and research. A recent study detailed in a paper appearing in this month's Journal of Electronic Publishing peers into that digital future and finds..."academic search engine spam"--a neologism that is strange, funny, and just may be portentous.
Authors Joeran Beel and Bela Gipp have been studying Google Scholar's ranking algorithm for some time, and the duo published an article earlier this year investigating the possibilities of what they called "academic search engine optimization" (ASEO). In that paper (which is also well worth reading), they advised scholars on “[...] the creation, publication, and modification of scholarly literature in a way that makes it easier for academic search engines to both crawl it and index it.” Not surprisingly, the paper and the very idea of ASEO sparked a controversy in the academic community.
In the opening of their new article, Beel and Gipp note that the ASEO study elicited a varied response, and the new study promises to do likewise. The authors followed up on the guidelines they laid out in the ASEO paper to test the degree to which the ranked results of academic search engines (primarily focusing on Google Scholar) can be manipulated through altered citation counts, keyword padding, and the inclusion of invisible text within academic papers. And in what has to be a first in academic research, they even manage to get Google Scholar to link to a doctored version of a research paper containing an ad for Viagra.
Accordingly, Beel and Gipp find that academic search engines can be gamed and that it isn't even terribly difficult to do so. The threat, in other words, is real, and in their concluding discussion, the authors recommend that Google Scholar and other engines "should apply at least the common spam detection techniques known from Web spam detection, analyze text for sense-making, and not count all citations." More provocatively, they aver that "the potential benefits of academic search engine spam might be too tempting for some researchers." In a paper likely to spark considerable discussion, that's the sentence that will provide the most tinder.
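To make the "common spam detection techniques" concrete, consider the invisible-text trick the study describes: keywords rendered in the same color as the page background, legible to a crawler but not to a reader. The following is a deliberately simplified sketch of one such check, scanning inline HTML styles for matching text and background colors; the function name and the heuristic are my own illustration, not a technique taken from the paper, and a production detector would need full CSS parsing rather than a regex.

```python
import re

# Match inline style attributes, e.g. style="color: white; background-color: white"
STYLE_RE = re.compile(r'style="([^"]*)"', re.IGNORECASE)

def _props(style):
    """Parse 'key: value; key: value' declarations into a dict."""
    props = {}
    for decl in style.split(";"):
        if ":" in decl:
            key, value = decl.split(":", 1)
            props[key.strip().lower()] = value.strip().lower()
    return props

def has_invisible_text(html):
    """Return True if any inline style sets its text color equal to its
    declared background color--a crude signal of hidden keyword stuffing."""
    for match in STYLE_RE.finditer(html):
        props = _props(match.group(1))
        color = props.get("color")
        background = props.get("background-color") or props.get("background")
        if color and background and color == background:
            return True
    return False
```

Even a heuristic this naive catches the crudest white-on-white stuffing; the harder cases (near-matching colors, zero-height elements, off-screen positioning) are exactly why the authors urge engines to adopt the full toolkit of web spam detection.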
Interestingly, Beel and Gipp are at work on their own, hopefully robust academic search engine, Sciplore. And in case you were wondering: this author can confirm that all of the keywords for the pair's recent article are indeed legit.
Last Monday marked the kick-off of the fourth annual Open Access Week, a global event that is roughly equal parts teach-in, networking opportunity, and digital barn-raising for the OA community. Bringing together students, teachers, researchers, and technologists who advocate for free and fast access to scholarly information, the event--or rather, web of events--runs from October 18-24.
As per the OA Week website, the community’s core idea is the promotion of "free, immediate, online access to the results of scholarly research and the right to use and re-use those results as [needed]." The standard argument, then, for open access is that it promises to maximize the benefits and exposure of published research, to facilitate scholarship across traditional institutional and disciplinary lines and over larger bodies of literature, and to generally deliver on the promise of electronic publishing for academic materials. In other words, the OA movement is founded on the idea that (scholarly) information wants to be free, and as the recent growth of the movement attests, that idea is finding plenty of adherents throughout the academic and research worlds.
Monday 3/8: Catherine Mitchell, CDL Publishing Director, Speaks About Open Access Humanities Publishing at UC
The Townsend Center for the Humanities is excited to announce the second lunch forum in our spring series on Digital Technology in Humanities Scholarship. Please join the conversation on Monday, March 8th about Open Access Humanities Publishing with Catherine Mitchell. Director of the California Digital Library's Publishing Group, Mitchell will discuss the growing momentum around open access publishing in the humanities at the University of California.
In particular, her talk will focus on the suite of open access publishing services that the CDL’s eScholarship program provides to the UC community. Exciting and innovative new publishing models are emerging from this space, including the UC Publishing Services, which is a joint program of the CDL and UC Press. Ms. Mitchell will also consider the role of the institution in supporting the lifecycle of research and publishing – and the emergent potential and challenges for humanities publishing in digital media.
Mitchell's short talk on the subject will be followed by audience Q&A, and a general discussion of the issues. This is a brown bag lunch event, with light refreshments provided. The final Spring lunch forum in the series will be on April 15.
Time and Location:
Monday, March 8, 12-1PM, in the Geballe room at the Townsend Center, 220 Stephens Hall
About the speaker:
Director, CDL Publishing Group
Catherine Mitchell is responsible for overseeing the strategic planning, development, and operational management of eScholarship's publishing services. Prior to joining the eScholarship team in 2006, Catherine served as the Web Director of the Commonwealth Club of California. She holds an A.B. in English from the University of Chicago and a Ph.D. in English Literature from the University of California, Berkeley.