Saturday, February 06, 2016

The OA Interviews: Kamila Markram, CEO and Co-Founder of Frontiers

Based in Switzerland, the open access publisher Frontiers was founded in 2007 by Kamila and Henry Markram, who are both neuroscientists at the Swiss Federal Institute of Technology in Lausanne. Henry Markram is also director of the Human Brain Project.
 
Kamila Markram
A researcher-led initiative envisaged as being “by scientists, for scientists”, Frontiers had as its mission the creation of a “community-oriented open access scholarly publisher and social networking platform for researchers.”

To this end, Frontiers has been innovative in a number of ways, most notably with its “collaborative peer review process”. This abjures the traditional hierarchical approach to editorial decisions in favour of reaching “consensual” outcomes. In addition, papers are judged in an “impact-neutral” way: while expected to meet an objective threshold before being publicly validated as a correct scientific contribution, their significance and impact are not assessed.

Frontiers has also experimented with a variety of novel publication formats, created Loop – a “research network” intended to foster and support open science – and pioneered altmetrics before the term had been coined.

Two other important components of the Frontiers concept were that it would operate on a non-profit basis (via the Frontiers Research Foundation), and that while it would initially levy article-processing charges (APCs) for publishing papers, these would subsequently be replaced by a sponsored funding model.

This latter goal has yet to be realised. “We dreamed of a zero-cost model, which was probably too idealistic and it was obviously not possible to start that way”, says Kamila Markram below.

Frontiers also quickly concluded that its non-profit status would not allow it to achieve its goals. “We realised early on that we would need more funds to make the vision sustainable and it would not be possible to secure these funds through purely philanthropic means,” explains Markram.

Consequently, in 2008 Frontiers reinvented itself as a for-profit publisher called Frontiers Media SA. It also began looking for additional sources of revenue, including patent royalties – seeking, for instance, to patent its peer review process by means of a controversial business method patent.

The patent strategy was also short-lived. “We abandoned the patent application by not taking any action by the specific deadline given by the patent office and deliberately let it die,” says Markram, adding, “we soon realised that it is far better just to keep innovating than waste one’s time on a patent.” (Henry Markram nevertheless remains an active patent applicant).

By the time the peer review patent had died it was in any case apparent that Frontiers’ pay-to-publish model was working well. In fact, business was booming, and to date Frontiers has published around 41,000 papers by 120,000 authors. It has also recruited 59,000 editors, and currently publishes 54 journals. By 2011 the company had turned “cash positive” (four years after it was founded).

Sunday, January 17, 2016

The OA Interviews: Mikhail Sergeev, Chief Strategy Officer at Russia-based CyberLeninka

Пока рак на горе не свистнет, мужик не перекрестится (“Until the crayfish on the hill whistles, the peasant will not cross himself”: a Russian proverb meaning that people do not act until they absolutely must)

Mikhail Sergeev

While open access was not conceivable until the emergence of the Internet (and thus could be viewed as just a natural development of the network) the “OA movement” primarily grew out of a conviction that scholarly publishers have been exploiting the research community, not least by constantly increasing journal subscription prices. It was for this reason that the movement was initially driven by librarians.

OA advocates reasoned that while the research community freely contributes the content in scholarly journals, and freely peer reviews that content, publishers then sell it back to research institutions at ever more extortionate prices, at levels in fact that have made it increasingly difficult for research institutions to provide faculty members with access to all the research they need to do their jobs.

What was required, it was concluded, was for subscription paywalls to be dismantled so that anyone can access all the research they need — i.e. open access. In the process, argued OA advocates, the ability of publishers to overcharge would be removed, and the cost of scholarly publishing would come down accordingly.

But while the movement has persuaded many governments, funders and research institutions that open access is both inevitable and optimal, and should therefore increasingly be made compulsory, publishers have shown themselves to be extremely adept at appropriating OA for their own ends, not least by simply swapping subscription fees for article-processing charges (APCs) without realising any savings for the research community.

This is all too evident in Europe right now. In the UK, for instance, government policy is enabling legacy publishers to migrate to an open access environment with their high profits intact. Indeed, not only are costs not coming down but — as subscription publishers introduce hybrid OA options that enable them to earn both APCs and subscriptions from the same journals (i.e. to “double-dip”) — they are increasing.

Meanwhile, in the Netherlands universities are signing new-style Big Deals that combine both subscription and OA fees. While these are intended to manage the transition to OA in a cost-efficient way, publishers are clearly ensuring that they experience no loss of revenue as a result (although we cannot state that as a fact, since the contracts are subject to non-disclosure clauses).

More recently, the German funder Max Planck has begun a campaign intended to engineer a mass “flipping” of legacy journals to OA business models. Again, we can be confident that publishers will not co-operate with any such plan unless they are able to retain their current profit levels.  

It is no surprise, therefore, that many OA advocates have become concerned that the OA project has gone awry.

Wednesday, December 30, 2015

The OA Interviews: Toma Susi, physicist, University of Vienna

Since the birth of the open access movement in 2002, demands for greater openness and transparency in the research process have both grown and broadened. 

Today there are calls not just for OA to research papers, but (amongst other things) to the underlying data, to peer review reports, and to lab notebooks. We have also seen a new term emerge to encompass these different trends: open science.
Toma Susi

In response to these developments, earlier this year the Research Ideas & Outcomes (RIO) Journal was launched. 

RIO’s mission is to open up the entire research cycle — by publishing project proposals, data, methods, workflows, software, project reports and research articles. These will all be made freely available on a single collaborative platform. 

And to complete the picture, RIO uses a transparent, open and public peer-review process. The goal: to “catalyse change in research communication by publishing ideas, proposals and outcomes in order to increase transparency, trust and efficiency of the whole research ecosystem.”

Importantly, RIO is not intended for scientists alone. It is seeking content from all areas of academic research, including science, technology, humanities and the social sciences.

Unsurprisingly perhaps, the first grant proposal made openly available on RIO (on 17th December) was published by a physicist — Finnish-born Toma Susi, who is based at the University of Vienna in Austria.

Susi’s proposal — which has already received funding from the Austrian Science Fund (FWF) — is for a project called “Heteroatom quantum corrals and nanoplasmonics in graphene” (HeQuCoG). This is focused on the controlled manipulation of matter on the scale of atoms.

More specifically, the aim is “to create atomically precise structures consisting of silicon and phosphorus atoms embedded in the lattice of graphene using a combination of ion implantation, first principles modelling and electron microscopy.”

The research has no specific application in mind but, as Susi points out, if “we are able to control the composition of matter on the atomic scale with such precision, there are bound to be eventual uses for the technology.”

Below Susi answers some questions I put to him about his proposal, and his experience of publishing on RIO.

The interview begins …


RP: Can you start by saying what is new and different about the open access journal RIO, and why that is appealing to you?

TS: Personally, the whole idea of publishing all stages of the research cycle was something even I had not considered could or should be done. However, if one thinks about it objectively, in terms of an optimal way to advance science, it does make perfect sense. At the same time, as a working scientist, I can see how challenging a change of mind-set this will be… which makes me want to do what I can to support the effort. 

Thursday, December 17, 2015

The open access movement slips into closed mode

In October 2003, at a conference held by the Max Planck Society (MPG) and the European Cultural Heritage Online (ECHO) project, a document was drafted that came to be known as the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities.

More than 120 cultural and political organisations from around the world attended, and the names of the signatories are openly available here.

Today the Berlin Declaration is held to be one of the keystone events of the open access movement — offering as it did a definition of open access, and calling as it did on all researchers to publish their work in accordance with the open principles outlined in the Declaration.

“In order to realize the vision of a global and accessible representation of knowledge,” the Declaration added, “the future Web has to be sustainable, interactive, and transparent.”

The word transparent is surely important here, and indeed the open access movement (not surprisingly) prides itself on openness and transparency. But as with anything precious, there is always the danger that openness and transparency can give way to secrecy and opacity.

By invitation only


There have been annual follow-up conferences to monitor implementation of the Berlin Declaration since 2003, and these have been held in various parts of the world — in March 2005, for instance, I attended Berlin 3, which that year took place in Southampton (and for which I wrote a report). The majority of these conferences, however, have been held in Germany, with the last two seeing a return to Berlin. This year’s event (Berlin 12) was held on December 8th and 9th at the Seminaris CampusHotel Berlin.

Of course, open access conferences and gatherings are two a penny today. But given its historical importance, the annual Berlin conference is viewed as a significant event in the OA calendar. It was particularly striking, therefore, that this year (unlike most OA conferences, and so far as I am aware all previous Berlin conferences) Berlin 12 was “by invitation only”.

Also unlike other open access conferences, there was no live streaming of Berlin 12, and no press passes were available. And although a Twitter hashtag was available for the conference, this generated very little in the way of tweets, with most in any case coming from people who were not actually present, including one from a Max Planck librarian complaining that no MPG librarians had been invited to the conference.

Why it was decided to make Berlin 12 a closed event is not clear. We do however know who gave presentations as the agenda is online, and this indicates that there were 14 presentations, 6 of which were given by German presenters (and 4 of these by Max Planck people). This is a surprising ratio given that the subsequent press release described Berlin 12 as an international conference. There also appears to have been a shortage of women presenters (see here, here, and here).

But who were the 90 delegates who attended the conference? That we do not know. When I emailed the organisers to ask for a copy of the delegate list my question initially fell on deaf ears. After a number of failed attempts, I contacted the Conference Chair Ulrich Pöschl.

Pöschl replied, “In analogy to most if not all of the many scholarly conferences and workshops I have attended, we are not planning a public release of the participants’ list. As usual, the participants of the meeting received a list of the pre-registered participants’ names and affiliations, and there is nothing secret about it. However, I see no basis for releasing the conference participants’ list to non-participants, as we have not asked the participants if they would agree to distributing or publicly listing their names (which is not trivial under German data protection laws; e.g., on the web pages of my institute, I can list my co-workers only if they explicitly agree to it).”

This contrasts, it has to be said, with Berlin 10 (held in South Africa), where the delegate list was made freely available online, and is still there. Moreover, the Berlin 10 delegate list can be sorted by country, by institution and by name. There is also a wealth of information about the conference on the home page here.

We could add that publishing the delegate list for open access conferences appears to be pretty standard practice — see here and here for instance.

However, is Pöschl right to say that there is a specific German problem when it comes to publishing delegate lists? I don’t know, but I note that the delegate list for the annual conference for the Marine Ingredients Organisation (IFFO) (which was held in Berlin in September) can be downloaded here.

Outcome


Transparency aside, what was the outcome of the Berlin 12 meeting? When I asked Pöschl he explained, “As specified in the official news release from the conference, the advice and statements of the participants will be incorporated in the formulation of an ‘Expression of Interest’ that outlines the goal of transforming subscription journals to open access publishing and shall be released in early 2016”.

This points to the fact that the central theme of the conference was the transformation of subscription journals to Open Access, as outlined in a recent white paper by the Max Planck Digital Library. Essentially, the proposal is to “flip” all scholarly journals from a subscription model to an open access one — an approach that some have described as “magical thinking” and/or impractical (see, for instance, here, here and here).

The Expression of Interest will presumably be accompanied by a roadmap outlining how the proposal can be realised. Who will draft this roadmap and who will decide what it contains is not entirely clear. The conference press release says, “The key to this lies in the hands of the scientific institutions and their sponsors”, and as Pöschl told me, the advice and comments of delegates to Berlin 12 will be taken into account in producing the Expression of Interest. If that is right, should we not know exactly who the 90 delegates attending the conference were?

All in all, we must wonder why there was a need for all the secrecy that appears to have surrounded Berlin 12. And given that secrecy, should we not be concerned that the open access movement could become some kind of secret society, in which a small self-selected group of unknown people makes decisions and proposals intended to impact the entire global scholarly communication system?

Either way, what happened to the openness and transparency inherent in the Berlin Declaration?

In the spirit of that transparency I invite all those who attended Berlin 12 to attach their name below (using the comment functionality) and, if so inspired, to share their thoughts on whether open access conferences ought to be held in camera in the way Berlin 12 appears to have been.

Or is it wrong and/or naïve to think that open access implies openness and transparency in the decision making and processes involved in making open access a reality, as well as of research outputs?



Tuesday, December 01, 2015

Open Access, Almost-OA, OA Policies, and Institutional Repositories

Many words have been spilt over the relative merits of green and gold open access (OA). It is not my plan to rehearse these again right now. Rather, I want to explore four aspects of green OA. 

First, I want to discuss how many of the documents indexed in “open” repositories are in fact freely available, rather than on “dark deposit” or otherwise inaccessible. 

Second, I want to look at the so-called eprint request Button, a tool developed to allow readers to obtain copies of items held on dark deposit in repositories. 

Third, I want to look at some aspects of OA policies and the likely success of so-called IDOA policies.

Finally I want to speculate on possible futures for institutional repositories. 

However, I am splitting the text into two. The first two topics are covered in the attached pdf file; the second two will be covered in a follow-up piece I plan to publish at a later date.

To read the first part (a 16-page pdf) please click the link here.

Monday, November 16, 2015

The OA Interviews: ScienceOpen’s Alexander Grossmann

In his time, the founder and president of ScienceOpen, Alexander Grossmann, has sat on both sides of the scholarly publishing table. He started out as a researcher and lecturer, working variously at the Jülich Research Centre, the Max Planck Institute in Munich and the University of Tübingen.
Alexander Grossmann

Then in 2001 he reinvented himself as a publisher, working first at Wiley-Blackwell, and subsequently as managing director at Springer-Verlag GmbH in Vienna, and a vice president at De Gruyter.

An important moment for Grossmann came in 2008, when Springer acquired the open-access publisher BioMed Central from serial entrepreneur Vitek Tracz. Listening to a presentation on the purchase given at a management meeting by the company’s CEO Derk Haank, Grossmann immediately saw the logic of the move, and the imperatives of open access.

However, it was soon apparent to him that the publishing industry at large is not in a hurry to reinvent itself for an OA world, and certainly not if it means having to take hard decisions that could threaten the high profit levels that it has become accustomed to earning from journal publishing.

Speaking to me two years ago Grossmann put it this way: “[T]here is no publishing house which is either able or willing to consider the rigorous change in their business models which would be required to actively pursue an open access publishing concept.” 

And this remains his view today.

In 2013, therefore, Grossmann partnered with Boston-based entrepreneur and software developer Tibor Tscheke to found a for-profit OA venture called ScienceOpen. At the same time he took a post as professor of publishing management at the Leipzig University of Applied Sciences.

A Q&A with Alexander can be downloaded as a pdf file here.

Sunday, September 20, 2015

The Open Access Interviews: F1000 Founder Vitek Tracz

Vitek Tracz is a hero of the open access movement, and it is not hard to see why. Fifteen years ago he founded the world’s first for-profit OA publisher BioMed Central (BMC), and pioneered pay-to-publish gold OA. Instead of charging readers a downstream subscription fee, BMC levies an upfront article-processing charge, or APC. By doing so it is able to cover its costs at the time of publication, and so make the papers it publishes freely available on the Internet. [See the comment below the Q&A for clarification of this.]

Many said Tracz’s approach would not work. But despite initial scepticism BMC eventually convinced other publishers that it had a sustainable business model, and so encouraged them to put their toes in the OA waters too. As such, OA advocates believe BMC was vital to the success of open access. As Peter Murray-Rust put it in 2010, “Without Vitek and BMC we would not have open access”.

Today Tracz has a new, more radical, mission, which he is pursuing with F1000.
Vitek Tracz

As always, I have written an introduction to the Q&A below with Vitek Tracz; as sometimes happens, the introduction turned out to be longer than readers might expect, or wish to read.

I have, therefore, put the introduction into a PDF file, which can be accessed by clicking on this link.

Those interested only in the Q&A need simply read on below. 

The Q&A begins …


RP: As I understand it, F1000 now consists of three main services — F1000Research, F1000Prime, and F1000Workspace. In addition, I believe there is something called F1000 Specialists. Can you say something briefly about each of these services, and when they were launched?

VT: The newly launched F1000 (F1000.com) is an integrated site combining three services: F1000Prime, F1000Research and F1000Workspace.  These services are built and supported through the active collaboration and participation of the largest high-level group of experts (over 11,000 and growing) from across biology and medicine, the F1000 Faculty. This consists of experienced leaders (Faculty Members) and talented young researchers (Associate Faculty Members, appointed by Faculty Members), in about equal numbers.

We started what is now called F1000Prime 13 years ago, which has become the largest and most comprehensive article-level quality assessment of biomedical literature: the F1000 Faculty identify those articles they find interesting in their daily work, rate them at one of the three levels of quality (all positive, the goal is to find the best articles) and write a short text explaining why the chosen article is interesting to them.

F1000Research, launched over 2 years ago, is an open science publishing platform that offers a completely new way of publishing research in biology and medicine: it uses immediate publication followed by transparent peer review, requires the underlying data to be shared, and encourages the publication of all research findings. It also now offers a platform to freely share scientific posters and slides.

Recently, we launched F1000Workspace, a comprehensive set of tools to help researchers write articles and grants, discover literature, manage references and reference libraries, and collaborate and prepare for publication.

The F1000 Specialists are not an external service; they are a growing group of young active supporters of our services who work with us in key institutions to support new users of our services and bring feedback that then contributes to future development decisions.