Operational Experience of an Open-Access, Subscription-Based Mass Spectrometry and Proteomics Facility
This paper discusses the successful adoption of a subscription-based, open-access model of service delivery for a mass spectrometry and proteomics facility. In 2009, the Mass Spectrometry and Proteomics Facility at the University of Melbourne (Australia) moved away from the standard fee for service model of service provision. Instead, the facility adopted a subscription- or membership-based, open-access model of service delivery. For a low fixed yearly cost, users could directly operate the instrumentation but, more importantly, there were no limits on usage other than the necessity to share available instrument time with all other users. All necessary training from platform staff and many of the base reagents were also provided as part of the membership cost. These changes proved to be very successful in terms of financial outcomes for the facility, instrument access and usage, and overall research output. This article describes the systems put in place as well as the overall successes and challenges associated with the operation of a mass spectrometry/proteomics core in this manner.
Keywords: Core facility, Proteomics, Facility management
The Bio21 Institute is a multidisciplinary research institute attached to the University of Melbourne in Australia. The institute houses approximately 650 staff and students and hosts four of the university's major technology platforms: Electron Microscopy, the largest NMR facility in the southern hemisphere, Metabolomics Australia, and the Melbourne Mass Spectrometry and Proteomics (MMSP) Facility. The MMSP is a university-funded core facility designed to provide basic chemical mass spectrometry as well as proteomic mass spectrometry services. Analysis of metabolite samples is handled separately by Metabolomics Australia due to separate federal funding sources.
From its inception in 2006, the MMSP had two core staff (a facility manager and one research assistant) and operated one mass spectrometer (Agilent 1100 nanoLC-CHiP – HCT ion trap). This was followed in 2008 by a second system (Agilent 1100 HPLC, 6220 TOF mass spectrometer). The MMSP offered base services such as gel spot analysis and analysis of chemical synthesis products, operating under a standard fee-for-service model where users were charged per sample or by hourly rate. By late 2008, the facility had a small but loyal group of approximately 20 clients, but its total income was only ~$37,000 (Australian dollars). Overall, the net research output was poor and did not justify the financial investment by the University. In conjunction with this, local academic groups wanted access to new high-resolution MS technology, faster access to results, and greater breadth of expertise within the core.
This provided the backdrop for a complete operational change in 2009. Instead of using the standard fee for service/time model of service delivery, the MMSP changed to a completely open, subscription-based mass spectrometry facility. This transformation has been extremely successful as evidenced by a 10-fold increase in income and user numbers. The rest of this article details the method and structures put in place, the advantages and challenges of the approach, and the outcomes of this change.
Instead of operating as a traditional fee-for-service facility whereby each sample is run by facility personnel and charged at a nominal rate, billing was changed to an annual subscription or membership fee. It was acknowledged by the institution at the outset that the funding reality for the local academic community meant that the facility was unlikely ever to reach a cost-neutral or financially profitable position. With this in mind, it was decided to try to build a facility with many users paying a small amount rather than a service that relied on the few rare users who could pay a realistic cost. Thus, the membership fee was deliberately set at a very low level of $3200 per year per lab for University of Melbourne academic users (with higher rates for external academic users – $4500 – and industry groups – at least triple). Over subsequent years the fee has been incrementally increased at approximately 5% per year. However, where the head of a laboratory pays $3200 per annum, every directly supervised student or researcher in that lab can effectively run as many samples as needed during the subscription. The only limit was that all users had to share the total available instrument time. No user could monopolize a system or use it in preference to anyone else, with any disputes or problems resolved at the sole discretion of the facility manager. For small projects, a 3- or 6-month subscription option was also created, with the price set so that four 3-month subscriptions cost significantly more than a single annual subscription. The intent was to drive users to the single annual fee.
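As a rough illustration of the economics described above, the break-even point against the old per-sample rate and the effect of the ~5% yearly fee increase can be computed directly. The dollar figures are taken from the article; the function names are purely illustrative, not part of any facility system.

```python
# Illustrative sketch of the subscription economics described in the text.
# Figures from the article: AU$3200 annual lab membership (internal academic
# rate) and the historical AU$50 per-sample fee. Function names are
# hypothetical.

ANNUAL_FEE = 3200      # AU$ per lab per year (internal academic rate)
PER_SAMPLE_FEE = 50    # AU$ per sample under the old fee-for-service model

def breakeven_samples(annual_fee: float, per_sample_fee: float) -> float:
    """Samples per year at which the subscription becomes the cheaper option."""
    return annual_fee / per_sample_fee

def projected_fee(base_fee: float, years: int, annual_increase: float = 0.05) -> float:
    """Membership fee after the ~5% incremental yearly increases, compounded."""
    return base_fee * (1 + annual_increase) ** years

print(breakeven_samples(ANNUAL_FEE, PER_SAMPLE_FEE))  # 64.0 samples/year
print(round(projected_fee(ANNUAL_FEE, 3)))            # fee after three years
```

At 64 samples per year the two models cost the same; any lab running more than that is better off subscribing, which is the comparison made to hesitant users described later in the article.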
The annual subscription includes all training (provided by the existing facility staff) needed to perform the necessary experiments. This covers sample preparation, data acquisition, and data analysis for a wide range of services, from simple infusion quality control analysis of organic chemical synthesis to proteomic shotgun analysis. In effect, the facility personnel became teachers and the burden of hours needed for sample preparation and data analysis was shifted onto the user base. As a facility, we do not cater to users who want "full service". There are many other facilities in Australia that will perform this work, and we direct such users there.
As an additional incentive, and as a mechanism to minimize maintenance and troubleshooting problems, an area of the facility lab was set aside for sample preparation. Laminar flow hoods and base reagents required for standard mass analysis/proteomics such as LC/MS grade solvents, tris(2-carboxyethyl)phosphine hydrochloride, triethylammonium bicarbonate, dimethyl labeling reagents, and trypsin were provided free of charge. Importantly, this also included auto sampler vials in order to prevent damage from incorrect vial selection. Expensive reagents such as SILAC, iTRAQ, antibodies, specialist cross-linkers, and similar were not included. Instrument maintenance remained the responsibility of facility staff; however, frequent users were encouraged to learn basic techniques to further develop their knowledge and skills.
Current Configuration and Workflow
As of 2017, the facility now has six mass spectrometers dedicated to facility operations. There are also two additional mass spectrometers owned by co-located research groups and two additional seed instruments under evaluation.
For our standard chemistry users, there is a Thermo Exactive Plus mass spectrometer linked to a Thermo U3000 HPLC pump and auto sampler. The HPLC system is set up for infusion only as a way to simplify sample handling and system control. A series of standard methods are available covering different mass ranges and carrier solvents (water, acetonitrile, or methanol, all with 0.1% formic acid). The instrument methods are all 1 min in duration and operate at maximum resolving power (120,000). The instrument is calibrated weekly and also uses contaminants (214.08963, N-butylbenzenesulfonamide) in the gas supply as a lock mass.
For standard intact protein analysis there is an Agilent 6220 esiTOF with standard electrospray ion source. The instrument is calibrated weekly and the reference ions used as lock mass are present in all methods. This instrument is connected to an Agilent 1100 HPLC, and a small selection of 2.1 mm C4, C8, and C18 columns are supplied. Users are encouraged to purchase their own columns; however, most do not do so. A series of stock methods covering different mass ranges and separation times are in place. More advanced users can alter the base methods (set in the operating system as “Read Only”) and save under their own name.
For the two instruments above, users undergo an initial 45 min training session that covers the health and safety issues, the basics of electrospray, instrument principles (with the assistance of several YouTube videos), information about isotope and charge series, and lastly basic instrument operation. Users are by no means "expert" mass spectrometry practitioners; however, they are able to perform simple experiments, and we encourage them to return several times over the subsequent few days to embed their learning. Once trained, users are able to freely book time on each instrument (http://apps.bio21.unimelb.edu.au/booking/select_resource.php?r_g_id=2); they can consult facility staff whenever assistance is required. There is the capacity to set up specialist methods using different solvents or columns; however, these experiments have to be negotiated with facility management. In many instances, these types of experiments are scheduled out of regular hours to minimize the impact on standard usage.
Proteomic Set-Up and Workflow
The proteomic side of the facility is currently built around three Thermo Orbitrap instruments: Orbitrap Elite ETD, QExactive Plus, and Orbitrap LUMOS ETD. All three instruments have the same Thermo UltiMate U3000 nano UHPLC systems with an identical trap-and-elute column configuration: an Acclaim PepMap nano-trap column (Dionex – C18, 100 Å, 75 μm × 2 cm) and an Acclaim PepMap RSLC analytical column (Dionex – C18, 100 Å, 75 μm × 50 cm). Each instrument uses the standard Thermo nano electrospray ion source fitted with a stainless steel spray emitter (40 mm length, 1/32 inch o.d., 30 μm i.d., Thermo P/N THIES542). Higher chromatographic performance can be achieved with alternative set-ups; however, we have found the above to be the most robust and easily maintained. The instruments are calibrated weekly and use ions in the gas supply as a lock mass (445.120026, dodecamethylcyclohexasiloxane). Proteomic users are given detailed methods for sample preparation and usually perform their first experiments using facility-supplied reagents under the guidance of a staff member. As users' competence grows, they become more independent. For the proteomic user, the most important facility management strategy is that no user has fixed blocks of time allocated to his/her usage. Instead, all proteomic samples are placed in a queue and run when time next becomes available on the appropriate instrument (normally in order of receipt). Given the fragile nature of nanoscale chromatography, any problems with an instrument sample queue inevitably make fixed time blocks unworkable. Our most common problem is a blocked trap column, but the fact that the user loses the result of that sample and gets a bill for a new trap rapidly encourages good sample preparation.
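The queuing policy just described (strict order of receipt, no fixed time blocks, samples reassigned when an instrument goes down) can be sketched in a few lines. This is purely illustrative; the class and method names below are hypothetical and do not correspond to any actual facility software.

```python
# Minimal sketch of the described queuing policy: first-come, first-served
# dispatch across whichever instruments are currently online. All names here
# are illustrative only.
from collections import deque
from dataclasses import dataclass, field

@dataclass
class Instrument:
    name: str
    online: bool = True  # taken offline for service/repair as needed

@dataclass
class SampleQueue:
    instruments: list
    queue: deque = field(default_factory=deque)

    def submit(self, sample: str) -> None:
        """Samples enter the queue strictly in order of receipt."""
        self.queue.append(sample)

    def dispatch(self) -> list:
        """Assign queued samples, in order, across all online instruments.

        If an instrument is down, its share of the work simply moves to the
        remaining online instruments -- no user holds a fixed time block.
        """
        online = [i for i in self.instruments if i.online]
        runs = []
        while self.queue and online:
            for inst in online:
                if not self.queue:
                    break
                runs.append((self.queue.popleft(), inst.name))
        return runs

# Example: three samples queued, one of two instruments down for repair.
q = SampleQueue([Instrument("Elite"), Instrument("LUMOS")])
for s in ["A", "B", "C"]:
    q.submit(s)
q.instruments[1].online = False      # LUMOS taken offline
print(q.dispatch())                  # all three run on the Elite, in order
```

The point of the sketch is the policy, not the code: because assignment happens at dispatch time rather than at booking time, a blocked trap column or instrument repair delays the queue slightly but never strands a user's reserved slot.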
In general, the Orbitrap Elite runs gel spots and other simple digest experiments, the LUMOS takes care of ETD, MS3-based quantitation, or highest-sensitivity experiments, and the QExactive commonly runs data-independent acquisition (DIA) experiments. Samples are reassigned when an instrument requires service or repair (the reason all three share the same chromatographic set-up). Users can request a specific instrument (commonly for continuity with previous data sets), although the ultimate decision is made by facility staff. There are no hard rules about what type of sample should be run on which machine, and wherever practical or sensible we try to accommodate the client request. Users commonly ask which machine is best for their sample, and staff try to guide them to the appropriate choice. All instruments are given a full cleaning on a quarterly basis, with cleaning of the S-lens as required in the intervening time. Between each sample set we normally run at least two 30 min wash gradients (more if required) plus a separate peptide standard run using either 2 fmol Glu-fibrinopeptide (Sigma) or the RePLiCal (PQ-CS-1561) retention time standards from PolyQuant (Germany). A blank at the start of every sample set is used as a control for the presence of any sample carryover. A series of standard acquisition methods are in place, but most of our proteomic users are not overly interested in spending the necessary time to become familiar with the instrumentation or its operation. Hence, data acquisition is usually overseen by facility staff, although high-frequency users are encouraged to participate in instrument maintenance and set-up. Users wanting to perform experiments using different columns, solvents, or other set-ups are accommodated by special arrangement, although wherever possible they are directed back to the standard configuration.
For basic proteomic data analysis, a custom online webserver enables users to remotely log in and start their own searches. There is no need for users to return to the lab to collect data. Currently the system has 183 registered users routinely analyzing their own samples. For quantitative analysis, the facility has dedicated workstations running multiple software platforms: Skyline, MaxQuant, Proteome Discoverer, Byonic, Spectronaut, and others. Each workstation is adjacent to the desk of a facility staff member so that questions can be answered and instruction given immediately.
Clearly, the facility is still running a significant deficit and it is unlikely in the current funding environment that it will ever be cost neutral. However, the large increase in income has enabled the significant expansion of the facility without placing an increased burden on university finances. To facility managers in the USA, where facilities are often run as cost recovery, the deficit may seem remarkable. The university does not altruistically provide a subsidized service. It is a function of the nature and history of academic funding in Australia. This is far too complex to discuss here, but the outcome is that investigators rarely receive funding that would support the real cost of access to core facilities. As a result, most Australian academic core facilities (of all types) run at a deficit or require additional financial support. For those interested, a recent development has been the Australian Government National Collaborative Research Infrastructure Strategy (NCRIS).
Facility Membership and Usage
Facility Output for Chemistry and Intact Protein
Shifting the majority of the sample preparation and analysis workload onto the facility users has also facilitated a much higher sample throughput and consequent publication output than could be expected from a facility with only four staff. Considering only the "low end" Exactive and Agilent systems used for chemical and protein quality control, these two instruments require minimal oversight from facility staff, and yet they generate the largest user group. As an example of usage, the Exactive system was installed in February 2016. As of October 2017, it has 139 trained users; it ran 7304 samples in 2016 and has run 19,522 samples so far in 2017. Similarly, there are 189 trained users for the Agilent system (there is overlap in these numbers as some users are trained on both), and in 2016 it ran 8578 samples.
This large number of samples also translates into a large number of facility-associated publications, as the data generated is an essential quality control step in any synthetic chemistry or recombinant protein research project. Facility staff do not receive authorship, although users are requested to acknowledge the facility in any publication. Of the 423 journal articles attributed to the facility, approximately 70% used one of the "low end" instruments to confirm the mass of a synthetic chemical product or a recombinant protein (examples [1, 2, 3, 4, 5, 6, 7, 8, 9]).
Facility Proteomic Outcomes
The proteomic aspect of the facility has been growing steadily with the user base. Our proteomic search server currently has 183 registered users who are able to log in and search their own proteomic data. Most of these individuals are only performing simpler proteomic work such as gel spot analysis or immunoprecipitation experiments. However, as confidence and expertise grow, many move on to more complex quantitative experiments. A critical part of running the proteomic side of the facility is the willingness of the facility staff to work openly and cooperatively with the users. It is a tedious task explaining b and y ions or how to use MaxQuant to novice users multiple times a month. You also have to be prepared for regular experimental failure due to inexperience. However, the reward is getting to work with a very broad range of projects while letting the user do the hard work of preparing the biological samples and later crunching through data sets. For new users who want quantitative proteomic data, a common starting point is to take a single sample, split it in two, and see if they can get a 1 to 1 quantitative outcome at the end. Of course, some users require more help than others, and this is where facility staff are often acknowledged with authorship. This is one of the most difficult aspects to manage: how much assistance is a genuine academic contribution to a paper? In our facility, assisting someone to identify a few gel spots is trivial and not regarded as an academic contribution. Where possible we try to follow the University of Melbourne authorship policy; however, there are occasional individuals/groups who resist appropriate recognition for facility staff. Our general impression is that we have fewer problems in this area than "full fee for service" facilities because we are a "collaborative teaching" lab and have more direct contact with the users.
The users appreciate the time spent teaching them how to process their data and can directly see how much effort facility staff are putting into the project. On the quantitative proteomics side, facility users have published papers using SILAC, spectral counting, dimethyl labeling, multiple reaction monitoring, and label-free techniques [14, 15]. Phosphoproteomic and TAILS publications are currently under review and we have even occasionally (for one special project) branched into metabolomic work [17, 18]. From our publication database, we attribute 135 publications to the proteomic side of the facility (examples [10, 11, 12, 13, 14, 15, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27]).
A common query from other facility managers is how often user-prepared samples lead to instrument downtime. Unfortunately, we do not have a direct measure for this or of overall system downtime. This is discussed further below and in the Methods section. The best guide to system downtime is our usage statistics. From the start of 2013 until October 2017, the facility ran (excluding blanks, washes, and standards) 36,915 proteomic samples (2013, 6886 samples; 2014, 4115 samples; 2015, 9861 samples; 2016, 8908 samples; 2017, 7145 samples). The most common user-related contributors to downtime are blocked trap columns and polymer contamination. These problems are quickly sorted by facility staff. Otherwise, we suffer the usual hardware issues such as turbo pump failures.
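The per-year figures above can be cross-checked against the quoted cumulative total with a trivial calculation (all figures are from the text; 2017 is through October):

```python
# Cross-check of the per-year proteomic sample counts quoted above.
# Figures are taken from the article; the 2017 count is through October.
yearly_samples = {2013: 6886, 2014: 4115, 2015: 9861, 2016: 8908, 2017: 7145}
total = sum(yearly_samples.values())
print(total)  # 36915, matching the cumulative figure quoted in the text
```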
Text mining of our publication list also reveals that since 2009, the MMSP has supported over 100 successful competitive grants. All the above output has provided a strong basis for equipment funding, and the facility has been associated with four successful Major National Equipment grants (Australian Research Council, Linkage Equipment and Infrastructure Fund; LE0882913, LE100100036, LE130100024, LE160100015), which in conjunction with internal university support has facilitated the purchase of five Orbitrap instruments. One of the biggest advantages of a large user base is the very wide range of projects that can be submitted as examples of work to potential funding opportunities.
The Subscription Billing System
The change to a subscription billing system in 2009 was initially greeted with apprehension by a few of the existing academic users. They had been paying only $50 per sample, and the new system asked them to suddenly pay $3200 up front. To overcome this, we calculated the total amount each lab had spent over the prior 2 years on a per-sample basis and demonstrated that they would actually be better off with the subscription rate (another reason for the low initial subscription price).
The subscription billing system has many advantages over the cost per hour/sample model. The biggest benefit is budget certainty of a simple fixed cost. From a facility management perspective, it reduces overall income volatility and allows better forecasting. More importantly, for group heads preparing funding applications, there is a single fixed fee for access and no need for concern about the possibility of cost overrun. This is also a key selling point for facility managers when meeting new clients. Regardless of complications, the need for extra controls, failed experiments, or unexpected samples, the fee is always the same. Uncertain new users can be assured it does not matter if they ‘mess up’ their first few samples. It also means there is no need for the facility manager to produce detailed quotes or to have to renegotiate terms if a project has changed in scope.
The subscription model also removes the need to monitor sample numbers or hours used for billing purposes. There is no need for facility or support staff to invoice multiple entities each month for what are, in many cases, trivial amounts of money. A significant scientific benefit was also obtained: the fee-per-sample model created an incentive to deliberately skip blanks, washes, or sometimes controls in order to minimize cost, and the subscription model removed it. Fee-per-sample billing also discourages the "crazy idea" experiment and limits the ability of keen students to try new things. Under the subscription model, so long as instrument time is available, users can experiment as they see fit and take their time, where available, to learn more about the technology's capabilities.
One of the arguments against our specific implementation of a lab subscription model is that large labs tend to get a better deal than smaller labs as they have more users. Therefore, large labs should pay more or be charged per capita. However, experience has been that lab size does not scale directly with overall usage and that requirements fluctuate over the course of a year/project. While some users have hundreds of samples, others have relatively few and, hence, do not make the most of available resources. Under the single fee model users who genuinely have only one or two samples are the most difficult to accommodate. In these instances, the facility staff will simply run the samples free as a gesture of goodwill and demonstration of capability. It is obvious if users try to abuse this. We also point out that “just two gel spots” becomes four when adding controls and then a repeat makes eight and so forth. From a facility sales or pricing perspective, it was important to price the service so that it would be more expensive to do 10 gel spots somewhere else than to do as many as you like in the MMSP. For groups that want large label-free datasets, the fee structure represents an extremely low cost that could be open to abuse. Our experience has been that these data sets tend to come in batches, and that because we require our users to prepare and analyze their own samples, the rate of sample production is generally self-limiting. Additionally, should one user ever try to flood the system with samples, his/her use is tempered by the requirement to share the facility machine time with all other subscribed users; as stated earlier, no user has preferential access rights over any other. Access, for proteomic samples, is provided strictly in the order of sample arrival unless specifically arranged with the facility head. Very large sets may be split over time – although MS1 quantitation experiments make this more difficult. 
Overall, the above restrictions have resulted in all of our instrumentation being fully booked every day of the year.
Another genuine concern is that one lab will sign up and then run samples on behalf of others. This has occurred; however, it is uncommon and quickly dealt with. Again, because our users need to prepare and analyze their own samples, they may run one or two samples as a favor to someone but they very quickly reject requests for large numbers. In a practical sense, these users end up promoting the facility’s services to others when doing one or two samples. While there are inequities in our one size fits all billing system, the overall simplicity and clarity of its structure outweighs the negative aspects.
Open Access: Novice Users will Break the Equipment
One of the frequent criticisms of regular facilities is that they are insular, uncooperative, or inaccessible. The resentment from local researchers and resulting negative feedback then puts substantial pressure on the facility at an institutional level. The most commonly raised reason for a closed facility is that inexperienced users will break the very expensive instruments. This is a real concern; however, with the right set-up and constant training, most problems are avoided. The primary (obvious) mechanism has been to have multiple machines specifically set aside for particular purposes. The original base set-up was an ESI-TOF MS for the chemistry and intact protein users and a second nano LC/MS system for proteomic experiments. By splitting the chemistry and intact protein users from the nanoscale users, we avoided the conflict between comparatively "dirty" samples and the sensitivity requirements of nanoscale work. This evolved over time to an independent machine just for the organic chemists, one machine for intact protein work, and a set of nanoLC proteomics instruments. Aside from sample segregation, the second strategy to minimize problems was to ensure all of the MMSP instruments have standard methods set up by facility staff and, wherever possible, automation is used. As an example, the chemistry users (since 2015) have access to an Orbitrap Exactive with an auto sampler/HPLC front end that is set up for infusion, not separation. Users only need to choose a method and give the sample a filename and auto sampler position. The rest is automated. Thus, most of the problems encountered stem from poor sample preparation rather than direct "instrument damage". Poor sample preparation has to be addressed with constant education and explicit sample preparation protocols. Although mistakes and carelessness are inevitable, all users want quality data and so rarely make the same mistake twice.
For proteomic users, our experience has been that most biologists are not interested in direct hands-on use of the instrumentation. They are content to prepare samples and analyze the resulting data but have no interest in troubleshooting nanoLC systems. As such, the majority of problems stem from poor sample preparation.
Overall, open access to instrumentation does increase system maintenance requirements; however, it is not the disaster that many imagine so long as there is ongoing education and well-organized workflows.
Unexpected Benefit of Open Access
All facility managers would be familiar with the problem of users providing unsuitable or incompatible samples for analysis. Samples heavily contaminated with polymer are a common example. This requires the facility to liaise with upstream users and explain, perhaps for a second time, what the problem is and why the experiment failed. Unfortunately, this is not always well received by users, who insist that the samples were absolutely "perfect" and that it is clearly a problem with the facility and/or the incompetence of its staff. With the open model, the user runs the sample and directly observes the polymer contamination. They gain a direct understanding of why polymer contamination is to be avoided. Most importantly, the facility was not directly involved in the sample preparation, so there is a clear chain of responsibility. So long as the instruments are operating correctly (a quick peptide control before every sample set), the user has to own the problem. At worst, the facility failed in its responsibility to teach proper sample preparation, which is mitigated by the detailed protocols that are initially provided to the user.
The Staffing Challenges of Open Access
The most difficult part of operating an open-access, unlimited-use subscription system has been the requirements and demands on the facility staff. The facility is effectively a research-teaching laboratory with all of the repetition, occupational health and safety, and interpersonal challenges that this entails. Facility staff have to be very knowledgeable, patient, and willing to take the time to teach people and be forgiving of "rookie" mistakes. A key requirement/adjustment for core staff is to regard poor sample preparation or hardware damage by users as a failure of the facility staff to adequately train the user. There are obvious limits to this attitude when it comes to wilful or careless behavior, but in general this is rare; the users are spending their own time preparing and running the samples, so it is in their own interests to see it done correctly. At the present time, the four full-time MMSP staff (three plus one bioinformatician) supervise the mass spectrometry activities of 300+ researchers, which results in a chaotic workload. At the same time, they also have to oversee the maintenance and overall operation of the instrumentation, which (as already described) may require attention due to careless use. Lastly, they need to be familiar with, and capable of teaching users how to analyze, the datasets created in the lab. In return for these demands, each staff member has the opportunity to work in a genuinely collaborative academic way on a great variety of projects.
The conversion of the MMSP from a standard fee-for-service facility to an open-access, subscription model has been very successful. Since 2009, the MMSP has grown at a steady pace to the point where the facility now has four staff, nine mass spectrometers (acquired through grant funding), and a stable, albeit negative, budget position that underwrites the facility activities. More importantly from a university investment perspective, the greatly expanded base of trained clients means that the academic output of the MMSP far exceeds the possible output of a traditional fee-for-service lab with only four staff. All of these users have been trained to be as independent as possible while working in the MMSP. Not only do users appreciate the ability to get access to instrumentation, but they also learn how to better design their upstream experiments. On multiple occasions, people who started out knowing very little about mass spectrometry have progressed to designing sophisticated mass spectrometry experiments. All of this has been achieved while maintaining a cost structure low enough to enable access by underfunded academic groups. This funding model does rely on a large number of users each paying a small amount, so the same strategy will not work in isolated locales or anywhere that lacks a critical mass of potential users. However, the administrative and practical advantages of the subscription model still apply, and when combined with the open-access model of usage they offer tremendous advantages over the standard fee-for-service model. The author is aware of other proteomic/mass spectrometry facilities that have implemented various aspects of subscription billing or open access and would welcome further contact and discussion.
The author thanks Professor Tony Bacic, Dr. Veronica Borrett, and Mr. Manual Zacharias for their support and vision in the initial stages of transition to a subscription-based service; Professor Anthony W. Purcell and Dr. Alun Jones for early advice; and Professor Malcolm McConville, Professor Gavin E. Reid, and Professor Richard O’Hair for ongoing support. Thanks also to all the staff who have contributed to the facility, including Mr. Paul O’Donnell, Dr. David Perkins, Dr. Ching-Seng Ang, Dr. Shuai Nie, and Mr. Sean O’Callaghan. Thanks to Dr. Candice Boyd for her assistance with the drafting of this article.