Three researchers who went out on a limb to bridge a gap in their field talk to Nature about how and why they went about designing their own unique devices — and the challenges involved

If you want something done right, do it yourself: the scientists who build their own tools

Industry doesn’t always make the intricate tools that research demands, pushing some scientists to take DIY to the extreme. Credit: NPL

In science, as any researcher knows, the right tools can be the crucial difference between making a discovery and wasting time. But what happens when the right tools simply don’t exist?

Against a backdrop of constrained budgets and a market that sometimes fails to meet the nuanced demands of cutting-edge research, many scientists are becoming makeshift engineers. These researchers are designing and programming machines to fill a gap in their own research needs, often from scratch, and with little expertise in engineering.

The driving force behind the shift is the practical reality that off-the-shelf tools often fall short: they either cost far too much or they cannot do exactly what is needed. This is especially true in relatively new lines of research, such as the study of nanoplastics, for which funders and manufacturers are slow to keep up.

Here, three scientists talk to Nature about how and why they build their own devices, the skills required and the challenges they face along the way.

Erika DeBenedictis works in her lab at the Francis Crick Institute in London, using viruses to manufacture proteins. Credit: Erika DeBenedictis

ERIKA DEBENEDICTIS: Virus creche

Computational physicist and synthetic biologist at the Francis Crick Institute in London.

One of the most powerful tools we have for engineering proteins is evolution. In the same way that you can breed dogs to sniff out cancer or cultivate roses to be a certain colour, you can direct evolution on the molecular level — which is useful for discovering proteins.

During my PhD in protein engineering at the Massachusetts Institute of Technology (MIT) in Cambridge, I worked with a technique called PACE (phage-assisted continuous evolution), a fast method of exploiting viral evolution to engineer proteins. We used an M13 bacteriophage, which is a virus that can infect bacteria and make baby viruses in just 20 minutes.

The problem is that evolution is unreliable: sometimes living things go extinct because they evolve too slowly to keep up with changes in the environment. When using it to engineer proteins, researchers are presented with a ‘Goldilocks’ problem — they need to present the ecosystem with just the right amount of challenge to push the viruses to evolve without killing them off.

During my PhD, I designed a robot to babysit the evolution of my proteins. PRANCE — phage- and robotics-assisted near-continuous evolution — can monitor hundreds of evolving populations at the same time. If it notices that a population is about to go extinct, it will make the environment a little easier to tolerate — and so the hit rate of making a better protein goes from 0 to 100.

The instrument uses basic robotics to do something super cool. It’s a liquid-handling robot, so it uses a pipette to transport substances. It also uses a plate reader to measure the luminescence of microbe populations to see whether they’re thriving or about to go extinct.
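The control logic DeBenedictis describes — read each population’s luminescence, then ease or increase the selection pressure accordingly — can be sketched as a simple feedback loop. This is a hypothetical illustration, not PRANCE’s actual software; the thresholds, step sizes and well data are invented:

```python
# Hypothetical sketch of PRANCE-style feedback control. Each well's
# luminescence is read, and wells whose populations look close to
# extinction get a gentler selection pressure. All numbers are invented.

EXTINCTION_THRESHOLD = 0.1   # luminescence below this: population struggling
THRIVING_THRESHOLD = 0.8     # above this: population can take more pressure

def adjust_pressure(pressure: float, luminescence: float) -> float:
    """Return the new selection pressure (0 = easy, 1 = harsh) for one well."""
    if luminescence < EXTINCTION_THRESHOLD:
        return max(0.0, pressure - 0.1)   # ease off before extinction
    if luminescence > THRIVING_THRESHOLD:
        return min(1.0, pressure + 0.1)   # push harder when thriving
    return pressure                        # Goldilocks zone: hold steady

# One control cycle over a plate: well -> (current pressure, plate reading)
wells = {"A1": (0.5, 0.05), "A2": (0.5, 0.9), "A3": (0.5, 0.4)}
new_pressures = {w: adjust_pressure(p, lum) for w, (p, lum) in wells.items()}
print(new_pressures)  # A1 eased, A2 pushed, A3 unchanged
```

In the real instrument, “adjusting the pressure” would mean the liquid-handling robot pipetting a different mix into the well; the loop above only captures the decision step.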

Robots might not be as smart as humans, but they have stamina. This machine can pipette all day, every day, without sleeping. I think biologists tend to like things that are complicated, but simpler is often better. I built two of these machines at MIT, and I have another for my work here at the Francis Crick Institute, complete with a centrifuge and other custom additions.

The robot is patented, but its custom design brings challenges and building more versions at scale is hard. There are a lot of spinning plates to contend with — and it takes real elbow grease to get it right.

There’s a big divide between home-made research instruments and polished tabletop devices that you can simply buy, plug in and press a button. Making a second PRANCE robot was a real maturation process: we discovered all the dependencies we didn’t know mattered. The aim is now to squash all the bugs one by one — it gets easier every time we make a new instrument.

Inventions come from going out on a limb. The first version might be duct-taped together, but then you make another one, and another one, and the hope is that eventually the pay-off is greater than the pain of putting it together.

I love working towards things that are still unsolved. This instrument is so interesting that it has prompted the generation of two academic labs and a start-up company. Protein engineering has already brought about the development of cold-tolerant enzymes, which allow us to wash our clothes at lower temperatures, for example. I have also worked with proteins designed to act as powerful therapeutics for COVID-19, making them more stable than ordinary antibodies and giving them a better shelf life.

As protein engineering blossoms and becomes more predictable, I’m excited to imagine what else can be achieved through it. There are countless possibilities, from more efficient chemicals to new drugs.

Tunnel-scanning robots bring modern methods to keep Britain’s Victorian railways on track. Credit: Mott MacDonald

NICK MCCORMICK: Tunnel vision

Principal scientist at the National Physical Laboratory (NPL) in Teddington, the UK public institute for metrology.

At the NPL, our main role is to research, develop and maintain standards of measurement. My area of expertise concerns measuring structures, and I’m working on a project with Network Rail, the UK railways operator, to upgrade the monitoring of the country’s train tunnels.

The British rail network is an impressive feat of Victorian engineering — the first of its kind — built mostly between 1830 and 1870. There are now around 700 tunnels covering some 320 kilometres. Of those tunnels, about 80% were built with Victorian masonry. Furthermore, the network includes more than 30,000 walls, bridges and viaducts — all of which need examining for safety. Given their age, it’s clearly important to monitor the structures from a safety perspective, but it’s also important to keep the network running smoothly. If one tunnel has a problem, it affects the rest of the line, causing delays or closures.

Until now, tunnels have been inspected using nineteenth-century techniques — that is, relying on the human eye. Network Rail employs people to walk through the tunnel with a torch, looking at the interior, occasionally tapping it with a stick to see if any materials feel loose or start to crumble. It’s a difficult and dangerous working environment; the tunnels are dirty and often more than 1 kilometre long. Trains must be stopped for the inspection to take place, which often means working between 2 a.m. and 4 a.m. on a Sunday to avoid causing delays for passengers.

We were commissioned by Network Rail to develop a more reliable way of measuring and maintaining tunnel interiors. Our machine, DIFCAM Evolution, is the result. DIFCAM is a high-resolution camera system designed to map tunnel interiors remotely. The equipment is based around a very sophisticated stereo-line scan camera that captures full-colour images and can also generate height maps. It can be attached to trains and can work while travelling at up to 100 kilometres per hour. It scans the surfaces of tunnels, taking thousands of pictures that allow us to assess the shape and texture of the tunnel.

Ideally, we would like to be able to identify subsurface defects, too, but that’s very difficult to do at any reasonable speed. We looked at a lot of potential techniques, some quite fanciful — for example, firing ice cubes at the surface of a tunnel and analysing the sounds produced. But DIFCAM provides us with a practical means of recording data in a way that is methodical and reproducible.

The machine allows us to identify nearly all the defects we are interested in monitoring. For example, we can detect a brick that is in danger of falling from the interior of a tunnel from tiny changes in tunnel height and mortar depth. Comparing images over time enables us to monitor the rate of change, predict whether a piece of masonry is going to fall imminently and determine if the surrounding mortar is still adequate.

We use digital image correlation — comparing two images at a very high magnification, on a pixel-by-pixel level — to see how the structure is changing.
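The core idea of digital image correlation — finding how far a patch of one image has moved relative to another by comparing them pixel by pixel — can be sketched in a few lines. This is an illustrative FFT-based correlation on synthetic data, not NPL’s actual processing pipeline:

```python
import numpy as np

def dic_shift(ref_patch, cur_patch):
    """Estimate the (row, col) pixel shift of cur_patch relative to
    ref_patch via cross-correlation in the Fourier domain."""
    a = ref_patch - ref_patch.mean()   # zero-mean: ignore brightness changes
    b = cur_patch - cur_patch.mean()
    # Circular cross-correlation peaks at the displacement of b w.r.t. a
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices past the midpoint back to negative shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

# Synthetic check: a 64x64 random texture shifted by (3, 5) pixels
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
cur = np.roll(ref, shift=(3, 5), axis=(0, 1))
print(dic_shift(ref, cur))  # → (3, 5)
```

Real DIC software does this at sub-pixel precision across thousands of overlapping patches, which is how tiny changes in mortar depth become measurable; the sketch above only recovers a whole-pixel displacement for one patch.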

The biggest challenges are reliability and repeatability. When going through a tunnel, you can’t use GPS to measure the location, so we use an encoder on the wheel of the vehicle to trigger the camera, which builds up the images one line at a time. We also use a special camera to measure the position of the device along the train-track bed to improve the reliability of the measurements.

Another challenge is the amount of data generated. The camera captures images every 0.2 millimetres — generating about 6 gigabytes of data per metre. Because tunnels can be hundreds, if not thousands, of metres long, we reprocess some of those data to about one pixel per millimetre to make them tractable.
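A back-of-envelope calculation shows why the downsampling helps. Using the figures quoted above (6 GB per metre at 0.2 mm sampling) and assuming, for illustration, a 1-kilometre tunnel:

```python
# Illustrative arithmetic based on the figures quoted in the text.
BYTES_PER_METRE_RAW = 6e9    # ~6 GB of image data per metre of tunnel
TUNNEL_LENGTH_M = 1_000      # an assumed 1 km tunnel

raw_total = BYTES_PER_METRE_RAW * TUNNEL_LENGTH_M
print(f"Raw scan of a 1 km tunnel: {raw_total / 1e12:.0f} TB")

# Going from 0.2 mm to 1 mm per pixel coarsens each axis by 5x,
# so the pixel count (and data volume) falls by 5 * 5 = 25x.
downsampled = raw_total / 25
print(f"After reprocessing to 1 px/mm: {downsampled / 1e9:.0f} GB")
```

So a single kilometre-long tunnel yields terabytes of raw imagery, but a manageable few hundred gigabytes once reprocessed to one pixel per millimetre.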

There are all sorts of things that you can do with the image data. The height-comparison images can also be used to estimate how much material might be needed for repairs, and then to assess whether the repair was done correctly. Crucially, the imaging allows us to remove human subjectivity from the equation.

Custom-made equipment has shown that microplastics are found in snow samples taken from 3,000 metres above sea level. Credit: Dušan Materić

DUŠAN MATERIĆ: Plastic fingerprints

Environmental analytical scientist at the Helmholtz Centre for Environmental Research in Leipzig, Germany.

In 2017, I was working at the Sonnblick Observatory in Austria, more than 3,000 metres above sea level in the Alps, collecting and analysing snow samples. It was part of a research project looking at organic matter from the past, such as in ice cores: my aim was to discover how changes in the atmosphere had affected aerosols and organic matter, including those coming from forests and fires. For accuracy, it was important to seek out remote places where the snow is intact and untouched, which is what brought us to Sonnblick.

We evaporate the water from our samples and then burn the residue and analyse the vapours. What we found shocked me: we found evidence everywhere of plastic nanoparticles — smaller than 200 nanometres in size, and mostly polypropylene and polyethylene terephthalate. It made me realize that I needed to develop a reliable informatics system that could analyse these plastic particles more accurately. The result was a machine based on a method known as TD-PTR-MS (thermal desorption — proton transfer reaction — mass spectrometry), which is used to measure volatile organic compounds.

The prevalence of microplastics is well established, but mostly in oceans, and it is generally measured using basic methods, such as nets with holes around 300 micrometres in diameter. We’ve only recently started to realize that plastics fragment into smaller and smaller particles, and that these can be transported through the air. Identifying nano-sized particles with spectroscopy is challenging, because the wavelengths used in the devices are too large for the particle sizes, and the lasers cause too much reflection.

To develop my machine, I buried my head in coding for three weeks. The machine is both a physical device and a piece of software. The physical component uses a vacuum to remove water from the sample without contaminating it. The water is then transferred to another system I developed — effectively a black box that controls temperature, like a very small oven. I needed to design the box such that it could communicate with the TD-PTR-MS instrument to analyse the samples and fingerprint the plastics found in them — hence the need for tailor-made software, which I built from scratch using open-source materials.

We developed a couple of these systems at Utrecht University in the Netherlands, where I was working at the time, and I also have one here at the Helmholtz Centre for Environmental Research in Leipzig. One of the biggest challenges of this machine is that it produces a lot of data. It also takes some training to know how to run it: it’s not just pressing a button. Understanding the mathematics is vital.

I’m glad I persisted with building this — it was a risk, but it paid off. I now have approval to bring in PhD students, and we’re expanding the project to study nanoplastics in urban and rural residential spaces, as well as lakes and oceans. This device has helped to expand the nanoplastics research field and our understanding of the problem. I’m now working on ways to pair up nanoplastics analysis with realistic toxicological studies, which will help to focus mitigation efforts.

doi: https://doi.org/10.1038/d41586-023-04015-0

This story originally appeared on: Nature — Author: Rachael Pells