During the pandemic, politicians, scientists and policymakers have eagerly awaited the release of new data, whether on coronavirus cases, hospitalisations, or the uptake of vaccinations, to inform governments’ responses. This period has demonstrated not only our reliance on data, but its value.
Scientific research has never been informed by such a large and varied body of data. This has been driven, in part, by humanity’s propensity to continually increase our data collection and storage capabilities. It is clear, too, that effective information gathering and processing are the keys to success in the laboratory. Those making ground-breaking discoveries are often, put bluntly, those using the cleanest, best-annotated data.
What are some of the current issues with information gathering in non-automated labs?
This fundamental area often proves problematic for researchers. Above all, in conventional laboratories, manual data collection is too often the norm: a draining and time-consuming process. Mundane tasks of this sort are estimated to take up 90 minutes of a scientist’s time each week, a figure that skyrockets to 80% of all available time in an analytical lab. Inputting results manually can also lead to human error and the recording of incorrect data, the consequences of which are sometimes only discovered when it is far too late to fix them.
In addition, the transfer of data is often highly inefficient, with the use of manual curation at one end and manual extraction at the other offering ample opportunity for transcription errors. It’s also time-consuming in its own right. The fact that many laboratories continue to rely on USB sticks for data transfer only exacerbates this issue, and is ultimately representative of a previous era of data management.
That’s not to mention the time spent transcribing gathered data across different sites, navigating different computer systems and software packages to collate results in central databases and spreadsheets. If implemented poorly, these processes can also cause serious complications: in October 2020, for example, nearly 16,000 coronavirus cases reportedly went unreported in England due to a failure in the automatic process used to combine Excel spreadsheets.