In reply to Cobra_Head:
The main logistical problem in covid testing is getting the sample collected properly, not having some Serco wanker who has trouble finding his own nose in the dark pointlessly poking around people's noses and throats.
Once you have the samples, you need to get them to the PCR facility in individual, sealed vials.
These are the rate-limiting steps, and unless you want to reuse swabs, you cannot pool at that stage. This is what I meant by the obvious reasons for not pooling early.
Once the sample is processed and its RNA extracted, you can dilute it and take one of two routes: either make multiple pools with different compositions and combinatorially narrow down to a small subset of samples that need individual retesting, or split each sample across simple pools whose optimal size depends on the fraction of positives you expect. That optimal size could of course be calculated, but in my experience it is not worth the bother.
The first approach requires precise tracking and complicated pooling schemes, a surefire recipe for data-labelling or pipetting f*ck-ups. The second needs quick PCR turnaround (to allow retesting before the sample degrades).
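To make the tracking problem concrete, here is a minimal sketch of the simplest combinatorial scheme of the kind alluded to above: matrix pooling, where samples sit on a grid, each row and each column is pooled, and only samples at the intersection of a positive row and a positive column get retested. All names are illustrative, and the `positive` callable stands in for the pool PCR result:

```python
# Matrix (grid) pooling sketch: r x c samples, r row pools + c column
# pools, individual retests only at positive-row/positive-column
# intersections. Illustrative only; `positive` simulates a pool result
# from per-sample status for the sake of the example.

from itertools import product

def matrix_pools(samples, cols):
    """Split a flat sample list into row pools and column pools."""
    rows = [samples[i:i + cols] for i in range(0, len(samples), cols)]
    col_pools = [[row[j] for row in rows if j < len(row)]
                 for j in range(cols)]
    return rows, col_pools

def candidates(rows, col_pools, positive):
    """Samples needing individual retest: members of both a positive
    row pool and a positive column pool."""
    pos_rows = [r for r in rows if any(positive(s) for s in r)]
    pos_cols = [c for c in col_pools if any(positive(s) for s in c)]
    return {s for r, c in product(pos_rows, pos_cols)
            for s in set(r) & set(c)}
```

With 16 samples on a 4x4 grid and one true positive, 8 pool tests plus 1 retest replace 16 individual tests. Note that two positives in crossing rows and columns already yield four retest candidates, and every sample's grid position must be tracked perfectly through pipetting, which is exactly the bookkeeping burden the post is warning about.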
Neither is going to work reliably when the testing infrastructure is near capacity. It is IMO much safer to keep things simple and increase sample collection capacity, especially for the labour-intensive early steps. Everything further down the pipeline scales by simply adding another set of machines.
CB
edit: forgot to address your second point: The fact that the UK sent tests to Germany and Italy suggests that there is indeed also a bottleneck at the PCR stage, but for the reasons above I would suggest that this is better addressed by expanding machine capacity than by complicating processes even more in a system that is apparently stretched to breaking point.
Post edited at 21:09