From Lab to Fab with Automation

Automation is pivotal in the transition from laboratory prototypes to full-scale fabrication, especially in the nascent field of quantum computing and chip production. By integrating technologies such as robotics, AI, and control systems, automation improves production efficiency and output quality while ensuring scalability.

As demand for quantum technologies grows, the strategic implementation of automation enables manufacturers to scale operations from experimental stages to stable production ("lab to fab"). This shift is crucial for maintaining cost-efficiency and quality when moving away from hero devices and towards optimizing for yield.

Figure: Visualization of a dependency graph for a 5-transmon-qubit chip. This example compresses various functions into fewer nodes to perform a two-qubit gate tune-up in fewer than 40 steps per qubit.


While automation always works well in theory, the reality of manufacturing is often complex and difficult to model. One of the main challenges in automation is dealing with the errors that inevitably occur during production processes. For example, in the automated testing of quantum chips, corner cases such as failing to connect to a qubit or encountering a resonator that is significantly off its design frequency can disrupt the entire sequence of protocols. The challenge is not just automating processes but ensuring robust automation that allows for efficient human intervention when needed.
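To illustrate the idea, here is a minimal sketch (all names are illustrative assumptions, not an actual test-equipment API) of wrapping a test step so that known corner cases are caught, logged, and flagged for human review instead of aborting the whole test sequence:

```python
# Hypothetical corner-case exceptions a measurement step might raise.
class QubitConnectionError(Exception):
    """Could not establish a connection to the qubit."""

class FrequencyOutOfRangeError(Exception):
    """Resonator found significantly off its design frequency."""

def run_step(step_name, step_fn, needs_human_review):
    """Run one automated test step; on a known corner case, record it
    for human follow-up and let the rest of the sequence continue."""
    try:
        step_fn()
        return True
    except QubitConnectionError:
        needs_human_review.append((step_name, "could not connect to qubit"))
    except FrequencyOutOfRangeError:
        needs_human_review.append((step_name, "resonator far off design frequency"))
    return False
```

The sequence then proceeds past a failing step, and the accumulated `needs_human_review` list tells an operator exactly where intervention is needed.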

Bill Gates, in Forbes: "The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency."

To address the complex dependencies between consecutive steps in the automated tune-up of superconducting qubits, we introduced dependency graphs early on. A directed acyclic graph (DAG) models each calibration protocol as a node, while the edges represent the dependencies between them. Each calibration function includes an intermediate analysis step, so that we can assess the success of individual measurements or groups of measurements and integrate error-mitigation strategies. The robustness of the scheme depends on how many corner cases can be correctly mitigated in an automated fashion. By requiring that all dependency nodes pass before the next node executes, the cause of a failure becomes easy to trace back, which reduces the effort spent on human debugging.
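The scheme above can be sketched in a few lines. This is a minimal illustration under assumed names (not the actual GRACE implementation): each node runs a measurement plus its intermediate analysis, and a node only executes once every node it depends on has passed.

```python
class CalibrationNode:
    """One calibration protocol: a measurement plus intermediate analysis."""
    def __init__(self, name, run_fn, depends_on=()):
        self.name = name
        self.run_fn = run_fn            # returns True (pass) or False (fail)
        self.depends_on = list(depends_on)
        self.passed = None              # None = not yet executed

def execute(nodes):
    """Execute nodes in dependency order; a node whose dependencies
    failed is marked failed without measuring, so the cause of failure
    is easy to trace back."""
    pending = list(nodes)
    while pending:
        progressed = False
        for node in list(pending):
            if any(dep.passed is None for dep in node.depends_on):
                continue                # a dependency has not run yet
            if all(dep.passed for dep in node.depends_on):
                node.passed = node.run_fn()
            else:
                node.passed = False     # failed dependency: skip measurement
            pending.remove(node)
            progressed = True
        if not progressed:
            raise ValueError("cycle detected: graph is not a DAG")
    return {n.name: n.passed for n in nodes}
```

For instance, if a qubit-spectroscopy node fails its analysis, a Rabi node depending on it is marked failed without ever being measured, and the result dictionary points directly at the spectroscopy step as the root cause.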

The DAG framework, part of GRACE in our Quantum Diagnostics Libraries, allows our test equipment to build complex calibration protocols with many dependencies out of simple building blocks that can be intuitively visualized. By assessing for each node whether it passed or failed, we build automation bottom-up rather than superimposing a framework on top. As Bill Gates put it, getting the fundamental operation to work is a necessary condition for efficient automation.

In a chip production environment specifically, test time is one of the biggest manufacturing costs for a company producing products in high volume; where cryogenic test environments are needed, this can hold true even for small volumes. In our test equipment, we use a variety of test-time reduction techniques to minimize the time it takes to test a large quantity of qubits. Automated diagnostics was one of the first steps we took, since it reduced test time by roughly half compared with standardized but manual testing routines.

Automation and quality are mutually reinforcing: installing automated high-throughput test equipment yields valuable data that enables the optimization of the complex fabrication process. Simultaneously, improved device quality and reduced variability enable much deeper automated measurements.