It has become standard practice for researchers to use digital imaging systems for capturing western blot data. Not only does digital detection benefit from a broader dynamic range than film, but it makes sharing results much easier—an especially important consideration now, with so many of us working from home. In this article, we explain how digital imaging systems vary and comment on some recent advances that have increased their utility for western blotting.

A shift toward digital documentation

Film has long been considered the gold standard for western blot imaging, and it is still used by many labs today. However, several factors have driven researchers to switch to digital detection, the latest being the SARS-CoV-2 pandemic. “With many institutions restricting access to shared areas such as dark rooms, those not already using digital imaging systems have been forced to consider alternative ways of working,” comments Jeff Harford, senior product marketing manager at LI-COR. This point is echoed by Brenda Karim, global product manager at Bio-Rad, who notes that sending a digital image to colleagues is both easier and more fitting in the current environment than having them come look at a film. Yet, social distancing is by no means the only reason to go digital—there are many other benefits on offer.

“The ability to quantify accurately is a main advantage of using digital imaging systems,” reports Dr. Martin Biggs, sales manager at Syngene. “Unlike film, which has a narrow linear range and can be easily saturated by moderate to high signal intensities, digital detection has better linearity between the amount of protein and the signal intensity over a broader dynamic range and so provides more quantifiable data.” Dr. Lindsey Kirby, product manager at Syngene, agrees, explaining that the image-capture software that comes with a digital imaging system determines the optimal exposure time for a western blot to ensure that both bright and weak bands are captured without saturation. “With film, several exposures would usually be performed to calculate the optimum exposure time, lengthening workflows and incurring additional costs/waste associated with consumable use,” she says. “Digital detection overcomes this issue and is more environmentally friendly as a result.”
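For labs handling their own quantification, the checks behind these claims are straightforward to script. Below is a minimal Python sketch, assuming a 16-bit image exported from a digital imager and a dilution series of known protein loads; the function names, threshold, and example values are illustrative and are not taken from any vendor's software.

```python
import numpy as np


def check_saturation(blot, bit_depth=16, tolerance=0.999):
    """Flag pixels at (or very near) the detector's upper limit.

    Saturated bands no longer scale with protein amount and cannot
    be quantified reliably.
    """
    ceiling = (2 ** bit_depth - 1) * tolerance
    saturated = blot >= ceiling
    return int(saturated.sum()), float(saturated.mean())


def check_linearity(loads_ng, band_volumes):
    """Fit band volume vs. protein load for a dilution series.

    Returns the slope, intercept, and R^2 of the linear fit; a high R^2
    across the whole series suggests the signal sits within the imager's
    linear dynamic range.
    """
    loads_ng = np.asarray(loads_ng, dtype=float)
    band_volumes = np.asarray(band_volumes, dtype=float)
    slope, intercept = np.polyfit(loads_ng, band_volumes, 1)
    predicted = slope * loads_ng + intercept
    ss_res = np.sum((band_volumes - predicted) ** 2)
    ss_tot = np.sum((band_volumes - band_volumes.mean()) ** 2)
    return slope, intercept, 1 - ss_res / ss_tot


# Example: a two-fold dilution series of a purified standard (values invented)
loads = [5, 10, 20, 40, 80]                    # ng loaded per lane
volumes = [1.1e4, 2.3e4, 4.4e4, 8.9e4, 1.7e5]  # background-subtracted band volumes
print(check_linearity(loads, volumes))
```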

According to Martin Miguez, PharmD, product manager at Azure Biosystems, substantial savings can be made by switching from enhanced chemiluminescent (ECL) detection using film to ECL or fluorescence-based detection with a digital imaging system. “Depending on the number of western blots performed by a laboratory, the savings in photographic film, ECL substrate, elimination of a dark room—and its related expense and safety concerns—would soon compensate for any investment in digital instrumentation,” he says. “Indeed, cutting out certain consumables can save as much as $10–50 per western blot.” Other reasons for switching to digital detection are that it provides the ability to use fluorescent dyes, and it better aligns with publisher guidelines and best practices, which include the recommendation that data be normalized against a total protein stain. “With chemiluminescence and film, you are restricted to using housekeeping proteins for normalization,” comments Harford. “These are highly susceptible to variation based on experimental conditions.”

Image: Western blot expenses based on the use of consumables. Assumes a 10 x 10 cm blot. Online prices, February 2020. Data provided by Azure Biosystems.

Different types of digital imaging systems

Digital imaging systems differ in terms of their overall sensitivity and the number of detection modes possible on a single instrument. “Basic systems generally provide only chemiluminescent detection and are available from as little as $5,000,” says Harford. “These are certainly a step up from film, yet most labs prefer to have the option of also detecting fluorescence, which raises the cost of the system according to the number of channels and the optical design.” Miguez adds that a drawback of basic systems is that they may not be suitable for detecting low-abundance targets, meaning it can be worth spending a bit more for higher sensitivity and resolution, while Biggs explains that a key differentiator between systems is the combination of camera and lens. “Most basic systems tend to have lower resolution cameras with motorized zoom lenses, while systems designed for higher resolution imaging have larger charge-coupled device (CCD) chips and wider aperture lenses,” he notes. “That said, lower resolution cameras often have much larger pixels, making them superb for fast chemiluminescent capture. These factors should be weighed up during instrument selection.”
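A rough break-even calculation helps put these price points in context. The sketch below combines the ~$5,000 entry-level figure quoted here with the $10–50 per-blot consumable savings cited earlier; it is simple arithmetic on the article's numbers, and actual savings will vary by lab and workflow.

```python
# Rough break-even estimate for a basic digital imager, using figures quoted
# in the article: a ~$5,000 entry-level system and $10-50 saved per blot in
# film, ECL substrate, and dark-room costs.

system_cost = 5_000          # USD, basic chemiluminescence-only imager
savings_per_blot = (10, 50)  # USD saved per blot (low and high estimates)

for s in savings_per_blot:
    blots_to_break_even = system_cost / s
    print(f"At ${s}/blot saved, the system pays for itself after "
          f"~{blots_to_break_even:.0f} blots.")

# At $10/blot saved, the system pays for itself after ~500 blots.
# At $50/blot saved, the system pays for itself after ~100 blots.
```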

Recent advances

There have been several notable advances within the western blotting arena during recent years. Bio-Rad’s stain-free technology has established itself as a popular alternative to Coomassie staining, with the company’s digital imagers evolving in parallel. “Using stain-free technology and a ChemiDoc digital imaging system, researchers can achieve total protein normalization across a wider dynamic range, with greater sensitivity and more reproducible data compared to traditional Coomassie staining,” reports Karim. “And, in contrast to Coomassie-based methods, gels can be used in downstream applications—such as western blot detection after verification of protein transfer.”

Azure has also facilitated total protein normalization without the need for Coomassie staining, via its TotalStain Q reagent. The stain is detected in the green fluorescence channel, allowing for multiplexing with chemiluminescent or near-infrared detection, and enables independent quantification of up to three targets simultaneously, even at the same location on the blot. “TotalStain Q is compatible with any digital imaging system that has a green channel,” explains Miguez. “Moreover, the green channel is an optional add-on for many of our instruments, meaning that researchers who already use our digital imaging systems can easily introduce total protein normalization into their existing workflows should they wish to.”
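The arithmetic behind total protein normalization is simple, whichever total-protein signal is used. Below is a minimal sketch of the calculation only: each target band volume is divided by the total-protein signal for the same lane, then expressed relative to a reference lane. The lane labels and values are invented for illustration and do not come from any vendor's software.

```python
# Total protein normalization, in outline: divide each target band volume by
# the total-protein signal measured for the same lane (e.g. from a stain-free
# gel or a total-protein-stain channel), then scale to a reference lane so
# fold changes can be compared. All values below are illustrative only.

target_volumes = {"control": 1.8e5, "treated_1": 3.1e5, "treated_2": 2.9e5}
total_protein = {"control": 4.0e6, "treated_1": 4.6e6, "treated_2": 3.5e6}

normalized = {lane: target_volumes[lane] / total_protein[lane]
              for lane in target_volumes}
reference = normalized["control"]

for lane, value in normalized.items():
    print(f"{lane}: normalized fold change = {value / reference:.2f}")
```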

Other important developments have centered on image-analysis software. “A significant amount of the variability associated with western blotting is tied to the data analysis,” notes Harford. “Traditional ‘signal identification’ software was designed merely to obtain the signal intensity of a band, leading to results being interpreted very differently depending on the analyst. Our [data integrity] software takes operator bias out of the equation by incorporating step-by-step workflows that align with western blot best practices, and has been shown to yield a coefficient of variation (CV) of less than 3% when the same data are analyzed by researchers of different skill levels.” Syngene likewise has a strong focus on image-analysis software; the company’s latest release includes the option to capture fully quantifiable images of chemiluminescent blots, complete with visible and color markers, simply by closing the door. “As well as providing a robust solution for teaching labs or those that are new to western blotting, an automated approach helps to ensure data is reliable,” says Biggs.
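For readers unfamiliar with the metric quoted above, the inter-analyst CV is simply the standard deviation of the quantified values divided by their mean, expressed as a percentage. The band volumes in the snippet below are invented purely to illustrate the calculation.

```python
import statistics

# Coefficient of variation (CV) for the same band quantified by several
# analysts: CV = (standard deviation / mean) x 100%. Values are illustrative.
band_volumes = [2.41e5, 2.38e5, 2.45e5, 2.40e5]  # one band, four analysts

cv_percent = statistics.stdev(band_volumes) / statistics.mean(band_volumes) * 100
print(f"Inter-analyst CV: {cv_percent:.1f}%")
```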