Globally, one in 12 men (and one in 200 women) suffers from colour blindness: an inability, or reduced ability, to see colours or perceive colour differences under normal lighting conditions. One everyday consequence is the difficulty of getting optimal picture quality when watching TV.

Now Spectral Edge Ltd, a Cambridge, UK-based spinoff established to develop, exploit and commercialise image fusion technology devised at the Colour and Vision Group of the School of Computing Sciences at the University of East Anglia (UEA), Norwich, has revealed that its “Eyeteq” technology has been integrated into a set-top box (STB); the company said the deployment took just a few weeks. The idea is to offer Eyeteq as an option in the STB’s accessibility menu.

Spectral Edge managing director Christopher Cytera would not name the STB maker, but said, “It is a major, medium size, international player in the market.”

He revealed that the next step is a large-scale trial “in collaboration with a well-known, independent research group” to show that Eyeteq outperforms previous approaches to improving images for colour-blind viewers. “There has been a lot of effort to improve accessibility for other groups, such as those hard of hearing, with great advances in sub-titling, but little progress in this area.”

He added the company is “gearing up to license the image processing technology to STB and TV makers.”

Spectral Edge uses Eyeteq, a mathematical image processing technology from the University of East Anglia, to enhance colour rendering and help people with colour blindness see more colour. Above, a simulation of the enhanced rendering. (Source: Spectral Edge)

Content streamed to an STB incorporating the technology is enhanced frame by frame before transmission to the TV screen. Colour-blind viewers can then differentiate between red and green significantly better, allowing them to see details they previously could not. Importantly, Cytera said, the enhancement has minimal impact on the picture quality seen by viewers who are not colour blind.

The technology is said to use mathematical perception models to modify image colours, and is suitable for both still and moving images. The company suggests colour-blind viewers watching programmes that typically contain a lot of red and green, such as sports and nature content, will enjoy the biggest improvements in the “viewing experience.”
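Spectral Edge has not published the details of Eyeteq, so the following is only a minimal sketch of the general class of technique the article describes: remapping colours so that information a red-green deficient viewer would lose is shifted into channels they can still perceive. The matrices below are illustrative placeholders, not the company's actual perception models (which operate in cone-response space rather than directly in RGB).

```python
import numpy as np

# Hypothetical linear approximation of deuteranopia in RGB space
# (illustrative values only; real simulations work in LMS cone space).
DEUTAN_SIM = np.array([
    [0.625, 0.375, 0.0],
    [0.700, 0.300, 0.0],
    [0.000, 0.300, 0.7],
])

# Error-redistribution matrix: pushes lost red-channel contrast into
# the green and blue channels, which the viewer can still distinguish.
REDISTRIBUTE = np.array([
    [0.0, 0.0, 0.0],
    [0.7, 1.0, 0.0],
    [0.7, 0.0, 1.0],
])

def enhance_frame(rgb: np.ndarray) -> np.ndarray:
    """Enhance one frame (H x W x 3, floats in [0, 1]) for a deutan viewer."""
    simulated = rgb @ DEUTAN_SIM.T        # what the colour-blind viewer sees
    error = rgb - simulated               # information lost to the deficiency
    compensated = rgb + error @ REDISTRIBUTE.T
    return np.clip(compensated, 0.0, 1.0)

frame = np.random.rand(4, 4, 3)           # stand-in for one video frame
out = enhance_frame(frame)
```

In an STB deployment this per-frame transform would sit in the video pipeline between decoding and display, which is consistent with the frame-by-frame enhancement the article describes.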

Image fusion technology is based on the concept of sensing and combining many different images, representing a range of the electromagnetic spectrum, into a single display image. It has applications in several other fields including medical imaging, automotive imaging, satellite imaging and surveillance.
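At its simplest, the fusion concept described above can be sketched as a per-pixel weighted combination of co-registered single-band images into one display image. The band names and weights below are assumptions for illustration; practical fusion methods (including UEA's) are considerably more sophisticated.

```python
import numpy as np

def fuse_bands(bands, weights):
    """Fuse co-registered single-channel images (each H x W) into one image
    by normalised weighted averaging."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                        # normalise so output stays in range
    stack = np.stack(bands, axis=0)     # shape: (n_bands, H, W)
    return np.tensordot(w, stack, axes=1)

# Hypothetical visible-light and near-infrared captures of the same scene.
visible = np.random.rand(8, 8)
near_ir = np.random.rand(8, 8)
fused = fuse_bands([visible, near_ir], weights=[0.6, 0.4])
```

Because the weights are normalised, the fused image is a convex combination of its inputs and stays within the input value range.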

The team at the UEA’s “Colour Lab,” led by Professor Graham Finlayson, chief scientific officer and co-founder of Spectral Edge, has devised a platform technology and initially focused on colour deficiency in vision.

The company was spun off in March and raised about $470,000 from the Midven Rainbow Seed Fund and the UEA’s Iceni Seed-Corn Fund. Subsequently, “We have gained significant grants from the UK Government’s Technology Strategy Board (set up to help establish and support entrepreneurial companies), that should see us through to the next phase and ahead of raising further funding for commercialising the technology in a big way,” Cytera added.

The chair of Spectral Edge is Robert Swann, a serial entrepreneur who previously co-founded Cambridge-based image processing pioneer Alphamosaic, sold in 2004 to Broadcom in a cash-and-share deal valued at $123 million. More recently, Swann also worked with Professor Finlayson on another University of East Anglia spinoff, Im-Sense, which developed the “EyeFidelity” technology for enhancing digital photographs and videos by adaptively extending dynamic range. Im-Sense was sold in 2010 to Apple, which used the technology in its iPhones.

via Image fusion tech improves experience of colour-blind viewers.