Smartphone screens effective sensors for soil or water contamination

The touchscreen technology used in billions of smartphones and tablets could also be used as a powerful sensor, without the need for any modifications. 

Researchers from the University of Cambridge have demonstrated, for the first time, that a typical touchscreen can identify common ionic contaminants in soil or drinking water when liquid samples are dropped onto the screen. The sensitivity of the touchscreen sensor is comparable to typical lab-based equipment, which would make it useful in low-resource settings. 

"Instead of interpreting a signal from your finger, what if we could get a touchscreen to read electrolytes since these ions also interact with the electric fields?" said Dr. Ronan Daly from Cambridge’s Institute for Manufacturing (IfM), part of the Department of Engineering, who co-led the research.  

Quick check-up
The researchers say their proof of concept could one day be expanded to a wide range of sensing applications, including biosensing and medical diagnostics, right from the phone in your pocket. The results are reported in the journal Sensors and Actuators B. 

Touchscreen technology is ubiquitous in our everyday lives: the screen on a typical smartphone is covered in a grid of electrodes, and when a finger disrupts the local electric field of these electrodes, the phone interprets the signal. 
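
That baseline-comparison logic can be sketched in a few lines of code. The following Python snippet is a hypothetical illustration only, not the researchers' method: the grid size, baseline capacitance, noise level, and detection threshold are all invented for the demo. It shows the basic idea of flagging electrodes whose readings shift when something disturbs the local field.

import numpy as np

# Hypothetical illustration: a capacitive touchscreen scans a grid of
# electrodes and compares each reading against a no-touch baseline.
# Anything that perturbs the local electric field (a finger, or an
# ionic droplet) shifts the capacitance measured at nearby nodes.

ROWS, COLS = 8, 12        # electrode grid size (invented for this demo)
BASELINE_PF = 1.0         # idle capacitance per node, in picofarads (assumed)
THRESHOLD_PF = 0.05       # delta that counts as a detection (assumed)

rng = np.random.default_rng(seed=1)

def scan_grid(disturbance_at=None):
    """Return one simulated frame of capacitance readings, in pF."""
    frame = BASELINE_PF + rng.normal(0.0, 0.005, size=(ROWS, COLS))
    if disturbance_at is not None:
        r, c = disturbance_at
        frame[r, c] += 0.12   # a local field disturbance shifts the reading
    return frame

def detect(frame):
    """Return the (row, col) nodes whose delta exceeds the threshold."""
    delta = np.abs(frame - BASELINE_PF)
    return [(int(r), int(c)) for r, c in zip(*np.where(delta > THRESHOLD_PF))]

print(detect(scan_grid()))                        # expect: []
print(detect(scan_grid(disturbance_at=(3, 7))))   # expect: [(3, 7)]

An ionic sample dropped onto the screen would, in the same qualitative way as a finger, shift the readings at the electrodes it covers, which is the interaction the Cambridge team set out to read.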

Other teams have used the computational power of a smartphone for sensing applications, but these have relied on the camera or peripheral devices, or have required significant changes to be made to the screen. "We wanted to know if we could interact with the technology in a different way, without having to fundamentally change the screen," said Dr. Daly.

Read the complete article at the University of Cambridge.

For more information:
University of Cambridge
www.cam.ac.uk