
A technological revolution starts in St Andrews

Some of the ways the new technology would help

St Andrews experts have created a revolutionary piece of technology which can identify an object when it is placed on a small radar sensor.

RadarCat can be trained to recognise different objects and materials, from a drinking glass to a computer keyboard, and can even identify individual body parts.
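The article does not describe how RadarCat recognises objects internally, so the snippet below is only an illustrative sketch of what "training" such a system could look like in general: a standard off-the-shelf classifier (a random forest, an assumed choice) fitted to radar readings that have been summarised as fixed-length feature vectors. The feature size, object labels and data here are all invented for illustration.

```python
# Purely illustrative sketch (not RadarCat's actual code): train a standard
# supervised classifier on per-object radar "signatures", assuming each
# reading has already been summarised as a fixed-length feature vector.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training data: 200 readings, each a 64-value radar feature
# vector, labelled with the object that was resting on the sensor.
labels = ["empty_glass", "full_glass", "keyboard", "smartphone"]
X = rng.normal(size=(200, 64))    # stand-in for real radar features
y = rng.choice(labels, size=200)  # stand-in for real object labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)  # "training" the recogniser on labelled readings

# At run time, a new radar reading is classified as one of the known objects.
print("predicted object:", clf.predict(X_test[:1])[0])
print("accuracy on held-out readings:", clf.score(X_test, y_test))
```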

It has the potential to help blind people distinguish the contents of two identical bottles, replace bar codes at the checkout, automatically sort waste, or support learning a foreign language.

RadarCat was designed by computer scientists in the St Andrews Computer Human Interaction research group, using a sensor originally provided by Google ATAP.

The radar-based sensor was developed to detect the small, subtle motions of human fingers, but the St Andrews team discovered it could be used for much more.

Professor Aaron Quigley, Chair of Human Computer Interaction at the university, said: “The Soli miniature radar opens up a wide range of new forms of touchless interaction.

“Once Soli is deployed in products, our RadarCat solution can revolutionise how people interact with a computer, using everyday objects that can be found in the office or home, for new applications and novel types of interaction.”

The system could also be used in conjunction with a mobile phone; for example, it could be trained to open a recipe app when a person holds a phone to their stomach, or to change its settings when operated with a gloved hand.

A team of undergraduate and postgraduate students from the university’s school of computer science was selected to show the project to Google in Mountain View, California, earlier this year.

A snippet of the video was also shown on stage during Google’s annual conference.

Professor Quigley said: “Our future work will explore object and wearable interaction, new features and fewer sample points to explore the limits of object discrimination.

“Beyond human computer interaction, we can also envisage a wide range of potential applications ranging from navigation and world knowledge to industrial or laboratory process control.”