Standard touchscreen interfaces need an algorithmic makeover to improve accessibility for people with physical and cognitive impairments, new research suggests.

A team of researchers from Kochi University of Technology in Japan and Aalto University in Finland has thrown down the gauntlet to designers: tap into the new artificial intelligence (AI)-inspired model the team has developed to overcome the limitations of a ‘one size fits all’ interface.

The model is designed to enable people with conditions such as dyslexia, Alzheimer’s, or tremors to interact effectively with their technology. In a demonstration, the AI model was used to ‘simulate a user with essential tremor’, and it found that the standard Qwerty keyboard on a smartphone wasn’t fit for purpose.

“After this prediction, we connected the text entry model to an optimizer, which iterates through thousands of different user interface designs. No real user could of course try out all these designs. For this reason it is important that we could automatise the evaluation with our computational model,” said Jussi Jokinen, postdoctoral researcher at Aalto University.
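The loop Jokinen describes — a computational user model scoring candidate designs so an optimizer can search through thousands of them — can be illustrated with a short sketch. The Python below is a toy example only: the error model, function names, and parameters are illustrative assumptions, not the researchers’ actual system.

```python
# Toy sketch of simulation-based interface optimization: a simulated
# user model predicts how error-prone each candidate design is, and
# the optimizer keeps the design with the lowest predicted error.
# The error model and all names here are hypothetical assumptions.

def simulated_error_rate(key_size_mm: float, tremor_amplitude_mm: float) -> float:
    """Toy user model: larger keys reduce tap errors for a tremor of the
    given amplitude (a stand-in for a real predictive typing model)."""
    return min(1.0, tremor_amplitude_mm / (key_size_mm + tremor_amplitude_mm))

def optimize_key_size(tremor_amplitude_mm: float, candidates: list[float]) -> float:
    """Evaluate every candidate design with the simulated user and
    return the key size that yields the fewest predicted errors."""
    return min(candidates,
               key=lambda size: simulated_error_rate(size, tremor_amplitude_mm))

candidate_sizes = [4.0, 6.0, 8.0, 12.0]            # key sizes (mm) to evaluate
best = optimize_key_size(2.0, candidate_sizes)     # simulate a 2 mm tremor
```

The point of the sketch is the one Jokinen makes: because the “user” is a model, the inner evaluation can be run thousands of times, which no human participant could do.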

The outcome was a text entry interface that allowed a real user with essential tremor to ‘type almost error-free messages’.

“This is of course just a prototype interface, and not intended for consumer market. Our work as researchers is to come up with solutions,” added Jokinen. “I hope that designers pick up from here and with the help of our model and optimizer create individually targeted, polished interfaces.”