Apple researchers, in collaboration with Carnegie Mellon University, have developed the Never-Ending UI Learner AI system. It continuously interacts with mobile applications to improve its understanding of UI design patterns and new trends. The system autonomously explores apps, performing actions and classifying UI elements. The collected data trains models to predict tappability, draggability, and screen similarity. This approach allows the system to identify challenging scenarios that human-labeled datasets might overlook. The researchers demonstrated that after five training rounds, tappability prediction reached 86% accuracy. Apple hopes to apply these principles to learn more sophisticated representations of mobile UIs and interaction patterns.
Revolutionizing App Accessibility Through Continuous Machine Learning
Machine learning is increasingly applied to user interfaces (UIs) to predict semantic properties of on-screen elements. This improves accessibility, simplifies testing, and automates UI-related tasks, leading to more effective applications.
Traditionally, models rely on datasets of static screenshots rated by human annotators. This approach is costly and error-prone: annotators judge UI elements from visual cues in a snapshot rather than by interacting with the live app.
To address this, Apple researchers collaborated with Carnegie Mellon University to develop the Never-Ending UI Learner AI system. This system continuously interacts with real mobile applications, enhancing its understanding of UI design patterns and trends. It downloads apps from stores, thoroughly examines them, and identifies fresh and challenging training scenarios.
The Never-Ending UI Learner has logged more than 5,000 device-hours of exploration, performing over 500,000 actions across 6,000 apps. It trains three computer vision models: tappability prediction, draggability prediction, and screen similarity determination.
The system interacts with UI components through taps and swipes, classifying elements based on heuristics. The resulting data trains models to predict the tappability and draggability of UI elements, as well as the similarity of screens, without any human-labeled examples.
This active investigation of apps helps the machine identify challenging scenarios that human-labeled datasets may overlook. Because the crawler can tap an item and observe the actual outcome, its labels reflect real behavior rather than an annotator's guess from a static screenshot.
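To make the idea concrete, here is a minimal Python sketch of this kind of interaction-driven labeling. It is not Apple's implementation; the device driver (device.screenshot(), device.tap()) and the pixel-diff heuristic are hypothetical stand-ins for whatever the real crawler uses.

```python
from dataclasses import dataclass

@dataclass
class Element:
    bounds: tuple  # (x, y, width, height) of the element on screen
    crop: bytes    # cropped pixels, later used as model input

def center_of(bounds):
    x, y, w, h = bounds
    return (x + w // 2, y + h // 2)

def screens_differ(before: bytes, after: bytes, threshold: float = 0.02) -> bool:
    """Crude pixel-diff check: True if enough of the screen changed after the tap."""
    changed = sum(a != b for a, b in zip(before, after))
    return changed / max(len(before), 1) > threshold

def label_tappability(device, element: Element) -> dict:
    """Tap an element in a live app and turn the observed outcome into a label."""
    before = device.screenshot()            # hypothetical crawler/driver API
    device.tap(center_of(element.bounds))
    after = device.screenshot()
    # Heuristic: a visible screen change (navigation, dialog, state toggle)
    # means the element behaved as tappable; no change means it did not.
    label = "tappable" if screens_differ(before, after) else "not_tappable"
    return {"image": element.crop, "label": label}
```

Each returned record pairs an element's pixels with a behavior-derived label, which is exactly the kind of example the crawler accumulates instead of asking humans to rate screenshots.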
The models trained on this data improve over time, with tappability prediction reaching 86% accuracy after five training rounds.
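The round-by-round improvement follows from the structure of the loop: each crawl adds new self-labeled examples, and the model is retrained on the growing pool. Below is a hedged sketch of that loop, with crawl_round, train, and evaluate left as caller-supplied placeholders rather than APIs from the paper.

```python
def never_ending_learning(model, crawl_round, train, evaluate, rounds=5):
    """Run several crawl-and-retrain rounds; the dataset grows each round."""
    dataset = []                               # pool of self-labeled examples
    for r in range(rounds):
        dataset.extend(crawl_round(model))     # explore apps, tap/swipe, self-label
        model = train(model, dataset)          # retrain on the enlarged pool
        accuracy = evaluate(model)             # e.g., held-out tappability accuracy
        print(f"round {r + 1}: accuracy = {accuracy:.2%}")
    return model
```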
Never-ending learning enables systems to continually adapt and advance as they gather more data. While the current system focuses on simple semantics like tappability, Apple aims to apply similar principles to learn more complex representations of mobile UIs and interaction patterns.
Practical AI Solutions for Middle Managers
If you want to evolve your company with AI and stay competitive, take note of how the Never-Ending UI Learner revolutionizes app accessibility through continuous machine learning.
Here are some practical steps to incorporate AI into your workflow:
- Identify Automation Opportunities: Locate key customer interaction points that can benefit from AI.
- Define KPIs: Ensure your AI endeavors have measurable impacts on business outcomes.
- Select an AI Solution: Choose tools that align with your needs and provide customization.
- Implement Gradually: Start with a pilot, gather data, and expand AI usage judiciously.
For AI KPI management advice, connect with us at hello@itinai.com. Stay tuned on Telegram t.me/itinainews or Twitter @itinaicom for continuous insights into leveraging AI.
Spotlight on a Practical AI Solution:
Consider the AI Sales Bot from itinai.com/aisalesbot, designed to automate customer engagement 24/7 and manage interactions across all customer journey stages. Explore how AI can redefine your sales processes and customer engagement.