HYLR-FO: Hybrid Approach Using Language Models and Rule-Based Systems for On-Device Food Ordering
DOI:
10.3390/electronics14040775
Publication Date:
2025-02-17T16:31:17Z
AUTHORS (3)
ABSTRACT
Recent research has explored combining large language models (LLMs) with speech recognition for various services, but such applications require a reliable network connection to deliver quality service. On-device services, which do not rely on a network, must instead contend with resource limitations. This study proposes HYLR-FO, an efficient model that integrates a smaller language model (LM) with a rule-based system (RBS) to enable fast and reliable voice-based order processing in resource-constrained environments, approximating the performance of LLMs. By accounting for potential error scenarios and leveraging flexible natural language processing (NLP) and inference validation, the approach ensures both efficiency and robustness in order execution. Smaller LMs are used instead of LLMs to reduce resource usage. The LM transforms speech input, received via automatic speech recognition (ASR), into a consistent form that the RBS can process. The RBS then extracts the order and validates the extracted information. The experimental results show that HYLR-FO, trained and tested on 5000 order data samples, achieves up to 86% accuracy, comparable to the 90% accuracy of LLMs. Additionally, HYLR-FO processes up to 55 orders per second, significantly outperforming LLM-based approaches, which handle only 1.14 orders per second, a 48.25-fold improvement in processing speed in resource-constrained environments. This study demonstrates that HYLR-FO provides faster processing while achieving accuracy similar to that of LLMs in resource-constrained on-device environments. These findings have theoretical implications for optimizing LM efficiency in constrained settings and practical implications for real-time, low-resource AI applications. In particular, the design of HYLR-FO suggests its potential for efficient deployment in various commercial environments, achieving fast response times and low resource consumption with smaller models.
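The abstract describes a two-stage pipeline: a smaller LM normalizes free-form ASR output into a consistent form, and an RBS then extracts and validates the order. The sketch below is a minimal illustration of that flow, not the authors' implementation: the menu, the normalized "order <item> quantity <n>" format, and all function names are hypothetical, and the LM stage is stubbed rather than loading a real model.

```python
import re
from dataclasses import dataclass
from typing import Optional

# Hypothetical menu; the paper does not publish its item list or prices.
MENU = {"americano": 4500, "latte": 5000, "lemonade": 4000}

@dataclass
class Order:
    item: str
    quantity: int

def lm_normalize(asr_text: str) -> str:
    """Placeholder for the smaller LM that rewrites free-form ASR output into a
    fixed 'order <item> quantity <n>' form. A real system would invoke a
    fine-tuned on-device model here; this stub only lowercases and trims."""
    return asr_text.lower().strip()

# Rule-based extraction over the assumed normalized format.
ORDER_PATTERN = re.compile(r"order\s+(?P<item>[a-z ]+?)\s+quantity\s+(?P<qty>\d+)")

def rbs_extract(normalized: str) -> Optional[Order]:
    """Extract the order fields from the LM-normalized string."""
    m = ORDER_PATTERN.search(normalized)
    if not m:
        return None
    return Order(item=m.group("item").strip(), quantity=int(m.group("qty")))

def rbs_validate(order: Order) -> bool:
    """Inference validation: reject items not on the menu or implausible quantities."""
    return order.item in MENU and 1 <= order.quantity <= 20

def process_utterance(asr_text: str) -> Optional[Order]:
    """End-to-end sketch: ASR text -> LM normalization -> RBS extraction -> validation."""
    normalized = lm_normalize(asr_text)
    order = rbs_extract(normalized)
    if order is not None and rbs_validate(order):
        return order
    return None  # a real kiosk would re-prompt the user here

if __name__ == "__main__":
    print(process_utterance("Order latte quantity 2"))          # Order(item='latte', quantity=2)
    print(process_utterance("Order dragon fruit quantity 99"))  # None (fails validation)
```

Keeping the LM's job limited to normalization is what makes the rule-based stage tractable: the regex and menu check only ever see one canonical sentence shape, which is consistent with the fast, low-resource processing the abstract reports.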