Lizhi Liao

ORCID: 0000-0001-9920-5855
Research Areas
  • Software System Performance and Reliability
  • Software Reliability and Analysis Research
  • Software Engineering Research
  • Software Testing and Debugging Techniques
  • Service-Oriented Architecture and Web Services
  • Distributed and Parallel Computing Systems
  • Cloud Computing and Resource Management
  • Advanced Neural Network Applications
  • Software-Defined Networks and 5G
  • Adversarial Robustness in Machine Learning
  • Machine Learning and Data Classification
  • Power Systems Fault Detection
  • Belt Conveyor Systems Engineering
  • IoT and Edge/Fog Computing
  • RFID Technology Advancements

Memorial University of Newfoundland
2024

University of Waterloo
2023-2024

Concordia University
2020-2023

Deep neural network (DNN) models typically have many hyperparameters that can be configured to achieve optimal performance on a particular dataset. Practitioners usually tune the hyperparameters of their DNNs by training a number of trial models with different hyperparameter configurations, then select the configuration that maximizes accuracy or minimizes loss. As such tuning focuses on the model's loss function, it is not clear and remains under-explored how the tuning process impacts other properties of the models, such as inference latency and model size....
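The trade-off the abstract points at can be illustrated with a minimal sketch: a grid search that records not only a toy model's structure but also its parameter count (a proxy for size) and measured inference latency. The `build_model`, `model_size`, and `inference_latency` names are hypothetical stand-ins, not code from the paper.

```python
import itertools
import time

def build_model(width, depth):
    """Toy stand-in for a DNN: `depth` layers of `width` weights each."""
    return [[0.5] * width for _ in range(depth)]

def model_size(model):
    """Number of parameters, a rough proxy for on-disk model size."""
    return sum(len(layer) for layer in model)

def inference_latency(model, reps=200):
    """Average wall-clock time to push one scalar input through all layers."""
    start = time.perf_counter()
    for _ in range(reps):
        x = 1.0
        for layer in model:
            x = sum(w * x for w in layer)
    return (time.perf_counter() - start) / reps

# Tune over a small grid and record properties beyond accuracy/loss.
results = []
for width, depth in itertools.product([8, 64], [2, 4]):
    m = build_model(width, depth)
    results.append({"width": width, "depth": depth,
                    "size": model_size(m),
                    "latency_s": inference_latency(m)})

for r in results:
    print(r)
```

Two configurations with similar loss can differ sharply in the `size` and `latency_s` columns, which is the kind of gap the study examines.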

10.1145/3506695 article EN ACM Transactions on Software Engineering and Methodology 2022-01-31

Software developers usually rely on in-house performance testing to detect performance regressions and locate their root causes. Such testing is typically resource- and time-consuming, making it impractical to conduct when the software is delivered in fast-paced release cycles. On the other hand, the operational data generated in the field environment provides rich information about the performance of a system and its runtime activities. Therefore, this work explores the idea of leveraging readily-available operational data to locate the root causes of performance regressions instead of running expensive in-house tests....
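A minimal sketch of the underlying idea, under the assumption that response times mined from field logs of two releases are compared directly (the `detect_regression` function and its 10% threshold are hypothetical, not the paper's actual technique):

```python
import statistics

def detect_regression(old_samples, new_samples, threshold=0.10):
    """Flag a performance regression when the median response time of the
    new version exceeds the old version's by more than `threshold`.
    A hypothetical stand-in for analyzing readily-available field data."""
    old_med = statistics.median(old_samples)
    new_med = statistics.median(new_samples)
    relative_change = (new_med - old_med) / old_med
    return relative_change > threshold, relative_change

# Response times (ms) extracted from operational logs of two releases.
v1 = [102, 98, 105, 99, 101, 100]
v2 = [131, 128, 135, 127, 130, 133]

regressed, change = detect_regression(v1, v2)
print(regressed, round(change, 2))  # → True 0.3
```

The point is that no dedicated load test is run: the comparison consumes data the system already produces in production.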

10.1109/tse.2021.3131529 article EN publisher-specific-oa IEEE Transactions on Software Engineering 2021-01-01

As software systems continuously grow in size and complexity, performance- and load-related issues have become more common than functional issues. Load testing is usually performed before releases to ensure that the system can still provide quality service under a certain load. Therefore, one of the main challenges of load testing is to design realistic workloads that represent the actual workload in the field. In particular, one of the most widely adopted and intuitive approaches is to directly replay workloads recorded in the field environment. However, replaying lengthy field workloads, e.g., 48...
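One simple way to shorten a lengthy recorded workload is to keep only a few representative time windows. The sketch below is a hypothetical reduction (ranking windows by how typical their request volume is); the paper's actual technique may differ.

```python
def reduce_workload(events, window=60, keep=2):
    """Condense a recorded workload by keeping the `keep` time windows
    whose request volume is closest to the mean window volume.
    Events are (timestamp_seconds, endpoint) pairs."""
    windows = {}
    for ts, endpoint in events:
        windows.setdefault(ts // window, []).append(endpoint)
    mean_size = sum(len(v) for v in windows.values()) / len(windows)
    # Stable sort: ties keep chronological order.
    ranked = sorted(windows.items(), key=lambda kv: abs(len(kv[1]) - mean_size))
    return dict(ranked[:keep])

# A toy 3-minute field recording with uneven traffic per minute.
events = [(1, "/a"), (2, "/a"), (65, "/b"), (130, "/a"), (131, "/b"), (132, "/a")]
reduced = reduce_workload(events, window=60, keep=2)
print(sorted(reduced))
```

Replaying only the retained windows trades some fidelity for a much shorter test, which is the tension the abstract describes.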

10.1109/tse.2024.3408079 article EN IEEE Transactions on Software Engineering 2024-05-31

During software development, developers often make numerous modifications to the code to address existing issues or implement new features. However, certain changes may inadvertently have a detrimental impact on overall system performance. To ensure that the performance of new releases does not degrade, current practices rely on system-level performance testing, such as load testing, or component-level performance testing to detect regressions. However, performance testing for the entire system is expensive and time-consuming, posing challenges to adapting to the rapid release cycles common in modern...

10.48550/arxiv.2408.08148 preprint EN arXiv (Cornell University) 2024-08-15

Database-centric architectures have been widely adopted in large-scale software systems across various domains to deal with the ever-increasing amount and complexity of data. Prior studies have proposed a wide range of performance analytic techniques aimed at assisting developers in pinpointing inefficiencies and diagnosing performance issues. However, directly applying these existing techniques to database-centric systems can be challenging and they may not perform well due to the unique nature of such systems. In particular, compared with typical database-backed systems like...

10.1145/3611643.3613893 article EN 2023-11-30

Performance regression is an important type of performance issue in software systems. It indicates that the same feature of a new version of the system performs worse than in previous versions, for example through increased response time or higher resource utilization. In order to prevent performance regressions, current practices often rely on conducting extensive performance testing before releasing the system into production and making release decisions based on the test results. However, faced with the great demand for resources and time to perform such testing, it is challenging to adopt these approaches in practice...

10.1109/icse-companion58688.2023.00056 article EN 2022 IEEE/ACM 44th International Conference on Software Engineering: Companion Proceedings (ICSE-Companion) 2023-05-01

Context. While in serverless computing, application resource management and operational concerns are generally delegated to the cloud provider, ensuring that applications meet their performance requirements is still a responsibility of the developers. Performance testing is a commonly used assessment practice; however, it traditionally requires visibility of and control over the execution environment. Objective. In this study, we investigate whether performance tests of serverless applications are stable, that is, if their results are reproducible, and what implications the serverless paradigm has for performance tests....
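Stability of a performance test is often judged by how much repeated, identical runs disagree. A minimal sketch using the coefficient of variation (the `runs` data and the 5% rule of thumb are illustrative assumptions, not results from the study):

```python
import statistics

def coefficient_of_variation(samples):
    """Relative dispersion of repeated measurements; a common yardstick
    for whether performance test results are reproducible."""
    return statistics.stdev(samples) / statistics.mean(samples)

# Mean response times (ms) from ten identical runs of the same test
# against the same serverless deployment (hypothetical data).
runs = [120, 118, 123, 119, 121, 122, 117, 120, 124, 116]
cv = coefficient_of_variation(runs)
# A small CV (e.g., under 5%) suggests the setup yields stable results;
# a large CV means a single run cannot be trusted to detect regressions.
print(f"CV = {cv:.3%}")
```

In a serverless setting, cold starts and opaque provider-side scheduling tend to inflate this dispersion, which is why reproducibility is worth checking explicitly.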

10.48550/arxiv.2107.13320 preprint EN other-oa arXiv (Cornell University) 2021-01-01