DEFENSE Committee: THOMMAS KEVIN SALES FLORES

A DOCTORAL DEFENSE committee has been registered by the program.
STUDENT : THOMMAS KEVIN SALES FLORES
DATE: 06/02/2026
TIME: 09:00
LOCATION: Remote
TITLE:

Evolving Vector Quantization-Aware Training: An Adaptive Method for Compressing Machine Learning Models


KEY WORDS:

TinyML; Evolving Vector Quantization; Quantization-Aware Training; Embedded Systems.


PAGES: 80
MAJOR AREA: Engineering
AREA: Electrical Engineering
SUMMARY:
The rapid evolution of Artificial Intelligence and the rise of Large Language Models impose significant barriers to deployment on edge devices, since reliance on cloud infrastructure compromises data privacy, increases latency, and renders critical operations infeasible in locations with unstable connectivity. Given the insufficiency of traditional static compression techniques, which frequently degrade accuracy or rely on opaque execution libraries, this thesis presents Evolving Vector Quantization-Aware Training (EVQAT) as an adaptive and auditable solution to these challenges. The methodology applies evolving vector quantization theory directly within training and post-training cycles, treating neural network weights as continuous data streams so that incremental clustering algorithms, such as AutoCloud, Mean-Shift, and Affinity Propagation, can dynamically adjust quantization codebooks in synchronization with parameter optimization. In this context, a codebook is an optimized set of prototype vectors, or centroids, that acts as a compact dictionary to represent and substitute the original high-dimensional data, thereby enabling extreme model compression. This strategy ensures that the partitioning of the vector space continuously adapts as learning evolves, mitigating quantization noise and preserving the essential statistical distribution of models ranging from multilayer neural networks to Transformer architectures. The robustness of the approach has been validated across a comprehensive spectrum of scenarios, including standard training without quantization, quantization-aware training in both evolving and Int8 modalities, and post-training quantization in evolving and Int8 variants, with successful implementations across different microcontroller platforms.
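To make the codebook idea concrete, the sketch below shows a minimal evolving scalar codebook: weights arrive as a stream, each is assigned to its nearest centroid (which then drifts toward it via a running mean), and a new centroid is spawned when no existing one is close enough. The class name, the distance-threshold rule, and all parameters are illustrative assumptions, not the thesis's actual AutoCloud formulation.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical evolving codebook for scalar weights (1-D centroids).
struct EvolvingCodebook {
    std::vector<double> centroids;       // prototype values (the "dictionary")
    std::vector<std::size_t> counts;     // how many weights each centroid encodes
    double radius;                       // beyond this distance, spawn a new centroid

    explicit EvolvingCodebook(double r) : radius(r) {}

    // Assign weight w to a codebook entry, updating the codebook incrementally.
    // Returns the index of the centroid that now represents w.
    std::size_t observe(double w) {
        std::size_t best = 0;
        double best_d = 1e300;
        for (std::size_t i = 0; i < centroids.size(); ++i) {
            double d = std::fabs(w - centroids[i]);
            if (d < best_d) { best_d = d; best = i; }
        }
        if (centroids.empty() || best_d > radius) {
            centroids.push_back(w);      // partition grows to follow the weight stream
            counts.push_back(1);
            return centroids.size() - 1;
        }
        counts[best] += 1;               // running-mean update: centroid tracks its cluster
        centroids[best] += (w - centroids[best]) / static_cast<double>(counts[best]);
        return best;
    }

    double decode(std::size_t idx) const { return centroids[idx]; }
};
```

In a quantization-aware training loop, each weight would be replaced by `decode(observe(w))` in the forward pass, so the codebook adapts in step with parameter optimization rather than being fixed after training.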
Unlike black-box market solutions, the proposed approach guarantees explainability, auditability, and full interoperability by automatically generating pure, self-contained C++ code that executes directly on diverse hardware architectures without depending on third-party libraries or specific operating systems.
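The generated code the abstract describes could plausibly take the following shape: the compressed layer is just two static arrays, a small codebook of centroids and per-weight indices into it, and inference dequantizes by table lookup with no runtime or library dependency. All names and values below are invented for the sketch, not output of the actual generator.

```cpp
#include <cstddef>
#include <cstdint>

// A 4-entry (2-bit) codebook and one row of quantized weights, stored as
// indices into the codebook. Both arrays would be emitted by the generator.
static const float kCodebook[4] = {-0.5f, -0.1f, 0.1f, 0.5f};
static const std::uint8_t kWeightIdx[4] = {3, 2, 1, 0};

// Dot product of a 4-element input with the dequantized weight row:
// a codebook lookup replaces each stored float weight.
float dense_forward(const float* x) {
    float acc = 0.0f;
    for (std::size_t i = 0; i < 4; ++i)
        acc += x[i] * kCodebook[kWeightIdx[i]];
    return acc;
}
```

Because everything is plain arrays and loops, the same source compiles unchanged for a desktop host or a bare-metal microcontroller toolchain, which is what makes the output auditable end to end.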

COMMITTEE MEMBERS:
President - 2885532 - IVANOVITCH MEDEIROS DANTAS DA SILVA
Internal - 2579664 - ALLAN DE MEDEIROS MARTINS
Internal - 1153006 - LUIZ AFFONSO HENDERSON GUEDES DE OLIVEIRA
External to the Institution - TIAGO FIGUEIREDO VIEIRA - UFAL
External to the Institution - DANIEL GOUVEIA COSTA - FEUP
External to the Institution - JUAN MOISES MAURICIO VILLANUEVA - UFPB
Announcement registered on: 07/01/2026 08:26
SIGAA | Superintendência de Tecnologia da Informação - (84) 3342 2210 | Copyright © 2006-2026 - UFRN