|
Dissertations |
|
|
1
|
-
BRUNA ALICE OLIVEIRA DE BRITO
-
Development of an Intelligent System for Analysis and Management of Medical Texts about Cancer in Electronic Medical Records with Natural Language Processing
-
Advisor: ITAMIR DE MORAIS BARROCA FILHO
-
COMMITTEE MEMBERS:
-
AMALIA CINTHIA MENESES DO REGO
-
ELIAS JACOB DE MENEZES NETO
-
ITAMIR DE MORAIS BARROCA FILHO
-
JEAN MARIO MOREIRA DE LIMA
-
Date: 28-Jan-2025
-
-
Abstract
-
With the advancement of technology in the healthcare sector in Brazil, Electronic Health Records (EHRs) have become an essential means of managing clinical data, especially in the monitoring of chronic diseases like cancer, one of the main causes of mortality in the country. EHRs store a large amount of relevant information about the patient, including clinical history, diagnoses, and treatments. However, much of this data is in free-text format and lacks standardization, which makes its analysis and interpretation difficult.
In this context, Natural Language Processing (NLP) and Machine Learning (ML) techniques emerge as effective solutions for analyzing this information, enabling, for example, the automation of data extraction and text summarization. Thus, the objective of this work is to develop an intelligent system capable of extracting and organizing clinical information related to cancer present in EHRs, in order to allow efficient analysis and evaluation by the clinical team and management.
This work will utilize models based on transformer architectures, such as BERT and GPT, to identify medical entities and automatically summarize patient information. The application of these models aims to overcome the challenges associated with the complexity of clinical data and the scarcity of labeled datasets. The methodology will include the application of NLP techniques, the validation of the models, and the creation of dashboards for the visualization of processed data. It is expected that the results will contribute to improving the management and care of oncology patients by automating processes, optimizing the time of health professionals, and promoting a more precise analysis based on clinical evidence.
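The entity-recognition task described above can be illustrated, at its simplest, by a dictionary-based baseline. The sketch below is a minimal stand-in with invented patterns and labels; the dissertation's actual approach would replace this fixed lookup with a fine-tuned transformer such as BERT.

```python
import re

# Tiny illustrative gazetteer; a real system would use a fine-tuned
# transformer model instead of fixed patterns. All terms are hypothetical.
PATTERNS = {
    "DIAGNOSIS": r"\b(carcinoma|lymphoma|melanoma)\b",
    "TREATMENT": r"\b(chemotherapy|radiotherapy|surgery)\b",
}

def tag_entities(text: str) -> list[tuple[str, str]]:
    """Return (entity_text, label) pairs found in a free-text note."""
    found = []
    for label, pattern in PATTERNS.items():
        for m in re.finditer(pattern, text, flags=re.IGNORECASE):
            found.append((m.group(0), label))
    return found

note = "Patient with melanoma started chemotherapy in March."
print(tag_entities(note))
```

Such a baseline also shows why labeled data is scarce: every new term or spelling variant needs a new pattern, which is exactly the brittleness the transformer models aim to remove.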
|
|
|
2
|
-
MARCELO MARTINS PINTO
-
Development of a System to Support the Adoption of DevSecOps
-
Advisor: RAMON DOS REIS FONTES
-
COMMITTEE MEMBERS:
-
ITAMIR DE MORAIS BARROCA FILHO
-
RAMON DOS REIS FONTES
-
RODRIGO ROCHA GOMES E SOUZA
-
ROGER KREUTZ IMMICH
-
Date: 25-Feb-2025
-
-
Abstract
-
The Brazilian Judiciary, made up of ninety-four Courts and their respective Information and Communication Technology departments, faces the crucial challenge of unifying its judicial systems. Despite the efforts of the National Justice Council (CNJ), a significant disparity between administrative and support systems still persists. This diversity of solutions, combined with different personnel structures, technical capabilities, and infrastructures, makes it difficult to control the applications in use, meet deadlines and, above all, ensure software security. To address these challenges, it is proposed to implement software that supports IT areas in adopting the DevSecOps methodology, integrating development, security, and operations. This approach aims to break down knowledge silos, distribute responsibilities and information more efficiently, increase transparency in the IT area, improve quality and reduce development time, in addition to optimizing software maintenance throughout its life cycle. The development of the solution was based on a literature review to identify best practices and tools for vulnerability analysis, elicitation of minimum functional and non-functional requirements, and the choice of appropriate programming language, development tools, and database. Use cases and tools for proof of concept were defined, focused on application security assessments, using calls to the application programming interface (API) and presenting results through a web interface. Key benefits identified include centralized visualization of information about applications and their dependencies, effective vulnerability analysis, flexible integration of security tools, and expanded visibility into application security for the entire IT team.
|
|
|
3
|
-
LUIS FHELIPE RIBEIRO GOMES NETTO MARINHO
-
Architecture for Network Configuration Automation Using Infrastructure as Code (IaC)
-
Advisor: ROGER KREUTZ IMMICH
-
COMMITTEE MEMBERS:
-
AUGUSTO JOSE VENANCIO NETO
-
MARCOS CESAR MADRUGA ALVES PINHEIRO
-
RAMON DOS REIS FONTES
-
RODOLFO IPOLITO MENEGUETTE
-
ROGER KREUTZ IMMICH
-
Date: 27-Mar-2025
-
-
Abstract
-
The demand for communication infrastructure has increased drastically in recent years, driving the adoption of new technologies with a focus on automation, reconfiguration, error detection, and high availability. This shift is leveraged through increasingly programmable networks, particularly with the use of technologies like Software-Defined Networking (SDN) and methodologies such as Infrastructure as Code (IaC), allowing networks to be managed as software and opening up a range of possibilities for controlling network traffic flows. These processes require network assets to support protocols that enable this programmability, such as OpenFlow, and the transmission of telemetry information through structured data formats like YAML. However, these new methodologies are not easily applied to legacy equipment, which often lacks adequate support for these new protocols. In this context, this work aims to develop an architecture that allows both legacy and modern network equipment to be managed through an IaC methodology, enabling the automation of routine tasks, fault detection, early identification of potential network issues, and configuration checks in a production corporate environment. The focus of this work is on network access equipment, as it is more numerous and geographically dispersed, allowing for standardized configuration of assets and better control over access to connectivity resources. The architecture consists of eight modular and independent components, enabling individual modification and inclusion according to the needs of the network equipment. As partial results, the configuration of a family of network equipment was consolidated and standardized based on a model, avoiding human configuration errors and allowing for periodic validations of the configuration of these assets, with the architecture ready for the development of automation methodologies.
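The model-based, standardized configuration described above can be sketched as template rendering plus a drift check: the desired state is declared once and rendered per device, then validated against what is deployed. The device variables and CLI-style syntax below are hypothetical; real IaC tooling would sit in their place.

```python
from string import Template

# Hypothetical per-device variables; a real inventory would come from a
# structured source of truth (e.g. a YAML file).
DEVICE = {"hostname": "sw-access-01", "vlan": "120", "mgmt_ip": "10.0.120.2"}

# Simplified configuration template: the model from which every access
# switch in the family is rendered, avoiding hand-typed configs.
TEMPLATE = Template(
    "hostname $hostname\n"
    "interface vlan $vlan\n"
    " ip address $mgmt_ip/24\n"
)

def render(device: dict) -> str:
    """Render the desired configuration for one device."""
    return TEMPLATE.substitute(device)

def validate(config: str, device: dict) -> bool:
    """Periodic drift check: the running config must match the model."""
    return f"hostname {device['hostname']}" in config

cfg = render(DEVICE)
print(cfg)
print(validate(cfg, DEVICE))
```

The same render-then-validate loop generalizes to legacy gear: even equipment without OpenFlow support can usually accept rendered CLI configuration, which is what makes the approach applicable to both legacy and modern assets.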
|
|
|
4
|
-
FRANCISCO FABIO DE OLIVEIRA
-
Towards an IoT and Fuzzy Logic-Based Architecture to Optimize Water Resource Utilization
-
Advisor: ROGER KREUTZ IMMICH
-
COMMITTEE MEMBERS:
-
GUSTAVO GIRAO BARRETO DA SILVA
-
RAFAEL LOPES GOMES
-
RODOLFO IPOLITO MENEGUETTE
-
ROGER KREUTZ IMMICH
-
Date: 14-Jul-2025
-
-
Abstract
-
The increasing demand for food security and the scarcity of natural resources pose global challenges, particularly in agriculture, which accounts for a significant share of water consumption. In Brazil, agriculture holds great economic importance; however, its intensive water use necessitates the adoption of more sustainable practices. Managing irrigation adaptively and efficiently is challenging due to the complexity of multiple environmental variables. This reduces water use efficiency and negatively impacts agricultural productivity. IrrigaFlow is a modular architecture that automates irrigation through IoT, fuzzy logic, and distributed processing. It is composed of three layers: IoT Module, Network Edge, and Cloud, enabling real-time monitoring and adjustments based on local environmental data. This approach optimizes water use and enhances responsiveness to climatic conditions. Simulations conducted in a controlled environment allowed for evaluating the architecture’s efficiency across varied scenarios, highlighting its capacity to adjust water management to the specific needs of each crop and environment. IrrigaFlow emerges as a promising alternative to address food security challenges, reduce water waste, and promote sustainable agricultural practices.
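The fuzzy-logic core of an irrigation controller like the one described can be sketched with triangular membership functions, Mamdani-style rule firing, and weighted-average defuzzification. All ranges, rules, and output durations below are illustrative assumptions, not values from IrrigaFlow.

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def irrigation_minutes(soil_moisture: float, temperature: float) -> float:
    """Two illustrative rules combined by weighted average.

    Membership ranges and output levels are invented for the sketch;
    a deployed controller would calibrate them per crop and environment.
    """
    dry = tri(soil_moisture, 0, 10, 40)   # degree to which the soil is "dry"
    hot = tri(temperature, 25, 40, 55)    # degree to which the day is "hot"
    # Rule 1: dry AND hot -> long watering (30 min)
    # Rule 2: not dry     -> short watering (5 min)
    w1 = min(dry, hot)
    w2 = 1.0 - dry
    if w1 + w2 == 0:
        return 0.0
    return (w1 * 30 + w2 * 5) / (w1 + w2)

print(irrigation_minutes(10, 40))   # fully dry and hot -> 30.0
print(irrigation_minutes(50, 20))   # moist soil -> 5.0
```

Because the output varies smoothly with the inputs, small sensor fluctuations produce small adjustments rather than on/off oscillation, which is the usual argument for fuzzy control in this setting.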
|
|
|
5
|
-
ALIKSON SUEL COSTA DE OLIVEIRA
-
Strategic Planning Monitoring Based on Processes: An Integrated Tool with SUAP for Public Higher Education Institutions.
-
Advisor: JEAN MARIO MOREIRA DE LIMA
-
COMMITTEE MEMBERS:
-
ANDRÉ GUSTAVO DUARTE DE ALMEIDA
-
ITAMIR DE MORAIS BARROCA FILHO
-
JEAN MARIO MOREIRA DE LIMA
-
Date: 07-Aug-2025
-
-
Abstract
-
Public higher education institutions face significant challenges in integrating strategic planning with budget execution. The lack of an integrated system for tracking planned and executed resources results in difficulties in maintaining a clear and up-to-date view of expenditures, compromising transparency, financial control, and resource management efficiency. Currently, budget control is fragmented and, in many cases, carried out manually or through systems that do not provide a consolidated view of the data, making strategic decision-making more difficult. To address these challenges, the proposed solution is the creation of a computational tool integrated with the Unified Public Administration System (SUAP). This tool will allow for monitoring strategic planning and budget execution within the case study of IFRN, centralizing and automating processes such as budget reallocation, credit movement, and expense requests, using Python and Django to ensure execution aligns with institutional planning. The tool's development encompasses planning, implementation, testing, validation, and staff training. Although the tool is still under development, the expected results are promising. Operational efficiency is anticipated to improve by reducing the time and effort spent on budget control, allowing those involved to focus on strategic decisions. The centralization and automation of processes will eliminate fragmentation and the risk of inconsistencies, while updated and centralized data will enable managers to make more effective decisions aligned with the institution's strategic goals. These results will contribute to more efficient and transparent public resource management, strengthening IFRN's institutional performance and potentially serving as a model for other federal institutions.
|
|
|
6
|
-
SAINT CLAIR DA CUNHA LIMA
-
Assistente de Busca: A RAG approach for semantic search in documents from ALERN
-
Advisor: DANIEL SABINO AMORIM DE ARAUJO
-
COMMITTEE MEMBERS:
-
ANDRE MORAIS GURGEL
-
DANIEL SABINO AMORIM DE ARAUJO
-
ELIAS JACOB DE MENEZES NETO
-
THAIS GAUDENCIO DO REGO
-
Date: 25-Aug-2025
-
-
Abstract
-
The unprecedented growth in the creation and persistence of unstructured textual documents in public institutions poses challenges for efficient information retrieval and data analysis. This research addresses these challenges by proposing a prototype of a search assistant using the Retrieval-Augmented Generation (RAG) approach, specifically applied to documents produced by the Assembleia Legislativa do Estado do Rio Grande do Norte (Alern). The proposed system leverages Natural Language Processing (NLP) techniques, vector databases, and Large Language Models (LLMs) to enable semantic search and the generation of relevant content as answers to query inputs. The research introduces an architecture capable of retrieving document fragments based on semantic similarity. User-provided queries are processed and used to search content with contextual relevance, which is then synthesized into coherent and contextually appropriate responses through an LLM. Results from automated evaluations using BERTScore demonstrate the system's effectiveness in retrieving information based on user input data, with precision and recall achieving values of 79% and 69% respectively, which are satisfactory values in text generation scenarios. Being powered by the RAG approach, the proposed assistant not only reduces the cognitive load associated with the manual analysis of large document collections but also provides a scalable and adaptable solution for continuously evolving datasets. This research contributes to bridging the gap between the availability of public data and the generation of searchable information, aligning with goals of transparency and accessibility in the legislative environment.
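The retrieval stage of a RAG pipeline like the one described ranks document fragments by vector similarity before an LLM synthesizes the answer. The sketch below substitutes bag-of-words counts for the dense sentence embeddings a real system would obtain from an encoder model; the documents and query are invented.

```python
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    # Bag-of-words stand-in for the dense embeddings a real RAG
    # pipeline would obtain from a sentence-encoder model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k fragments most similar to the query."""
    qv = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(qv, vectorize(d)), reverse=True)
    return ranked[:k]

docs = [
    "Bill creating the state water authority",
    "Minutes of the education committee session",
]
print(retrieve("water authority bill", docs))
```

In the full pipeline, the retrieved fragments are inserted into the LLM prompt as context, which is what grounds the generated answer in the institution's own documents.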
|
|
|
7
|
-
VITOR GONÇALEZ LINDBERGH
-
Engineering a Permissioned Blockchain Architecture for Cross-Institutional EHR Sharing
-
Advisor: ITAMIR DE MORAIS BARROCA FILHO
-
COMMITTEE MEMBERS:
-
DIEGO DA SILVA PEREIRA
-
ITAMIR DE MORAIS BARROCA FILHO
-
RAMON DOS REIS FONTES
-
Date: 28-Aug-2025
-
-
Abstract
-
The healthcare industry faces critical challenges in managing Electronic Health Records (EHRs), particularly regarding data security, interoperability, and patient privacy. Traditional centralized systems often lead to fragmented data storage, vulnerabilities, and inefficiencies that compromise care and integrity. This dissertation explores how blockchain technology, leveraging decentralization, immutability, and transparency, can enhance secure and interoperable EHR management. A systematic mapping study of 35 peer-reviewed papers from Scopus highlighted blockchain's potential to improve interoperability, security, and patient control, while also identifying ongoing concerns with scalability, privacy, and regulatory compliance. In response, this work proposes a blockchain-based architecture that defines clear functional and non-functional requirements focused on security, interoperability, and consent management. The solution integrates smart contracts, consensus mechanisms, and a permissioned blockchain network to enable robust, auditable data exchange across healthcare institutions. A proof-of-concept (PoC) built using Hyperledger Fabric demonstrates secure EHR transactions, fine-grained access control, and patient-driven consent management. The PoC was validated against defined Quality of Service (QoS) metrics, confirming technical feasibility and alignment with system goals. This research contributes a viable architectural approach for secure and efficient EHR sharing and lays the foundation for future work on scalability, infrastructure integration, and regulatory alignment.
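The tamper-evidence that a ledger provides for consent records can be illustrated with a toy hash chain: each entry commits to the previous one, so altering any record invalidates every later hash. This is a conceptual stand-in only; Hyperledger Fabric replaces the single in-memory list below with replicated, endorsed world state, channels, and chaincode.

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the hash of the previous entry."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class ConsentLedger:
    """Toy append-only ledger of patient consent records."""

    def __init__(self):
        self.chain = []          # list of (record, hash) pairs
        self.prev = "0" * 64     # genesis hash

    def append(self, record: dict) -> str:
        h = block_hash(record, self.prev)
        self.chain.append((record, h))
        self.prev = h
        return h

    def verify(self) -> bool:
        """Recompute every hash; any tampered record breaks the chain."""
        prev = "0" * 64
        for record, h in self.chain:
            if block_hash(record, prev) != h:
                return False
            prev = h
        return True

ledger = ConsentLedger()
ledger.append({"patient": "p1", "grants": ["hospital-A"]})
ledger.append({"patient": "p1", "grants": ["hospital-A", "clinic-B"]})
print(ledger.verify())                      # True: chain is intact
ledger.chain[0][0]["grants"] = ["attacker"]
print(ledger.verify())                      # False: tampering detected
```

In the permissioned setting, this same property lets auditors at any institution detect retroactive edits to consent grants without trusting the institution that stored them.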
|
|
|
8
|
-
WELLIGTON MIGUEL DA SILVA
-
Historical analysis of architectural violations in Clojure systems: a longitudinal quantitative study
-
Advisor: EIJI ADACHI MEDEIROS BARBOSA
-
COMMITTEE MEMBERS:
-
EIJI ADACHI MEDEIROS BARBOSA
-
UIRA KULESZA
-
ELDER JOSÉ REIOLI CIRILO
-
Date: 19-Sep-2025
-
-
Abstract
-
Software architecture consists of fundamental decisions that guide the construction and evolution of a system. Over time, accumulated modifications throughout the software lifecycle can violate previously defined criteria, characterizing architectural degradation. The absence of a formal continuous evaluation process causes these problems to be detected reactively, often only after causing significant impacts. To mitigate these risks, static analysis tools can be adopted to validate architecture proactively, enabling constant monitoring. One example of a tool that follows this approach is Clj-Depend, which validates dependencies between modules in systems implemented in the Clojure programming language. In this context, this work aims to investigate the occurrence and evolution of these violations in systems implemented in Clojure. For this purpose, a quantitative longitudinal study based on software repository mining was conducted, involving five services from a multinational technology company, which together process, on average, more than eight hundred requests per minute. The results showed that all services presented violation accumulation correlated with code base growth, with concentration in few critical rules. Statistical analyses confirmed a significant association between lines of code and architectural degradation when adequate governance is absent. Despite scope limitations, the study demonstrated that architectural degradation is monitorable and can be integrated into the team's development workflow through automated processes. The developed protocol offers a systematic approach for continuous architectural auditing, contributing to more sustainable software evolution practices.
|
|
|
9
|
-
NALBERT GABRIEL MELO LEAL
-
Automatic error detection techniques applied to supervised datasets for handling labels from weakly supervised learning pipelines
-
Advisor: DANIEL SABINO AMORIM DE ARAUJO
-
COMMITTEE MEMBERS:
-
ARAKEN DE MEDEIROS SANTOS
-
DANIEL SABINO AMORIM DE ARAUJO
-
ELIAS JACOB DE MENEZES NETO
-
JOAO CARLOS XAVIER JUNIOR
-
Date: 22-Sep-2025
-
-
Abstract
-
The high cost of data labeling for training machine learning models has motivated the development of weakly supervised learning (WSL); in turn, this introduces noise into the labels, affecting the models' performance. Among WSL techniques, data programming (DP) stands out by using noisy sources (such as heuristics and pre-trained models) to perform automated data labeling at a low cost, resulting in potentially inaccurate labels that impact the end-model's performance. The objective of this work is to evaluate whether techniques that detect noisy instances can improve the performance of the final model obtained with the DP pipeline for classification tasks. For this, an experiment was conducted to identify the impact on performance and cost that the use of noisy instance detection has on the DP pipeline. Some of the techniques for the experiment were already known by the author but not previously linked to WSL, while others were selected from a literature review that searched for noise detection techniques already applied to WSL. The impact of each technique on the end-model's performance was evaluated by the Matthews correlation coefficient metric, and the cost by the execution time of the pipeline in which the technique was introduced. The results demonstrate that the application of detection techniques, in most cases, degraded the performance of the end-models in a statistically significant manner. Only 4% of the pipelines with detection showed a performance improvement that was statistically significant and superior to the baseline. The improvements, when they occurred, were sporadic and accompanied by a high computational cost. Furthermore, the baselines, especially those with the hyper label model and majority-vote label models, showed a better balance between performance and cost. Thus, the DP pipeline without detection techniques proved to be a more efficient approach.
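The evaluation metric used above, the Matthews correlation coefficient, is straightforward to compute from confusion-matrix counts, which is worth seeing because it explains why it is preferred over accuracy on the imbalanced labels typical of DP pipelines:

```python
import math

def mcc(y_true: list[int], y_pred: list[int]) -> float:
    """Matthews correlation coefficient for binary labels (0/1).

    Ranges from -1 (total disagreement) through 0 (chance level)
    to +1 (perfect prediction); returns 0.0 when undefined.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(mcc([1, 1, 0, 0], [1, 1, 0, 0]))   # 1.0 for a perfect classifier
print(mcc([1, 1, 0, 0], [0, 0, 1, 1]))   # -1.0 when every label is flipped
```

Because all four confusion-matrix cells appear in the formula, a classifier that trivially predicts the majority class scores near zero rather than near the majority-class frequency, making MCC robust for comparing pipelines trained on noisy, imbalanced labels.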
|
|
|
10
|
-
HELTON PIERRE LUCENA DE MEDEIROS
-
An IoT-based Air Quality Monitoring Platform
-
Advisor: GUSTAVO GIRAO BARRETO DA SILVA
-
COMMITTEE MEMBERS:
-
GUSTAVO GIRAO BARRETO DA SILVA
-
ITAMIR DE MORAIS BARROCA FILHO
-
IVAN SARAIVA SILVA
-
ROGER KREUTZ IMMICH
-
Date: 26-Sep-2025
-
-
Abstract
-
This dissertation investigates the limited coverage of air-quality monitoring in Brazil and proposes an alternative solution: an IoT-based platform capable of expanding environmental observability and operational automation. The object of study is a layered, software- and hardware-agnostic architecture designed to integrate acquisition (IoT devices), transmission, processing and assessment (calculation of the IQAr), time-series persistence, and interactive visualization, with real-time notifications. The overall objective was to design and validate, in a reproducible testbed, such an architecture; the specific objectives comprised: (i) modeling and specifying a layered architecture for IoT-based air-quality monitoring; (ii) implementing the experimental environment (testbed) with devices and sensors simulated in Node-RED; (iii) demonstrating Zabbix's potential as a monitoring platform applied both to air quality and to IoT devices; and (iv) analyzing, from a technical-operational standpoint, the results produced by the testbed and the feasibility of the proposed solution. All objectives were achieved. The results show automated configuration of monitoring via Zabbix templates, dashboards that display a map and per-station time series (with pollutant-level detail and meteorological correlation), device and sensor supervision, and real-time alerts delivered via instant messenger. The architecture proved applicable to the technological arrangement used. Zabbix exhibited very efficient performance and full compatibility with the application context, standing out as a tool with high potential for monitoring IoT devices and air-quality systems; furthermore, the scarcity of academic studies analyzing its use in this domain underscores this research's contribution.
As an overall contribution, the work delivers a vendor-agnostic architectural guide, a replicable testbed, and a robust proof of concept of the Node-RED–MQTT–Zabbix–Grafana integration for environmental monitoring, pointing to a concrete, scalable, and economically viable path to expand air-quality monitoring coverage in the country.
|
|
|
11
|
-
ALEXANDRE ESTRELA DE LACERDA NOBREGA
-
Proposed machine learning-based tool for precipitation time series forecasting under climate change scenarios
-
Advisor: ITAMIR DE MORAIS BARROCA FILHO
-
COMMITTEE MEMBERS:
-
ANDRE MORAIS GURGEL
-
DANIEL SABINO AMORIM DE ARAUJO
-
GUTEMBERG GONÇALVES DOS SANTOS JÚNIOR
-
ITAMIR DE MORAIS BARROCA FILHO
-
JUSCIMARA GOMES AVELINO
-
Date: 01-Dec-2025
-
-
Abstract
-
Climate change, largely driven by human activity, has led to severe global impacts, particularly the increase in extreme weather events. The projection of these changes relies on global climate models, which simulate climate behavior based on different emission scenarios. However, these models have low spatial resolution, limiting their direct application to regional and local contexts. To address this limitation, statistical downscaling techniques have been employed to generate time series of climate variables, such as precipitation, with greater regional detail. With technological advancements, artificial intelligence and machine learning techniques have shown promising results in the development of such models, often outperforming traditional methods. Nevertheless, their application remains limited, partly due to the lack of standardized procedures and the complexity involved. Furthermore, after applying statistical downscaling, the generated data are restricted to specific adjusted points, requiring the use of spatialization models to estimate values in surrounding areas. This entire process, from downscaling to spatialization, is often inaccessible to professionals who need these data but lack technical expertise in the methodologies used. In this context, the present study investigated the main techniques employed in statistical downscaling and proposed a standardized workflow for their application. The research was conducted in a specific region as a case study, integrating data from global climate models with local information. Based on this, a scalable computational tool was developed, featuring an intuitive interface capable of applying the constructed models and generating time series data in a practical and accessible manner, based on the user’s specified location.
|
|