Tecnalia Research & Innovation

Short name: Tecnalia
Country, city: Spain, Donostia / San Sebastian
Publications: 3,514
Citations: 103,613
h-index: 129
Top-3 journals
Sensors (48 publications)
Energies (44 publications)
Top-3 foreign organizations
University of New South Wales (46 publications)
University of Tübingen (41 publications)

Most cited in 5 years

Ali S., Abuhmed T., El-Sappagh S., Muhammad K., Alonso-Moral J.M., Confalonieri R., Guidotti R., Ser J.D., Díaz-Rodríguez N., Herrera F.
Information Fusion (Scimago Q1, WoS Q1)
2023-11-01 citations by CoLab: 562
Artificial intelligence (AI) is currently being utilized in a wide range of sophisticated applications, but the outcomes of many AI models are challenging to comprehend and trust due to their black-box nature. Usually, it is essential to understand the reasoning behind an AI model’s decision-making. Thus, the need for eXplainable AI (XAI) methods for improving trust in AI models has arisen. XAI has become a popular research subject within the AI field in recent years. Existing survey papers have tackled the concepts of XAI, its general terms, and post-hoc explainability methods, but no review has yet examined the assessment methods, available tools, XAI datasets, and other related aspects. Therefore, in this comprehensive study, we provide readers with an overview of the current research and trends in this rapidly emerging area with a case study example. The study starts by explaining the background of XAI and common definitions, and by summarizing recently proposed techniques in XAI for supervised machine learning. The review divides XAI techniques into four axes using a hierarchical categorization system: (i) data explainability, (ii) model explainability, (iii) post-hoc explainability, and (iv) assessment of explanations. We also introduce available evaluation metrics as well as open-source packages and datasets with future research directions. Then, the significance of explainability in terms of legal demands, user viewpoints, and application orientation is outlined, termed XAI concerns. This paper advocates for tailoring explanation content to specific user types. An examination of XAI techniques and evaluation was conducted by looking at 410 critical articles, published between January 2016 and October 2022, in reputed journals and using a wide range of research databases as a source of information.
The article is aimed at XAI researchers who are interested in making their AI models more trustworthy, as well as at researchers from other disciplines who are looking for effective XAI methods to complete tasks with confidence while communicating meaning from data.
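The post-hoc, model-agnostic family this survey categorizes can be illustrated with a minimal permutation-importance sketch. Everything below (the toy model, data, and metric) is a hypothetical stand-in for illustration, not material from the paper:

```python
import random

def permutation_importance(model, X, y, feature_idx, metric, n_repeats=10, seed=0):
    # Post-hoc, model-agnostic explanation: measure how much the metric
    # drops when one feature column is randomly shuffled. A large drop
    # suggests the model relies on that feature.
    rng = random.Random(seed)
    baseline = metric(y, [model(row) for row in X])
    drops = []
    for _ in range(n_repeats):
        col = [row[feature_idx] for row in X]
        rng.shuffle(col)
        X_perm = [row[:feature_idx] + [v] + row[feature_idx + 1:]
                  for row, v in zip(X, col)]
        drops.append(baseline - metric(y, [model(row) for row in X_perm]))
    return sum(drops) / n_repeats

# Hypothetical setup: a "model" that only ever looks at feature 0.
accuracy = lambda y_true, y_pred: sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
X = [[0, 1], [1, 0], [0, 0], [1, 1]] * 5
y = [row[0] for row in X]
model = lambda row: row[0]
print(permutation_importance(model, X, y, 0, accuracy))  # clearly positive
print(permutation_importance(model, X, y, 1, accuracy))  # exactly 0.0: unused feature
```

The same shuffle-and-rescore idea works with any black-box model and any metric, which is why the survey groups it under post-hoc explainability.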
Salvia M., Reckien D., Pietrapertosa F., Eckersley P., Spyridaki N., Krook-Riekkola A., Olazabal M., De Gregorio Hurtado S., Simoes S.G., Geneletti D., Viguié V., Fokaides P.A., Ioannou B.I., Flamos A., Csete M.S., et al.
2021-01-01 citations by CoLab: 351
Cities across the globe recognise their role in climate mitigation and are acting to reduce carbon emissions. Knowing whether cities set ambitious climate and energy targets is critical for determining their contribution towards the global 1.5 °C target, partly because it helps to identify areas where further action is necessary. This paper presents a comparative analysis of the mitigation targets of 327 European cities, as declared in their local climate plans. The sample encompasses over 25% of the EU population and includes cities of all sizes across all Member States, plus the UK. The study analyses whether the type of plan, city size, membership of climate networks, and regional location are associated with different levels of mitigation ambition. Results reveal that 78% of the cities have a GHG emissions reduction target. However, with an average target of 47%, European cities are not on track to reach the Paris Agreement: they need to roughly double their ambitions and efforts. Some cities are ambitious, e.g. 25% of our sample (81) aim to reach carbon neutrality, with the earliest target date being 2020. 90% of these cities are members of the Climate Alliance and 75% of the Covenant of Mayors. City size is the strongest predictor for carbon neutrality, whilst climate network(s) membership, combining adaptation and mitigation into a single strategy, and local motivation also play a role. The methods, data, results and analysis of this study can serve as a reference and baseline for tracking climate mitigation ambitions across European and global cities.
Muhammad K., Ullah A., Lloret J., Ser J.D., de Albuquerque V.H.
2021-07-01 citations by CoLab: 346
Advances in information and signal processing technologies have a significant impact on autonomous driving (AD), improving driving safety while minimizing the efforts of human drivers with the help of advanced artificial intelligence (AI) techniques. Recently, deep learning (DL) approaches have solved several real-world problems of complex nature. However, their strengths in terms of control processes for AD have not been deeply investigated and highlighted yet. This survey highlights the power of DL architectures in terms of reliability and efficient real-time performance and overviews state-of-the-art strategies for safe AD, with their major achievements and limitations. Furthermore, it covers major embodiments of DL along the AD pipeline including measurement, analysis, and execution, with a focus on road, lane, vehicle, pedestrian, drowsiness detection, collision avoidance, and traffic sign detection through sensing and vision-based DL methods. In addition, we discuss the performance of several reviewed methods using different evaluation metrics, with critiques of their pros and cons. Finally, this survey highlights the current issues of safe DL-based AD and offers recommendations for future research, providing reference material for newcomers and researchers willing to join this vibrant area of Intelligent Transportation Systems.
Muhammad K., Khan S., Ser J.D., Albuquerque V.H.
2021-03-01 citations by CoLab: 269
Brain tumors are among the most dangerous cancers in people of all ages, and their grade recognition is a challenging problem for radiologists in health monitoring and automated diagnosis. Recently, numerous methods based on deep learning have been presented in the literature for brain tumor classification (BTC) in order to assist radiologists with better diagnostic analysis. In this overview, we present an in-depth review of the surveys published so far and recent deep learning-based methods for BTC. Our survey covers the main steps of deep learning-based BTC methods, including preprocessing, feature extraction, and classification, along with their achievements and limitations. We also investigate the state-of-the-art convolutional neural network models for BTC by performing extensive experiments using transfer learning with and without data augmentation. Furthermore, this overview describes available benchmark data sets used for the evaluation of BTC. Finally, this survey not only looks into the past literature on the topic but also builds on it to delve into the future of this area, enumerating research directions that should be followed, especially for personalized and smart healthcare.
Argüeso D., Picon A., Irusta U., Medela A., San-Emeterio M.G., Bereciartua A., Alvarez-Gila A.
2020-08-01 citations by CoLab: 237
Prompt plant disease detection is critical to prevent plagues and to mitigate their effects on crops. The most accurate automatic algorithms for plant disease identification using plant field images are based on deep learning. These methods require the acquisition and annotation of large image datasets, which is frequently technically or economically unfeasible. This study introduces Few-Shot Learning (FSL) algorithms for plant leaf classification using deep learning with small datasets. For the study 54,303 labeled images from the PlantVillage dataset were used, comprising 38 plant leaf and/or disease types (classes). The data was split into a source (32 classes) and a target (6 classes) domain. The Inception V3 network was fine-tuned in the source domain to learn general plant leaf characteristics. This knowledge was transferred to the target domain to learn new leaf types from few images. FSL using Siamese networks and Triplet loss was used and compared to classical fine-tuning transfer learning. The source and target domain sets were split into a training set (80%) to develop the methods and a test set (20%) to obtain the results. Algorithm performance was evaluated using the total accuracy, and the precision and recall per class. For the FSL experiments the algorithms were trained with different numbers of images per class and the experiments were repeated 20 times to statistically characterize the results. The accuracy in the source domain was 91.4% (32 classes), with a median precision/recall per class of 93.8%/92.6%. The accuracy in the target domain was 94.0% (6 classes) learning from all the training data, and the median accuracy (90% confidence interval) learning from 1 image per class was 55.5 (46.0–61.7)%. Median accuracies of 80.0 (76.4–86.5)% and 90.0 (86.1–94.2)% were reached for 15 and 80 images per class, yielding a reduction of 89.1% (80 images/class) in the training dataset with only a 4-point loss in accuracy. 
The FSL method outperformed classical fine-tuning transfer learning, which had accuracies of 18.0 (16.0–24.0)% and 72.0 (68.0–77.3)% for 1 and 80 images per class, respectively. It is possible to learn new plant leaf and disease types from very small datasets using deep learning Siamese networks with Triplet loss, achieving almost a 90% reduction in training data needs and outperforming classical learning techniques for small training sets.
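The Triplet loss underlying this Siamese approach can be sketched in a few lines of plain Python. The 2-D embeddings and margin below are illustrative values, not numbers from the study:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Hinge form: push the same-class (positive) embedding closer to the
    # anchor than the different-class (negative) one, by at least `margin`.
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)

# Illustrative 2-D embeddings: the margin constraint is already satisfied...
print(triplet_loss([0.0, 0.0], [0.1, 0.0], [3.0, 0.0]))  # 0.0
# ...and violated here, so the loss is positive and would drive training.
print(triplet_loss([0.0, 0.0], [2.0, 0.0], [1.0, 0.0]))  # 2.0
```

During training, the network producing the embeddings is updated so that triplets of the second kind become triplets of the first kind.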
Osaba E., Villar-Rodriguez E., Del Ser J., Nebro A.J., Molina D., LaTorre A., Suganthan P.N., Coello Coello C.A., Herrera F.
2021-07-01 citations by CoLab: 237
In the last few years, the formulation of real-world optimization problems and their efficient solution via metaheuristic algorithms has been a catalyst for a myriad of research studies. In spite of decades of historical advancements on the design and use of metaheuristics, large difficulties still remain with regard to the understandability, algorithmic design uprightness, and performance verifiability of new technical achievements. A clear example stems from the scarce replicability of works dealing with metaheuristics used for optimization, which is often infeasible due to ambiguity and lack of detail in the presentation of the methods to be reproduced. Additionally, in many cases, the statistical significance of their reported results is questionable. This work provides the audience with a proposal of good practices which should be embraced when conducting studies about metaheuristic methods used for optimization, in order to ensure scientific rigor, value, and transparency. To this end, we introduce a step-by-step methodology covering every research phase that should be followed when addressing this scientific field. Specifically, frequently overlooked yet crucial aspects and useful recommendations will be discussed with regard to the formulation of the problem, solution encoding, implementation of search operators, evaluation metrics, design of experiments, and considerations for real-world performance, among others. Finally, we outline important considerations, challenges, and research directions for the success of newly developed optimization metaheuristics in their deployment and operation over real-world application environments.
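As a toy illustration of the paired-run comparisons that such experimental-design recommendations build on (the run scores below are invented, and the paper advocates proper non-parametric significance tests, of which this counting is only the first step):

```python
def sign_test_wins(a_runs, b_runs):
    # Per-run win counts for two algorithms over paired independent runs
    # (minimization of best fitness). Only the counting step of a sign
    # test; a real study would follow with a significance test.
    wins_a = sum(1 for a, b in zip(a_runs, b_runs) if a < b)
    wins_b = sum(1 for a, b in zip(a_runs, b_runs) if b < a)
    ties = len(a_runs) - wins_a - wins_b
    return wins_a, wins_b, ties

# Hypothetical best-fitness values from 10 paired runs of two metaheuristics.
a = [0.12, 0.10, 0.15, 0.11, 0.13, 0.10, 0.14, 0.12, 0.11, 0.10]
b = [0.14, 0.10, 0.18, 0.13, 0.12, 0.15, 0.16, 0.13, 0.14, 0.12]
print(sign_test_wins(a, b))  # (8, 1, 1): A wins 8 runs, B wins 1, 1 tie
```

Reporting all runs (not just the best one) is exactly the kind of practice the paper argues for.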
Mendibil U., Ruiz-Hernandez R., Retegi-Carrion S., Garcia-Urquia N., Olalde-Graells B., Abarrategi A.
2020-07-30 citations by CoLab: 196
The extracellular matrix (ECM) is a complex network with multiple functions, including specific functions during tissue regeneration. Precisely, the properties of the ECM have been thoroughly used in tissue engineering and regenerative medicine research, aiming to restore the function of damaged or dysfunctional tissues. Tissue decellularization is gaining momentum as a technique to obtain potentially implantable decellularized extracellular matrix (dECM) with well-preserved key components. Interestingly, the tissue-specific dECM is becoming a feasible option to carry out regenerative medicine research, with multiple advantages compared to other approaches. This review provides an overview of the most common methods used to obtain the dECM and summarizes the strategies adopted to decellularize specific tissues, aiming to provide a helpful guide for future research development.
Lobo J.L., Del Ser J., Bifet A., Kasabov N.
Neural Networks (Scimago Q1, WoS Q1)
2020-01-01 citations by CoLab: 194
Applications that generate huge amounts of data in the form of fast streams are becoming increasingly prevalent, making it necessary to learn in an online manner. These conditions usually impose memory and processing time restrictions, and they often turn into evolving environments where a change may affect the input data distribution. Such a change causes predictive models trained over these stream data to become obsolete and fail to adapt suitably to new distributions. Especially in these non-stationary scenarios, there is a pressing need for new algorithms that adapt to these changes as fast as possible, while maintaining good performance scores. Unfortunately, most off-the-shelf classification models need to be retrained if they are used in changing environments, and fail to scale properly. Spiking Neural Networks have revealed themselves as one of the most successful approaches to model the behavior and learning potential of the brain, and to exploit this potential for practical online learning tasks. Besides, some specific flavors of Spiking Neural Networks can overcome the necessity of retraining after a drift occurs. This work intends to merge both fields by serving as a comprehensive overview, motivating further developments that embrace Spiking Neural Networks for online learning scenarios, and being a friendly entry point for non-experts.
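The building block behind Spiking Neural Networks, a leaky integrate-and-fire neuron, can be stepped with simple Euler updates. This is a generic textbook sketch with illustrative parameter values, not a model from the survey:

```python
def lif_step(v, i_in, tau=10.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    # One Euler step of a leaky integrate-and-fire neuron: the membrane
    # potential leaks toward zero, integrates the input current, and
    # emits a spike (then resets) when it crosses the threshold.
    v = v + dt * (-v / tau + i_in)
    spiked = v >= v_thresh
    return (v_reset if spiked else v), spiked

# Constant illustrative drive; count spikes over 20 steps.
v, spikes = 0.0, 0
for _ in range(20):
    v, s = lif_step(v, i_in=0.2)
    spikes += s
print(spikes)  # periodic spiking under constant drive
```

Because state is carried step by step, such neurons process streams natively, which is one reason the survey pairs them with online learning.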
Benítez-Hidalgo A., Nebro A.J., García-Nieto J., Oregi I., Del Ser J.
2019-12-01 citations by CoLab: 176
This paper describes jMetalPy, an object-oriented Python-based framework for multi-objective optimization with metaheuristic techniques. Building upon our experiences with the well-known jMetal framework, we have developed a new multi-objective optimization software platform aiming not only at replicating the former one in a different programming language, but also at taking advantage of the full feature set of Python, including its facilities for fast prototyping and the large amount of available libraries for data processing, data analysis, data visualization, and high-performance computing. As a result, jMetalPy provides an environment for solving multi-objective optimization problems focused not only on traditional metaheuristics, but also on techniques supporting preference articulation, constrained and dynamic problems, along with a rich set of features related to the automatic generation of statistical data from the results generated, as well as the real-time and interactive visualization of the Pareto front approximations produced by the algorithms. jMetalPy offers additionally support for parallel computing in multicore and cluster systems. We include some use cases to explore the main features of jMetalPy and to illustrate how to work with it.
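Rather than guess jMetalPy's exact API, the core object its algorithms approximate, a Pareto front of non-dominated solutions, can be sketched in plain Python (minimization assumed; the objective vectors are illustrative):

```python
def dominates(a, b):
    # a dominates b (minimization): no worse in every objective and
    # strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Non-dominated subset: the Pareto front approximation that
    # multi-objective metaheuristics like those in jMetalPy converge to.
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Illustrative 2-objective points (both objectives minimized).
points = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
print(pareto_front(points))  # [(1, 5), (2, 3), (4, 1)]
```

jMetalPy's visualization features plot exactly such fronts, interactively and in real time, as its algorithms run.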
Diez-Gutiérrez L., San Vicente L., R. Barrón L.J., Villarán M.D., Chávarri M.
Journal of Functional Foods (Scimago Q1, WoS Q2, Open Access)
2020-01-01 citations by CoLab: 174
Probiotics have attracted growing interest in recent decades due to their multiple health benefits. The synergistic relationship between probiotics and prebiotics can enhance the production of metabolites called postbiotics, which are gaining increasing importance because of their beneficial functions in the gastrointestinal tract and their influence on different organs and tissues. Notable among the postbiotics is gamma-aminobutyric acid, which plays an essential role in the prevention of neural disease, type 1 diabetes, cancer, immunological disorders and asthma. Generally, gamma-aminobutyric acid is produced by lactic acid bacteria, which under certain conditions can produce a high amount of this amino acid. The food industry has leveraged this capacity to develop functional foods enriched with gamma-aminobutyric acid.
Aurrekoetxea-Arratibel O., Otano-Aramendi N., Valencia-Caballero D., Vidaurrazaga I., Oregi X., Olano-Azkune X.
Fire (Scimago Q1, WoS Q1, Open Access)
2025-03-05 citations by CoLab: 0
Solar photovoltaic (PV) systems in buildings must comply with both electrotechnical standards for module safety and local building codes, which typically do not address their electrical nature. This regulatory gap creates challenges in assessing the fire performance of PV systems. This paper presents a procedure to adapt a common test method used in some building codes to assess external fire conditions for roofs, while maintaining operative PV modules. Two configurations were tested: an organic PV thin film on a metallic sandwich panel and a glass–glass-encapsulated organic PV module. The tests were conducted under high voltage and current conditions to simulate the systems’ behavior within a larger PV array. Significant electric arcs were observed during testing of the metallic sandwich panel configuration without glass protection when subjected to high voltages or currents. In these cases, total heat release increased by at least 30% compared to non-electrically loaded scenarios or glass-insulated PV modules, likely due to a greater damaged surface area. Electric arcs created new ignition sources, damaging whole PV modules, whereas in the case with no electrical load, the propagating flames advanced toward both the upper edge and the corners of the sample, ultimately damaging the entire triangular area above the fire source. The results indicate that the electrical characteristics of PV systems can significantly impact external fire spread behavior. The study identifies challenges in keeping the system electrically active during testing and in simulating real scenarios, and proposes future research directions.
Xing X., Shi F., Huang J., Wu Y., Nan Y., Zhang S., Fang Y., Roberts M., Schönlieb C., Del Ser J., Yang G.
Nature Machine Intelligence (Scimago Q1, WoS Q1)
2025-02-10 citations by CoLab: 1
Generative artificial intelligence (AI) technologies and large models are producing realistic outputs across various domains, such as images, text, speech and music. Creating these advanced generative models requires significant resources, particularly large and high-quality datasets. To minimize training expenses, many algorithm developers use data created by the models themselves as a cost-effective training solution. However, not all synthetic data effectively improve model performance, necessitating a strategic balance in the use of real versus synthetic data to optimize outcomes. Currently, the previously well-controlled integration of real and synthetic data is becoming uncontrollable. The widespread and unregulated dissemination of synthetic data online leads to the contamination of datasets traditionally compiled through web scraping, now mixed with unlabelled synthetic data. This trend, known as the AI autophagy phenomenon, suggests a future where generative AI systems may increasingly consume their own outputs without discernment, raising concerns about model performance, reliability and ethical implications. What will happen if generative AI continuously consumes itself without discernment? What measures can we take to mitigate the potential adverse effects? To address these research questions, this Perspective examines the existing literature, delving into the consequences of AI autophagy, analysing the associated risks and exploring strategies to mitigate its impact. Our aim is to provide a comprehensive perspective on this phenomenon advocating for a balanced approach that promotes the sustainable development of generative AI technologies in the era of large models. With widespread generation and availability of synthetic data, AI systems are increasingly trained on their own outputs, leading to various technical and ethical challenges. 
The authors analyse this development and discuss measures to mitigate the potential adverse effects of ‘AI eating itself’.
del-Tejo-Catala O., Perez J., Garcia N., Perez-Cortes J., Del Ser J.
Expert Systems (Scimago Q2, WoS Q2)
2025-02-05 citations by CoLab: 0
Anomaly detection is a crucial task in computer vision, with applications ranging from quality control to security monitoring, among many others. Recent technological advancements have enabled near-perfect solutions on benchmark datasets like MVTec, raising the need for novel datasets that pose new challenges for this modelling task. This work presents a novel Wood Anomaly Detection (WoodAD) dataset, which includes defects in wooden pieces that result in challenges for the most advanced techniques applied to other established datasets. This article evaluates such challenges posed by WoodAD with one-class and few-shot supervised learning approaches. Our experiments herein reveal that EfficientAD, a state-of-the-art method previously excelling on the MVTec dataset, outperforms all other one-class learning approaches. Nevertheless, there is room for improvement, as EfficientAD achieves a 0.535 pixel/segmentation average precision (AP) over the complete test set. UNet, a well-known pixel-level classification architecture, leveraged few-shot supervised learning to enhance the pixel AP score, achieving 0.862 pixel/segmentation AP over the entire test set. Our WoodAD dataset represents a valuable contribution to the field of anomaly detection, offering complex image textures and challenging defects. Researchers and practitioners are encouraged to leverage this dataset to push the boundaries of anomaly detection and develop more robust and effective solutions for more complex real-world applications. The WoodAD dataset has been made publicly available on Kaggle (https://www.kaggle.com/datasets/itiresearch/wood-anomaly-detection-one-class-classification).
Fabiani C., Erkizia E., Snoeck D., Rajczakowska M., Tole I., Ribeiro R.R., Azenha M., Caggiano A., Pisello A.L.
2025-02-05 citations by CoLab: 0
In recent years, substantial progress has been achieved in the development of multifunctional cement-based composites, targeting improved energy efficiency and environmental sustainability while minimizing material depletion. Leveraging the high thermal capacity of these materials facilitates controlled heat storage and release, providing versatile applications in renewable energy management and heat regulation, influencing structural integrity and long-term resistance. Recent research has integrated phase change materials (PCMs) into these composites to harness their superior thermal energy density. This comprehensive review examines the latest experimental research findings on these hybrid materials, emphasizing their thermo-physical behaviour and influence on structural properties and durability. Furthermore, it provides an overview of PCM characteristics and their integration into cement-based matrices. It critically analyses the interaction between PCMs and the cement matrix, explaining effects on structural performance, hydration processes, and freeze–thaw mechanisms. Furthermore, the paper explores recent experimental techniques and protocols for measuring and assessing the structural and thermo-physical properties of these composites. By identifying key trends, the review aims to provide valuable insights into the design and optimization of cement-based composites with PCMs, ultimately enhancing energy efficiency and resource conservation.
Molina D., Poyatos J., Ser J.D., García S., Ishibuchi H., Triguero I., Xue B., Yao X., Herrera F.
2025-01-30 citations by CoLab: 1
Nadeem M., Sohail S.S., Madsen D.Ø., Alzahrani A.A., Ser J.D., Muhammad K.
IEEE Transactions on Big Data (Scimago Q1, WoS Q1)
2025-01-30 citations by CoLab: 0
Bidarte I., Galache J.M., Mellado I.
2025-01-27 citations by CoLab: 0
Bartolomé J., Garaizar P., Loizaga E., Bastida L.
Applied Sciences (Switzerland) (Scimago Q2, WoS Q2, Open Access)
2025-01-24 citations by CoLab: 0
Background: When measuring complex cognitive constructs, it is crucial to correctly design the evaluation items in order to trigger the intended knowledge and skills. Furthermore, assessing the validity of an assessment requires considering not only the content of the evaluation tasks, but also how examinees perform by engaging construct-relevant response processes. Objectives: We used eye-tracking techniques to examine item response processes in the assessment of digital competence. The eye-tracking observations helped to fill an ‘explanatory gap’ by providing data on the variation in response processes that cannot be captured by other common sources. Method: Specifically, we used eye movement data to validate the inferences made between claimed and observed behavior. This allowed us to interpret how participants processed the information in the items in terms of Areas Of Interest (their size, placement, and order). Results and Conclusions: The gaze data provide detailed information about response strategies at the item level, profiling the examinees according to their engagement, response processes and performance/success rate. The presented evidence confirms that the response patterns of the participants who responded well do not represent an alternative interpretation of the results that would undermine the assessment criteria. Takeaways: Gaze-based evidence has great potential to provide complementary data about the response processes performed by examinees, thereby contributing to the validity argument.
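The basic Area-Of-Interest hit-test behind such gaze measures can be sketched as follows. The coordinates and AOI geometry below are hypothetical, not values from the study:

```python
def aoi_hits(gaze_points, aoi):
    # Count gaze samples inside a rectangular Area Of Interest given as
    # (x, y, width, height) in screen coordinates -- the basic hit-test
    # behind AOI measures such as dwell counts and visit order.
    x, y, w, h = aoi
    return sum(1 for gx, gy in gaze_points if x <= gx <= x + w and y <= gy <= y + h)

# Hypothetical gaze samples (pixels) and a 200x100 px AOI anchored at (100, 50).
gaze = [(120, 60), (90, 55), (250, 140), (305, 150), (150, 100)]
item_stem = (100, 50, 200, 100)
print(aoi_hits(gaze, item_stem))  # 3 samples fall inside the AOI
```

Comparing such counts across AOIs (item stem, options, instructions) is what lets response strategies be profiled at the item level.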
Perez-Basante A., de Muro A.G., Ordono A., Ceballos S., Unamuno E., Barrena J.A.
2025-01-23 citations by CoLab: 0
Herranz-Pascual K., Anchustegui P., Cantergiani C., Iraurgi I.
Land (Scimago Q1, WoS Q2, Open Access)
2025-01-20 citations by CoLab: 0
In recent years, nature-based solutions have been used in urban regeneration interventions to improve the adaptation and resilience of these places, contributing to improved environmental quality and cultural ecosystem functions, including people’s physiological, social, and mental health and wellbeing. However, when it comes to the assessment of psychological wellbeing and social benefits (psychosocial co-benefits), the existing evidence is still limited. To contribute to the advancement of knowledge on nature’s contribution to people in relation to this type of benefit, it is necessary for us to develop and test assessment tools to contribute to the development of a robust nature-based solutions monitoring framework. In this paper, the second phase of the validation of a psychosocial co-benefit assessment tool for nature-based urban interventions is presented. This tool is structured around two dimensions: the perceived health and wellbeing and social co-benefits. The first validation was carried out with experts using the Delphi method. The second validation presented in this paper was based on a sample of users, evaluating a set of eight urban spaces at different levels of naturalisation and openness. The results indicate that the tool is sensitive to the differences in naturalisation and openness in the public urban places analysed. The most relevant contextual variables to explain the psychosocial co-benefits are openness, the surfaces covered by tree branches, the water surface area, and naturalisation.
Martínez-López J., Fernández-Gamiz U., Sánchez-Díez E., Beloki-Arrondo A., Ortega-Fernández Í.
Batteries (Scimago Q2, WoS Q2, Open Access)
2025-01-16 citations by CoLab: 0
This study examines the impact of incorporating obstacles in the electrode structure of an organic redox flow battery with a flow-through configuration. Two configurations were compared: a control case without obstacles (Case 1) and a modified design with obstacles to enhance mass transport and uniformity (Case 2). While Case 1 exhibited marginally higher discharge voltages (average difference of 0.18%) due to reduced hydraulic resistance and lower Ohmic losses, Case 2 demonstrated significant improvements in concentration uniformity, particularly at low state-of-charge (SOC) levels. The obstacle design mitigated local depletion of active species, thereby enhancing limiting current density and improving minimum concentration values across the studied SOC range. However, the introduction of obstacles increased flow resistance and pressure drops, indicating a trade-off between electrochemical performance and pumping energy requirements. Notably, Case 2 performed better at lower flow rates, showcasing its potential to optimize efficiency under varying operating conditions. At higher flow rates, the advantages of Case 2 diminished but remained evident, with better concentration uniformity, higher minimum concentration values, and a 1% average increase in limiting current density. Future research should focus on optimizing obstacle geometry and positioning to further enhance performance.
Torres-Barriuso J., Lasarte N., Piñero I., Roji E., Elguezabal P.
Buildings (Scimago Q1, WoS Q2, Open Access)
2025-01-15 citations by CoLab: 0
Industrial buildings are a key element in the industrial fabric, and their maintenance is essential to ensure their proper functioning and avoid disruptions and costly economic losses. Continuous maintenance based on an accurate diagnosis makes it possible to meet the challenges of aging infrastructures, which demands a reliable data-based assessment for maintenance management implementing corrective and preventive actions, according to the damage criticality. This paper investigates an innovative digitalized process for the inspection and diagnosis of industrial buildings, which leads to categorizing and prioritizing maintenance actions in an objective and cost-effective way from the inspection data. The process integrates some technical developments carried out in this work, aimed at automating the workflow: the drone-based inspection, the building condition assessment from the definition of a standardized construction pathology library, and a visual analysis of pathology evolution based on photogrammetry. The use of drones for digitalized inspection involves some challenges related to the positioning of the drone for damage localization, which has been overcome here by developing a geo-annotation system for image acquisition. This system has also enabled the capture of geo-located images intended to generate 3D photogrammetric models for quantifying the pathological process evolution. Moreover, the assessment procedure outlined through the multi-criteria decision-making methodology MIVES establishes a single criterion to automatically weight the relative importance of the damage defined in the library. As a result, this procedure yields the so-called Intervention Urgency Index (IUI), which allows prioritizing the maintenance actions associated with the damage while also considering economic criteria.
In such a way, the overall process aims to increase reliability and consistency in the results of inspection and diagnosis needed for the effective maintenance management of industrial buildings.
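As a loose illustration only (a plain weighted average, not the actual MIVES value functions or weights from the paper), an urgency-style index could aggregate per-damage scores like this:

```python
def urgency_index(damage_scores, weights):
    # Illustrative weighted aggregation: severity scores in [0, 1] per
    # damage type are combined with normalized relative weights into a
    # single prioritization index. NOT the paper's actual MIVES model.
    total = sum(weights)
    return sum(s * w for s, w in zip(damage_scores, weights)) / total

# Hypothetical damage severities and relative importance weights.
print(urgency_index([0.8, 0.3, 0.5], [3, 1, 2]))  # roughly 0.617
```

Ranking buildings or elements by such an index is what turns inspection data into a maintenance priority list.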
Lorenzo L., Pitacco W., Mattar N., Faye I., Maestro B., Ortiz P.
2025-01-14 citations by CoLab: 0 Abstract  
Lignin-derived polyols have been synthesized and scaled up to industrially relevant settings (5 L reactors). They have been used as a partial replacement of polyols in polyurethane dispersions for wood coatings.
Zubia G., Zubia J., Amorebieta J., Aldabaldetreku G., Zubia A., Durana G.
Sensors scimago Q1 wos Q2 Open Access
2025-01-12 citations by CoLab: 0 PDF Abstract  
Optical Fiber Displacement Sensors (OFDSs) provide several advantages over conventional sensors, including their compact size, flexibility, and immunity to electromagnetic interference. These features make OFDSs ideal for use in confined spaces, such as turbines, where direct laser access is impossible. A critical aspect of OFDS performance is the geometry of the fiber bundle, which influences key parameters such as sensitivity, range, and dead zones. In this work, we present a streamlined design methodology for azimuthally symmetric OFDSs to improve the linear range of these sensors. The most effective configuration we propose is the pentafurcated bundle, which consists of a central transmitting fiber surrounded by four concentric rings of fibers with different radii. Our experimental results show that the pentafurcated designs increase both the range (up to 10.5 mm) and the sensitivity of the sensor (2 mm⁻¹) while minimizing its dead zone (2.5 mm), allowing accurate measurements even at very short distances.
Xiong M., Chen H., Karaca Y., Jilani Saudagar A.K., Lee I.H., Del Ser J., Muhammad K.
Fractals scimago Q1 wos Q1
2025-01-10 citations by CoLab: 0 Abstract  
Person Re-identification (person Re-ID), an intelligent video surveillance technology capable of retrieving the same person from different cameras, faces challenges arising from changes in a person's pose, different camera views, and occlusion. Recently, person Re-ID equipped with the attention mechanism has gradually emerged as one of the most active areas of study in the fields of computer vision and fractal feature modeling applications. Despite the upsurge in related research, existing fractal-attention-based methods still face two major challenges when recognizing different pedestrians in unpredictable realistic environments: (1) the adaptability of a single local attention feature to hostile scenes cannot be guaranteed, and (2) existing methods based on attention features usually rely on linear mappings or simple variants, which make it difficult to uncover the association relationships among pedestrians with similar appearance attributes. To address these issues, this paper proposes a simple, effective fractal feature modeling method, named the multi-dimensional attention and spatial adaptive relationship learning framework (MASARF), to explore the correlation between pedestrian bodies for person Re-ID. The proposed framework encompasses a multi-dimensional fractal-attention feature learning model (MDAM) and a dual-branch graph convolutional model (DGCM). In particular, the MDAM comprises local and global attention modules, which are used to capture multi-dimensional attention features for each person. Subsequently, the DGCM is used to construct the nonlinear mapping association relationships among the various body regions for each person via a dual-branch graph convolutional optimization strategy. Extensive experiments were conducted using public person Re-ID datasets (Market-1501, DukeMTMC-reID, and CUHK-03).
The results demonstrate that the performance of the proposed approach is superior to that of state-of-the-art methods by between 2% and 10% at Rank-1 (mAP). Essential differences exist between our method and existing methods in terms of feature extraction and relationship transformation, which validates its novelty in the person Re-ID domain.

Since 1996

Total publications
3514
Total citations
103613
Citations per publication
29.49
Average publications per year
121.17
Average authors per publication
6.4
h-index
129
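The derived metrics above follow directly from the raw totals. As an illustration, the sketch below recomputes citations per publication from the listed totals and shows the standard h-index definition (the largest h such that h publications have at least h citations each) on a small hypothetical citation list; the five-element list is an invented example, not Tecnalia data.

```python
def h_index(citations):
    """h-index: largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Totals as listed on this profile page.
total_publications = 3514
total_citations = 103613
print(round(total_citations / total_publications, 2))  # 29.49

# Hypothetical citation counts for five papers.
print(h_index([10, 8, 5, 4, 3]))  # 4
```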

Top-30

Fields of science

Aquatic Science, 340, 9.68%
Electrical and Electronic Engineering, 333, 9.48%
General Materials Science, 303, 8.62%
General Medicine, 250, 7.11%
Renewable Energy, Sustainability and the Environment, 222, 6.32%
Ecology, Evolution, Behavior and Systematics, 209, 5.95%
Oceanography, 196, 5.58%
Mechanical Engineering, 194, 5.52%
Computer Science Applications, 182, 5.18%
Ecology, 178, 5.07%
Condensed Matter Physics, 161, 4.58%
General Engineering, 140, 3.98%
Mechanics of Materials, 139, 3.96%
General Chemistry, 133, 3.78%
Pollution, 132, 3.76%
Software, 129, 3.67%
Industrial and Manufacturing Engineering, 126, 3.59%
Building and Construction, 126, 3.59%
Energy Engineering and Power Technology, 123, 3.5%
Civil and Structural Engineering, 121, 3.44%
Materials Chemistry, 119, 3.39%
Management, Monitoring, Policy and Law, 113, 3.22%
Control and Systems Engineering, 104, 2.96%
Surfaces, Coatings and Films, 101, 2.87%
Instrumentation, 97, 2.76%
Metals and Alloys, 91, 2.59%
Environmental Engineering, 88, 2.5%
Environmental Chemistry, 87, 2.48%
Multidisciplinary, 83, 2.36%
Biochemistry, 82, 2.33%

Journals


Publishers


With other organizations


With foreign organizations


With other countries

United Kingdom, 387, 11.01%
Italy, 383, 10.9%
France, 349, 9.93%
Germany, 320, 9.11%
USA, 254, 7.23%
Netherlands, 197, 5.61%
Portugal, 150, 4.27%
Denmark, 123, 3.5%
Greece, 122, 3.47%
Australia, 120, 3.41%
Belgium, 119, 3.39%
Norway, 112, 3.19%
Sweden, 110, 3.13%
Finland, 82, 2.33%
Austria, 74, 2.11%
Canada, 70, 1.99%
Switzerland, 58, 1.65%
Ireland, 55, 1.57%
China, 50, 1.42%
Republic of Korea, 49, 1.39%
Slovenia, 47, 1.34%
Japan, 42, 1.2%
Brazil, 40, 1.14%
Singapore, 38, 1.08%
Poland, 33, 0.94%
Serbia, 33, 0.94%
Mexico, 32, 0.91%
Saudi Arabia, 31, 0.88%
New Zealand, 29, 0.83%
  • Publications without a DOI are not taken into account.
  • Statistics are recalculated daily.
  • Publications published before 1996 are ignored in the statistics.
  • The horizontal charts show the top 30 positions.
  • Journal quartile values are current as of today.