Vol. 3 (2024)

Authors in this issue:

G. Meenalochini, D. Amutha Guka, Ramkumar Sivasakthivel, Manikandan Rajagopal; Rafael Thomas-Acaro, Brian Meneses-Claudio; Anali Alvarado-Acosta, Jesús Fernández-Saavedra, Brian Meneses-Claudio; Renzo Huapaya-Ruiz, Brian Meneses-Claudio; Lida Vásquez-Pajuelo, Jhonny Richard Rodriguez-Barboza, Karina Raquel Bartra-Rivero, Edgar Antonio Quintanilla-Alarcón, Wilfredo Vega-Jaime, Eduardo Francisco Chavarri-Joo; K. Prathap Kumar, R. Rohini; Luz Castillo-Cordero, Milagros Contreras-Chihuán, Brian Meneses-Claudio; Lucía Asencios-Trujillo, Djamila Gallegos-Espinoza, Lida Asencios-Trujillo, Livia Piñas-Rivera, Carlos LaRosa-Longobardi, Rosa Perez-Siguas; Gilberto Murillo González, German Martínez Prats, Verónica Vázquez Vidal; Ismail Ezzerrifi Amrani, Ahmed Lahjouji El Idrissi, Abdelkhalek BAHRI, Ahmad El ALLAOUI; Oumaima El Haddadi, Max Chevalier, Bernard Dousset, Ahmad El Allaoui, Anass El Haddadi, Olivier Teste

Published: February 8, 2024

Contents

2024-02-07 Original
A Progressive UNDML Framework Model for Breast Cancer Diagnosis and Classification

Recent research indicates that breast cancer is the second most common cause of death for women worldwide. Since it can be extremely difficult to determine the true cause of breast cancer, early diagnosis is crucial to lowering the disease's fatality rate. Early cancer detection raises the chance of survival by up to 8%. Radiologists look for irregularities in breast images collected from mammograms, X-rays, or MRI scans. Radiologists of all levels struggle to identify features like lumps, masses, and micro-calcifications, which leads to high false-positive and false-negative rates. Recent developments in deep learning and image processing give rise to some optimism for improved applications for the early diagnosis of breast cancer. A methodological study was carried out in which a new Deep U-Net Segmentation based Convolutional Neural Network, named the UNDML framework, was developed for identifying and categorizing breast anomalies. This framework involves the operations of preprocessing, quality enhancement, feature extraction, segmentation, and classification. Preprocessing is carried out to enhance the quality of the input breast image. Then, the Deep U-Net segmentation methodology is applied to accurately segment the breast image and improve the cancer detection rate. Finally, the CNN mechanism is used to categorize the class of breast cancer. To validate the performance of this method, an extensive simulation and comparative analysis have been performed in this work. The obtained results demonstrate that the UNDML mechanism outperforms the other models with an increased tumor detection rate and accuracy.
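
The UNDML code itself is not part of this abstract; as an illustration of the segment-then-classify pipeline it describes (preprocessing, U-Net-style segmentation, CNN classification), the following PyTorch sketch is a minimal, hypothetical stand-in, not the authors' implementation.

```python
# Minimal sketch (not the authors' UNDML code): a tiny U-Net-style segmenter
# followed by a CNN classifier, mirroring the segment-then-classify pipeline.
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(),
    )

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 16)
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec = conv_block(32 + 16, 16)
        self.out = nn.Conv2d(16, 1, 1)           # 1-channel lesion mask

    def forward(self, x):
        e1 = self.enc1(x)                        # skip-connection source
        e2 = self.enc2(self.pool(e1))
        d = self.dec(torch.cat([self.up(e2), e1], dim=1))
        return torch.sigmoid(self.out(d))

class TinyClassifier(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(conv_block(2, 16), nn.AdaptiveAvgPool2d(1))
        self.fc = nn.Linear(16, n_classes)

    def forward(self, image, mask):
        # Classify from the image concatenated with its predicted mask.
        h = self.features(torch.cat([image, mask], dim=1)).flatten(1)
        return self.fc(h)

x = torch.rand(4, 1, 128, 128)                   # preprocessed breast-image patches
mask = TinyUNet()(x)
logits = TinyClassifier()(x, mask)               # class scores per patch
print(mask.shape, logits.shape)
```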


By G. Meenalochini, D. Amutha Guka, Ramkumar Sivasakthivel, Manikandan Rajagopal

2024-01-10 Original
Technological assistance in highly competitive sports for referee decision making: A systematic literature review.

Introduction: During the last decade, it has become evident that a referee's decision in professional sports can be a turning point in the outcome of a competition, often generating discomfort among fans and competitors. For this reason, technological assistants were introduced in sports to help with referee decision making.
Objective: Review and analyze the technological solutions based on artificial intelligence techniques that can serve as technological assistants in support of referee decision-making in highly competitive professional sports.
Method: The PICO methodology was used for the selection process of scientific publications, following the PRISMA declaration. Twenty-one scientific publications that complied with the proposed guidelines were retrieved from the SCOPUS database and were reviewed and analyzed to obtain information with added value.
Results: The proposed technological assistants were found to reach a precision level greater than 90% in certain sports. Limitations that reduce the operational quality of these solutions were also identified, as were the artificial intelligence algorithms, models, methods, and approaches most used and recommended for future research studies.
Conclusions: In conclusion, the implementation of technological assistants based on artificial intelligence in referee decision making in professional sports has proven to be an effective tool, achieving significant levels of precision.

By Rafael Thomas-Acaro, Brian Meneses-Claudio

2024-02-10 Original
Transformation and digital challenges in Peru during the COVID-19 pandemic, in the educational sector between 2020 and 2023: Systematic Review

Introduction: Digital transformation in the Peruvian educational sector has experienced a significant boost after facing the COVID-19 pandemic. During the period between 2020 and 2023, various innovative methods have been implemented to ensure the continuity of the academic year.
Objective: Explain how the digital transformation was carried out in the Peruvian educational sector from the onset of the COVID-19 pandemic to the present (2020 – 2023).
Method: Examples from many institutions, statistical studies, and scientific and technological references were taken into account to achieve the objective. Throughout this work we analyze the different and innovative methods used by teachers to provide continuity to the academic year and how digital challenges were overcome.
Results: 78 documents from Scopus and Scielo were reviewed, leaving 62 after filtering. These cover 8 categories on the impact of the pandemic on education, the transition to online teaching, job skills, challenges and advantages of virtual education, innovation in higher education, educational evaluation in virtual environments, educational internationalization and challenges for teachers during the COVID-19 pandemic.
Conclusions: In conclusion, the digital transformation in the Peruvian educational sector after the COVID-19 pandemic has been fundamental to guarantee the continuity of the teaching-learning process.

By Anali Alvarado-Acosta, Jesús Fernández-Saavedra, Brian Meneses-Claudio

2024-01-08 Original
Applicable methodologies for business continuity management in IT services: A systematic literature review

Introduction: Currently, information technologies have one characteristic in common: their volatility. This is why it is important that companies have methodologies that allow adequate management of the continuity of the services offered through them.
Objective: In this sense, the purpose of this systematic literature review is to identify the most appropriate methodologies that can be implemented in companies to deal with these unforeseen interruptions.
Method: With a study based on a PICO question, the search for relevant literature in a scientific database was proposed using a search equation based on keywords.
Results: The studies offer qualitative results that mainly help reduce response times to incidents of unforeseen interruptions. Among the most notable findings, the proposed systems help increase the success rate of recovery procedures by 80% and make it possible to identify and apply integration technologies that improve business continuity systems, among others. However, there is a knowledge gap, and the implementation of these methods is suggested for future proposals in order to achieve quantitative results that can be presented through metrics.
Conclusions: In conclusion, this systematic literature review analyzed and compared the methodologies proposed by the authors and the results achieved in each of them, finding that 69% of the articles attribute the origin of the interruptions to logical failures, 75% of the studies indicate that business continuity plans mostly have a preventive focus, and 44% suggest continuous testing of the plans to ensure their effectiveness.

By Renzo Huapaya-Ruiz, Brian Meneses-Claudio

2024-02-09 Original
Digital Challenges: The Need to Improve the Use of Information Technologies in Teaching

In the post-pandemic scenario, a study was conducted at I.E. 50499 Justo Barrionuevo Álvarez in Cusco, Peru, to investigate the relationship between the use of information technologies and digital competencies among teachers. With a sample of 54 teachers, a structured questionnaire was administered to assess their competencies. The results revealed a direct positive correlation between the use of technologies and digital competencies, with a Spearman's Rho coefficient of 0.877, indicating a significant relationship. Correlations between the use of information technologies and the dimensions of digital competencies ranged from moderate to high. Significant correlations were observed in areas such as problem-solving (Rho=0.457), information and digital literacy (Rho=0.633), and security (Rho=0.743), among others. These findings suggest that, despite the teachers' limited experience and knowledge of digital technologies, there is a notable relationship between the use of these technologies and their digital competencies. The study underscores the need for further training in information technologies for teachers in non-modernized urban contexts and for older teachers with limited prior experience in the digital domain. Enhancing digital competencies is crucial for adapting to the educational challenges of this new era of education.
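
For readers unfamiliar with the statistic reported above, the short sketch below shows how a Spearman's Rho of this kind is computed from paired questionnaire scores with SciPy; the score values are invented for illustration and are not the study's data.

```python
# Illustrative only: computing Spearman's Rho between technology-use scores
# and digital-competency scores for a small, made-up sample of teachers.
from scipy.stats import spearmanr

tech_use   = [12, 18, 25, 31, 22, 27, 15, 29, 20, 33]    # questionnaire totals
competency = [14, 20, 27, 35, 21, 30, 13, 31, 22, 36]

rho, p_value = spearmanr(tech_use, competency)
print(f"Spearman's Rho = {rho:.3f}, p = {p_value:.4f}")  # rank-based correlation
```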

By Lida Vásquez-Pajuelo, Jhonny Richard Rodriguez-Barboza, Karina Raquel Bartra-Rivero, Edgar Antonio Quintanilla-Alarcón, Wilfredo Vega-Jaime, Eduardo Francisco Chavarri-Joo

2024-02-08 Original
Resource allocation on priority-based scheduling and improved security using DSSHA-256

Cloud computing has gained popularity with advancements in virtualization technology and the deployment of 5G. However, scheduling workloads in a heterogeneous multi-cloud environment is a complicated process. Users of cloud services want to ensure that their data is secure and private, especially sensitive or proprietary information. Several research works have been proposed to solve the challenges associated with cloud computing. The proposed Adaptive Priority-Based Scheduling (PBS) focuses on reducing data access completion time and computation expense for task scheduling in cloud computing. PBS assigns tasks depending on their size and selects the minimum-cost path for data access. It contains task register, scheduler, and task execution components for efficient task execution. The proposed system also applies a double signature mechanism for data privacy and security in data storage. This study compares the performance of three algorithms, PBS, Task Requirement Degree (TRD), and Risk-Adaptive Access Control (RADAC), in terms of task execution time and makespan time. The experimental results demonstrate that PBS outperforms TRD and RADAC in both metrics as the number of tasks increases. PBS has a minimum task execution time and a lower makespan time than the other two algorithms.
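
The abstract does not spell out PBS or DSSHA-256 in detail; the following Python sketch is a hypothetical illustration of the two ideas it names, ordering tasks by size, choosing the cheapest data-access path, and protecting stored data with a double SHA-256 digest, and should not be read as the authors' implementation.

```python
# Hypothetical sketch: priority scheduling by task size with minimum-cost path
# selection, and a double SHA-256 digest standing in for the double-signature step.
import hashlib
import heapq

def schedule(tasks):
    """tasks: list of (task_id, size, {path_name: cost}) tuples."""
    queue = [(size, task_id, paths) for task_id, size, paths in tasks]
    heapq.heapify(queue)                     # smallest tasks first
    plan = []
    while queue:
        size, task_id, paths = heapq.heappop(queue)
        path = min(paths, key=paths.get)     # minimum-cost data-access path
        plan.append((task_id, path, paths[path]))
    return plan

def double_sha256(data: bytes) -> str:
    # Two chained SHA-256 rounds over the stored payload.
    return hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

tasks = [("t1", 40, {"p1": 5, "p2": 3}),
         ("t2", 10, {"p1": 2, "p2": 6}),
         ("t3", 25, {"p1": 4, "p2": 4})]
print(schedule(tasks))
print(double_sha256(b"sensitive tenant data"))
```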


By K. Prathap Kumar, R. Rohini

2024-01-11 Original
Datamart for the analysis of information in the sales process of the company WC HVAC Engineering

Introduction: Information has become a crucial asset for companies in decision making and performance evaluation. Information technologies, such as Business Intelligence, allow data to be converted into relevant information. The implementation of a Datamart, a specialized database, stands out as a solution to analyze specific data from a business area.
Objective: The main objective is to determine how the implementation of a Datamart affects data analysis in the sales area of the company.
Method: A bibliographic review of various sources was carried out using the PICO keywords. In addition, filters were applied to limit the search to relevant articles published in the last 5 years in Spanish or English. Then, 31 relevant documents that highlighted the implementation of Datamarts in the sales area were evaluated.
Results: Predominant Datamart development methods were identified, such as the Kimball and Hefesto methodologies. Likewise, effectiveness was measured through indicators such as processing time, report generation, user satisfaction and availability of information.
Conclusions: In conclusion, a well-implemented Datamart can be a key tool to improve data management and analysis in the sales area of a company.
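
As a companion to this review, the sketch below shows the kind of star-schema aggregation a sales Datamart typically serves, written with pandas and invented table and column names; it is illustrative only and is not taken from any of the reviewed implementations.

```python
# Illustrative star-schema style aggregation for a sales Datamart.
# Table and column names are invented for the example.
import pandas as pd

fact_sales = pd.DataFrame({
    "date_id":    ["2023-01", "2023-01", "2023-02", "2023-02"],
    "product_id": [1, 2, 1, 2],
    "amount":     [1200.0, 800.0, 1500.0, 950.0],
})
dim_product = pd.DataFrame({
    "product_id": [1, 2],
    "category":   ["HVAC units", "Spare parts"],
})

# Join the fact table to its dimension and aggregate, as a reporting query would.
report = (fact_sales.merge(dim_product, on="product_id")
          .groupby(["date_id", "category"], as_index=False)["amount"].sum())
print(report)
```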

By Luz Castillo-Cordero, Milagros Contreras-Chihuán, Brian Meneses-Claudio

2024-01-31 Original
Automatic Mobile Learning System for the Constant Preparation of the Student Community

Introduction: the events that occurred with the pandemic caused a drastic change in all activities involving direct contact because of the high risk of contagion. Educational centers were affected by closure measures and the imposition of virtual classes to continue student preparation, leading many students to need a computer to take their classes and, eventually, to show boredom from the lack of desire to sit in front of a computer, which to a certain extent weakens their interest and affects their learning. Meanwhile, mobile devices have become more important thanks to the various applications that provide students with information. For this reason, we propose mobile learning that allows students to access more information, as well as to interact with other students, so that they have the opportunity to learn on a constant basis.

Objective: the objective is to create an automatic mobile learning system for the constant preparation of the student community.

Method: a methodology based on a client-server model is used to take advantage of the various educational resources, together with the good support it provides for the subjects students study through their interaction with a mobile application.

Results: through the operation of the system, the tests carried out with the students showed an efficiency of 96.70%.

Conclusions: this system presents high efficiency, making it possible to reinforce the subjects that need more attention in the student's learning and to progress in level through the teacher's evaluations.

By Lucía Asencios-Trujillo, Djamila Gallegos-Espinoza, Lida Asencios-Trujillo, Livia Piñas-Rivera, Carlos LaRosa-Longobardi, Rosa Perez-Siguas

2023-12-29 Original
Technological disinformation: factors and causes of cybernaut identity theft in the digital world

The contribution of technology to the development of our daily activities has taken a giant step, deepening the citizen-technology-society dependence as the Internet has been integrated without any visible border. It is therefore necessary to safeguard personal information if one has an active digital life. Identifying the factors and causes that lead to identity theft is a requirement for the technical and operational literacy of citizens, who are easy victims. This article aims to analyze some causes and factors of identity theft among citizens of the central municipality of the State of Tabasco. A quantitative instrument was designed and applied via the Internet to a population of 3,158. The results show that citizens are unaware of several aspects of security in the digital services environment and, depending on gender, age and level of education, are captive in some scenario of digital insecurity.

By Gilberto Murillo González, German Martínez Prats, Verónica Vázquez Vidal

2024-02-08 Original
A dragonfly algorithm for solving the Fixed Charge Transportation Problem (FCTP)

This article is dedicated to a thorough investigation of the Fixed Charge Transportation Problem (FCTP) and to proposing a highly efficient resolution method, with a specific emphasis on obtaining optimal transportation plans within practical time constraints. The FCTP, recognized for its intricate nature, is NP-complete, and its solution time grows exponentially as the problem size increases. Within combinatorial optimization, metaheuristic techniques such as the Dragonfly algorithm and genetic algorithms have garnered substantial acclaim for their capacity to deliver high-quality solutions to the challenging FCTP, and they show substantial potential for accelerating its resolution. The central goal is to explore new solutions for the FCTP while minimizing the time required to attain them. This requires the adept use of the Dragonfly algorithm, a nature-inspired algorithm known for its adaptability and robustness in solving complex problems. As an optimization problem, the FCTP consists of formulating distribution plans for products originating from multiple sources and destined for various endpoints, with the aim of minimizing overall transportation costs while accounting for product availability at source locations and demand at destination points. The proposed methodology introduces an approach tailored explicitly to the FCTP that harnesses the capabilities of the Dragonfly algorithm; the adaptation of the algorithm's underlying processes is engineered to handle large-scale FCTP instances, with the objective of uncovering solutions that have so far remained elusive. The numerical results of our experiments underscore the ability of the Dragonfly algorithm to discover novel and efficient solutions, reaffirming its effectiveness in overcoming the challenges posed by large FCTP instances. In summary, this research advances FCTP solution methodologies by integrating the Dragonfly algorithm into the problem-solving process. The insights and solutions presented in this article hold promise for improving the efficiency and effectiveness of FCTP resolution, benefiting a broad range of industries and logistics systems and advancing the optimization of transportation processes.
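
The abstract does not include the model itself; as a reference point, the sketch below encodes the standard FCTP objective (variable shipping cost plus a fixed charge on every route that is used) and evaluates it for one candidate shipment plan. It is a hypothetical illustration of the problem being optimized, not the authors' Dragonfly implementation.

```python
# Hypothetical illustration of the FCTP objective described in the abstract:
# total cost = sum(unit_cost * shipped quantity) + fixed charge on used routes.
import numpy as np

def fctp_cost(x, unit_cost, fixed_charge):
    """x[i, j]: quantity shipped from source i to destination j."""
    return float(np.sum(unit_cost * x) + np.sum(fixed_charge[x > 0]))

def is_feasible(x, supply, demand):
    return np.allclose(x.sum(axis=1), supply) and np.allclose(x.sum(axis=0), demand)

supply = np.array([30.0, 20.0])
demand = np.array([25.0, 15.0, 10.0])
unit_cost = np.array([[4.0, 6.0, 9.0],
                      [5.0, 3.0, 7.0]])
fixed_charge = np.array([[10.0, 30.0, 20.0],
                         [25.0, 15.0, 10.0]])

# One candidate plan; a metaheuristic such as the Dragonfly algorithm would
# search over many such plans to minimize fctp_cost.
x = np.array([[25.0, 5.0, 0.0],
              [0.0, 10.0, 10.0]])
print(is_feasible(x, supply, demand), fctp_cost(x, unit_cost, fixed_charge))
```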

By Ismail Ezzerrifi Amrani, Ahmed Lahjouji El Idrissi, Abdelkhalek BAHRI, Ahmad El ALLAOUI

2024-02-08 Reviews
Overview on Data Ingestion and Schema Matching

This overview traced the evolution of data management, transitioning from traditional ETL processes to addressing contemporary challenges in Big Data, with a particular emphasis on data ingestion and schema matching. It explored the classification of data ingestion into batch, real-time, and hybrid processing, underscoring the challenges associated with data quality and heterogeneity. Central to the discussion was the role of schema mapping in data alignment, proving indispensable for linking diverse data sources. Recent advancements, notably the adoption of machine learning techniques, were significantly reshaping the landscape. The paper also addressed current challenges, including the integration of new technologies and the necessity for effective schema matching solutions, highlighting the continuously evolving nature of schema matching in the context of Big Data.
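
As a concrete illustration of the schema-matching step discussed in this overview, the sketch below scores attribute-name similarity between two hypothetical source schemas using Python's difflib; the matchers surveyed in the literature also exploit data types, instance values, and machine-learned models.

```python
# Toy name-based schema matcher over two hypothetical source schemas.
from difflib import SequenceMatcher

source_schema = ["cust_id", "cust_name", "birth_date", "email_addr"]
target_schema = ["customer_id", "full_name", "date_of_birth", "email"]

def best_matches(source, target, threshold=0.5):
    matches = []
    for s in source:
        # Score every target attribute and keep the best one above the threshold.
        scored = [(SequenceMatcher(None, s.lower(), t.lower()).ratio(), t) for t in target]
        score, t = max(scored)
        if score >= threshold:
            matches.append((s, t, round(score, 2)))
    return matches

for s, t, score in best_matches(source_schema, target_schema):
    print(f"{s:12s} -> {t:15s} (similarity {score})")
```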

By Oumaima El Haddadi, Max Chevalier, Bernard Dousset, Ahmad El Allaoui, Anass El Haddadi, Olivier Teste
