This paper investigates the use of the Salp Swarm Algorithm (SSA), a bio-mimetic optimization technique, to improve path planning for Unmanned Ground Vehicles (UGVs). Because efficient and reliable path planning is crucial to deploying UGVs in sectors such as the military, rescue operations, and agriculture, algorithms capable of navigating complex environments are needed. SSA, inspired by the natural swarming behavior of salps, is a promising approach characterized by a balance of exploration and exploitation. This study evaluates the performance of SSA relative to the established Particle Swarm Optimization (PSO) algorithm in terms of path optimality, computational efficiency, and adaptability to dynamic obstacles, across a number of simulated environments. Results show that SSA can compete with traditional algorithms in path efficiency and computational load, although PSO retains a slight edge over SSA. The study highlights the potential of bio-inspired algorithms, specifically SSA, to advance autonomous navigation for UGVs, and it points to practical applications of SSA in real-life scenarios by demonstrating its scalability and resilience. The findings contribute to the broader discussion on improving autonomous route planning and suggest a path toward more sustainable and effective UGV operations.
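As a rough illustration of the kind of objective a swarm optimizer such as SSA or PSO minimizes in this setting (not the authors' actual model), a candidate path can be scored by its length plus a penalty for entering obstacles; the obstacle positions, radii, and penalty weight below are invented example values.

```python
import math

# Hypothetical obstacle field: (center, radius) pairs, values invented.
OBSTACLES = [((4.0, 4.0), 1.0), ((7.0, 2.0), 1.0)]

def path_cost(waypoints, penalty=100.0):
    """Polyline length plus a penalty for waypoints inside obstacles."""
    length = sum(
        math.dist(waypoints[i], waypoints[i + 1])
        for i in range(len(waypoints) - 1)
    )
    violation = 0.0
    for (cx, cy), r in OBSTACLES:
        for (x, y) in waypoints:
            d = math.hypot(x - cx, y - cy)
            if d < r:                       # waypoint lies inside this obstacle
                violation += r - d
    return length + penalty * violation

clear_path    = [(0.0, 0.0), (5.0, 3.5), (10.0, 7.0)]  # avoids both obstacles
blocked_path  = [(0.0, 0.0), (4.0, 4.0), (10.0, 7.0)]  # hits obstacle center
```

A swarm algorithm would then evolve waypoint coordinates so that `path_cost` decreases, naturally trading path length against obstacle clearance.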
In this paper, the weight of a vehicle door is minimized using a recently developed metaheuristic algorithm known as the Grey Wolf Optimizer (GWO). GWO shows good and robust performance on optimization applications. It is a nature-inspired algorithm modeled on the behavior of grey wolves as they hunt and catch prey, and it is known for its simple yet efficient structure. The design of a car door under impact, in turn, is a multi-objective optimization problem based on the European Enhanced Vehicle-Safety Committee standard: the algorithm must minimize the door weight while satisfying several constraints. The design depends on 11 parameters, including the B-pillar inner, B-pillar reinforcement, floor side inner, cross members, door beam, door beltline reinforcement, and roof rail; the materials of the B-pillar inner and floor side inner; the barrier height; and the hitting position. Monte Carlo simulation is used to test the method's robustness and stability.
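For readers unfamiliar with GWO, the sketch below shows the standard alpha/beta/delta position update on a stand-in sphere objective. This is only a minimal illustration under assumed settings (objective, bounds, pack size, and iteration count are all invented), not the authors' door-weight model.

```python
import random

random.seed(0)

def sphere(x):
    """Toy objective standing in for the real door-weight function."""
    return sum(v * v for v in x)

def gwo(obj, dim=2, wolves=20, iters=100, lo=-10.0, hi=10.0):
    pack = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(wolves)]
    for t in range(iters):
        pack.sort(key=obj)
        alpha, beta, delta = pack[0], pack[1], pack[2]   # three best wolves lead
        a = 2.0 - 2.0 * t / iters                        # decays 2 -> 0 over time
        for i in range(wolves):
            new = []
            for d in range(dim):
                pull = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = random.random(), random.random()
                    A = 2 * a * r1 - a                   # exploration coefficient
                    C = 2 * r2
                    D = abs(C * leader[d] - pack[i][d])  # distance to leader
                    pull += leader[d] - A * D
                new.append(min(hi, max(lo, pull / 3.0))) # average of three pulls
            pack[i] = new
    return min(pack, key=obj)

best = gwo(sphere)
```

As `a` decays, the update shifts from exploration (large `|A|`) to exploitation around the three best wolves, which is the core of GWO's simple structure.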
In this paper, the Chameleon Swarm Algorithm (CSA), a new metaheuristic algorithm, is used to design a speed-reducer gearbox. The gearbox is used in an autonomous vehicle, and the design must take into consideration the total weight of the gear sets along with other constraints, including the bending and surface stresses of the gears, the stresses of the shafts, and the deflections of the shafts. The algorithm must find the optimal solution while satisfying eleven constraints, which makes this a multi-objective optimization problem; as the design approaches the optimum, energy dissipation in the gears is minimized. The algorithm is tested with Monte Carlo simulation to show its stability and robustness in such an application. Computational resources are examined, and the results are compared to Particle Swarm Optimization (PSO). The proposed method shows better results and better performance than PSO.
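One common way a metaheuristic like CSA handles the eleven constraints is to fold them into a single fitness value with a static penalty. The sketch below is only an illustration of that idea; the weights and constraint values are invented placeholders, not the real gearbox equations.

```python
# Static-penalty constraint handling: feasible when every g(x) <= 0.
# All numbers below are made-up example values.

def penalized_fitness(weight, constraints, penalty=1e6):
    """Design weight plus a large penalty for any violated constraint."""
    violation = sum(max(0.0, g) for g in constraints)
    return weight + penalty * violation

feasible   = penalized_fitness(3000.0, [-0.1, -0.5, 0.0])   # all g <= 0
infeasible = penalized_fitness(2900.0, [-0.1, 0.2, -0.3])   # one g > 0
```

With a sufficiently large penalty, a lighter but infeasible design always scores worse than a heavier feasible one, steering the swarm toward the feasible region.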
Path planning and obstacle avoidance are crucial tasks in the robotics and autonomous-vehicle industry. Path planning seeks to determine the most efficient path between a start and an end point, whereas obstacle avoidance seeks to prevent collisions with static or dynamic obstacles in the environment. In this work, we utilize the Chameleon Swarm Algorithm (CSA), a metaheuristic approach, for path planning and obstacle avoidance on a predetermined map with static obstacles. The CSA extracts the optimal path from several candidate paths, and the results show that it performs slightly better than PSO.
Recent interest in unmanned aerial vehicles (UAVs) has grown due to the wide range of possible civilian uses for these aircraft. However, present robot navigation technologies still need improvement in various situations. Researchers are particularly interested in the 'Sense and Avoid' capability as a critical issue: UAVs operating in civilian areas must have this functionality to operate safely. Numerous path-planning and navigation algorithms have been developed for autonomous decision-making and control of UAVs. These path-planning algorithms divide into heuristic methods on the one hand and non-heuristic, or exact, methods on the other. This work thoroughly compares existing UAV route-planning algorithms from both categories. Each algorithm is put through its paces in three diverse obstacle scenarios and evaluated under various conditions of global and local obstacle-information availability, comparing computational time and solution optimality.
The most prevalent kind of cardiovascular illness is a heart attack, which may or may not present symptoms. The damage to the heart muscle increases with delayed treatment, which increases the risk of mortality. More than 10 million people die each year from heart attacks, and many of these deaths could be avoided if heart attacks could be accurately predicted. To estimate the likelihood of suffering a heart attack, five different machine learning algorithms are applied to the Public Health Heart Attack dataset. Several evaluation metrics, including accuracy, recall, precision, ROC curve, and F-score, were used to evaluate the models. All the models (MLP, RBF, SVM, KNN, and RF) achieved significant accuracies of more than 75%, with KNN delivering the best overall performance.
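The evaluation metrics named above all follow from the binary confusion matrix. The sketch below computes them from scratch; the counts are invented example values, not results from the dataset.

```python
# Metrics from a binary confusion matrix (tp/fp/fn/tn counts are invented).

def metrics(tp, fp, fn, tn):
    accuracy  = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)                  # of predicted positives, how many real
    recall    = tp / (tp + fn)                  # of real positives, how many found
    f_score   = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f_score

acc, prec, rec, f1 = metrics(tp=80, fp=10, fn=20, tn=90)
```

Reporting recall alongside accuracy matters here because missing a true heart-attack case (a false negative) is far costlier than a false alarm.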
In this work, a well-known mechanism referred to as the piston-lever mechanism is designed to control the wing flap. The design targets the mechanism components and their locations while maintaining the minimum oil level required to lift the flaps from 0 to 45 degrees. The design is a multi-objective optimization problem, for which we propose the Giza Pyramid Algorithm (GPA), an ancient-inspired metaheuristic technique developed in 2021. The GPA must obtain the best design while satisfying four constraints. The algorithm is tested with Monte Carlo simulation to show its effectiveness in terms of stability and robustness, and its performance is then compared to Particle Swarm Optimization. The proposed method shows superior results compared to PSO. Keywords: Giza pyramid, metaheuristic, optimization, design.
The COVID-19 pandemic forced governments to adopt worldwide lockdowns in order to limit the virus's spread. Wearing a face mask is reported to reduce the possibility of transmission. Due to the growing urban population, proper city management is more important than ever for reducing the impact of COVID-19 infection. Manually checking masks in public places, however, would cause incredibly long lineups and delays, so an autonomous mask-detection system is needed to assess whether someone is wearing a face mask. Three different machine learning methods are applied to the face mask dataset to determine the likelihood of wearing a face mask. The models were assessed using a number of measures, including accuracy, recall, and ROC curve. The main objective of the study is to detect the presence of face masks using deep learning, machine learning, and image processing approaches. All three models (NB, KNN, and CNN) achieved noteworthy accuracies of more than 80%, with CNN showing the best overall performance.
Heart disease is ranked the first cause of death in the world, and Australia has the highest incidence of heart disease: approximately 125 lives are lost every single day, that is, one life every 12 minutes. Heart disease describes a range of conditions that affect the heart or blood vessels and can affect anyone at any age. A major concern is that heart disease can cause a heart attack or stroke. Symptoms may include chest pain, shortness of breath, dizziness, fatigue, or nausea, while serious conditions such as diabetes and high cholesterol may lead to heart attacks. A healthy lifestyle, quitting smoking, and exercising are small steps toward avoiding heart disease, and heart disease is easier to treat when detected early. In this paper, an effective heart disease framework is proposed. Support Vector Machine (SVM), Multilayer Perceptron (MLP), Random Forest (RF), and Radial Basis Function (RBF) techniques are used for classification. Moreover, feature selection is performed to reduce the number of features and improve accuracy; the Info Gain Attribute Eval ranker algorithm is used for feature selection. The classification techniques and feature selection algorithms are applied to the LIAC heart statlog dataset, which is derived from the heart diseases dataset. Effectiveness is reported in terms of accuracy, precision, recall, and the ROC curve.
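The information-gain criterion behind the ranker can be sketched in a few lines: it measures how much a feature split reduces the entropy of the class labels. The toy labels and feature columns below are invented, not the heart dataset, and this is only the textbook formulation, not Weka's implementation.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(labels, feature_values):
    """Reduction in label entropy after splitting on a discrete feature."""
    n = len(labels)
    split = {}
    for v, y in zip(feature_values, labels):
        split.setdefault(v, []).append(y)
    remainder = sum(len(part) / n * entropy(part) for part in split.values())
    return entropy(labels) - remainder

labels  = [1, 1, 1, 0, 0, 0]
perfect = ["a", "a", "a", "b", "b", "b"]   # separates the classes exactly
useless = ["a", "b", "b", "a", "b", "b"]   # same class mix in every group
```

Ranking features by `info_gain` and keeping the top scorers is exactly the kind of pruning that lets the classifiers above train on fewer, more informative features.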
Short Messaging Service (SMS) has become an easy, affordable way to communicate and increasingly replaces phone calls. Spam is any kind of unwanted, unsolicited message sent without any authorization from the receiver, and hackers use spam SMS to obtain users' sensitive information. Effective spam detection is an essential tool for helping users determine whether an SMS is spam. Different machine learning methods, including deep learning techniques, have attempted to distinguish between spam and ham SMS texts. This paper proposes a spam-ham classification method using Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) networks. The proposed model utilizes Keras and TensorFlow to detect spam SMS. The dataset used is the SMS Spam Collection from the UCI machine learning repository, which contains 5574 SMS messages. The dataset is preprocessed using tokenization, lemmatization, padding, and stopword removal. The overall accuracy of the proposed model is 98%. The performance of the proposed method is compared with different machine learning algorithms such as Support Vector Machine (SVM), K-nearest neighbors (KNN), and Multi-layer Perceptron (MLP).
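The preprocessing steps named above (tokenization, stopword removal, padding to a fixed length for the RNN input) can be sketched as follows. The stopword list, pad length, and sample message are invented for illustration; a real pipeline would use a full stopword list and a lemmatizer.

```python
# Minimal text-preprocessing sketch (stopwords and pad length are assumptions).
STOPWORDS = {"a", "an", "the", "is", "to", "you"}
PAD_LEN = 6
PAD_ID = 0

def tokenize(text):
    """Lowercase, split on whitespace, and drop stopwords."""
    return [w for w in text.lower().split() if w not in STOPWORDS]

def encode_and_pad(tokens, vocab):
    """Map tokens to integer ids (growing the vocab), then pad/truncate."""
    ids = [vocab.setdefault(tok, len(vocab) + 1) for tok in tokens]
    return (ids + [PAD_ID] * PAD_LEN)[:PAD_LEN]

vocab = {}
seq = encode_and_pad(tokenize("You won a FREE prize, claim now"), vocab)
```

Fixed-length integer sequences like `seq` are what an embedding layer followed by an LSTM actually consumes; padding with a reserved id keeps batch shapes uniform.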
KEYWORDS: Deep learning, Control systems, Education and training, Information technology, Telecommunications, Machine learning, Neural networks, Photonic integrated circuits, Internet of things, Infrared sensors
Cyber-Physical Systems (CPS) security has become essential in industrial fields due to the deployment of CPS in critical infrastructure. These systems continue to grow in complexity and diversity, and as connectivity demands for these systems to communicate with each other increase, their attack surface expands, and so does the impact of cybersecurity on business continuity. Industrial Control Systems (ICS) run real-time critical functions, where firewall inspection delay can cause a process to fail; hence, special firewall considerations must be implemented. Uptime requirements for ICS are extremely high, which rules out routine maintenance or security patching, and any modification to an ICS, such as a firmware update, can trigger revalidation of all interconnected ICS. A perimeter-defense firewall, which inspects and detects ingress traffic, is one of the common strategies to protect these ICS systems; an internal firewall is a further enhancement that also protects against attacks originating within the network. Hence, more efficient attack-detection methods are needed, based on a Deep Learning (DL) approach trained on a good Industrial Internet of Things (IIoT) dataset. This conference paper evaluates a deep learning approach using Bi-directional Long Short-Term Memory (BLSTM) on the recent publicly available "Edge-IIoTset" dataset. This dataset contains realistic IIoT traffic from more than 10 types of sensors and devices used in ICS systems, covering fourteen attacks including DoS/DDoS. In this research paper, we utilize deep learning algorithms (BLSTM) to detect attacks and protect the service availability of Critical Infrastructure (CI) and Industrial Control Systems (ICS) against Denial of Service (DoS) attacks. The research uses this most recent dataset in packet format, compared with flow format, to train our model. Benchmarking with common metrics is used as a baseline to compare algorithm efficiency, where an accuracy of 99.877% was achieved with a validation time of 18 milliseconds.
Breast cancer is the second most common type of cancer diagnosed in women; it is also the leading cause of cancer-related deaths in women after lung cancer. Breast lumps can be classified as cancerous or non-cancerous, and non-cancerous breast lump development is very common in women. It is important to correctly diagnose the type of breast lump in order to administer the correct treatment and give the needed care and attention, and intensive research is being done to improve this diagnosis. In this paper we study different machine learning algorithms for the diagnosis of breast tumors and for predicting whether a tumor is cancerous or non-cancerous. We build four different classification methods: SVM, KNN, RF, and CART, using the Breast Cancer Wisconsin (Diagnostic) dataset to train the models. We evaluate the models using accuracy and other classification metrics. The final model achieved an F1 score of 0.9927.
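As a minimal sketch of one of the four classifiers compared here, the function below implements k-nearest neighbors from scratch with a majority vote. The two-feature toy points are invented stand-ins, not the Wisconsin data, and `k=3` is an assumed setting.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (features, label); majority vote among k nearest points."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Invented two-feature examples forming two well-separated clusters.
train = [((1.0, 1.0), "benign"),    ((1.2, 0.8), "benign"),
         ((0.9, 1.1), "benign"),    ((5.0, 5.0), "malignant"),
         ((5.2, 4.8), "malignant"), ((4.9, 5.3), "malignant")]
```

In practice the Wisconsin features would be standardized first, since Euclidean distance is dominated by whichever feature has the largest scale.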
KEYWORDS: Heart, Machine learning, Detection and tracking algorithms, Data modeling, Binary data, Feature selection, Performance modeling, MATLAB, Data conversion, Medical research
Heart failure (HF) is a common health condition that affects more than 600,000 Americans every year and results in death. Fortunately, machine learning classification, regression, and prediction models are key techniques that can be used to detect and predict cases of heart disease or failure. The study in this paper is based on a dataset containing 918 instances of various medical records, which this research uses to improve heart-failure prediction accuracy. Multiple popular machine learning models were applied to understand the data and provide better predictions, evaluated with different metrics, and the results section shows a better accuracy score compared with related work using different machine learning algorithms and software. RStudio and Weka are used in this paper to run some of the algorithms; these tools assisted in understanding and preprocessing the data, and the best results were obtained with the random forest and logistic regression algorithms.
Diabetes mellitus, also known simply as diabetes, is a medical condition marked by a high blood sugar level over a long period of time. If left untreated, diabetes can result in nerve damage, kidney disease, foot ulcers, eye damage, and in the worst cases, death. The purpose of this study is to examine and compare numerous machine learning algorithms in order to determine the best forecasting algorithm based on various metrics such as accuracy, precision, recall, F-measure, kappa, sensitivity, and specificity. Four machine learning algorithms are investigated in this paper: Random Forest (RF), Support Vector Machine (SVM), K-nearest neighbor (k-NN), and Classification and Regression Trees (CART). The algorithms are used in a comprehensive investigation on a diabetes dataset. The obtained findings suggest that, compared to the other algorithms, RF provides the most accurate predictions.
The state-of-charge (SOC) of Li-ion batteries is an important parameter for regulating and ensuring the safety of batteries, especially in Electric Vehicle applications, where accurate SOC estimation is important for predicting the remaining driving range. The SOC is conventionally obtained through indirect measurement methods that are either impractical or inaccurate. Data-driven approaches to SOC estimation are becoming a popular alternative to the conventional estimation methods due to their accuracy and low complexity. In this work, we apply four machine learning algorithms, Multiple Linear Regression (MLR), Multilayer Perceptron (MLP), Support Vector Regression (SVR), and Random Forest (RF), to predict the SOC using voltage, current, and temperature measurements from mixed driving-cycle datasets with 50,000 instances. Of the four models, the Random Forest model performed best, with an MAE of only 0.82%.
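The simplest of the four approaches, linear regression, can be sketched end to end in a few lines: fit a line by ordinary least squares and report the MAE used above. The voltage-SOC pairs below are synthetic, perfectly linear example values, not a real driving cycle, and a single feature stands in for the full voltage/current/temperature input.

```python
# One-feature ordinary least squares plus MAE (all data values are invented).

def fit_line(xs, ys):
    """Closed-form least-squares slope and intercept for y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def mae(ys, preds):
    """Mean absolute error, the metric reported for the SOC models."""
    return sum(abs(y - p) for y, p in zip(ys, preds)) / len(ys)

voltage = [3.2, 3.4, 3.6, 3.8, 4.0]       # hypothetical cell voltages (V)
soc     = [10.0, 30.0, 50.0, 70.0, 90.0]  # corresponding SOC (%)
m, b = fit_line(voltage, soc)
err = mae(soc, [m * v + b for v in voltage])
```

Real SOC-voltage curves are nonlinear and hysteretic, which is precisely why the nonlinear models (MLP, SVR, RF) outperform plain MLR on driving-cycle data.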