Heart disease is one of the leading causes of death worldwide, making early detection and diagnosis essential for effective treatment. With advancements in machine learning (ML) and artificial intelligence (AI), these technologies are increasingly applied in the medical field, particularly for detecting and predicting heart disease. As AI systems become more complex, it becomes important to distinguish between abstracts generated by AI algorithms and those prepared by human experts. This study aims to develop and assess ML approaches for distinguishing between human-written and AI-generated (ChatGPT and NLTK) heart disease abstracts. Using a dataset of 15,000 abstracts (5,000 written by humans, 5,000 reworded by ChatGPT, and 5,000 generated using NLTK), various Natural Language Processing (NLP) techniques, such as tokenization, stop-word removal, stemming, and lemmatization, were applied. The text data was transformed into numerical form using TF-IDF vectorization. Several ML models, including K-nearest neighbors (KNN), support vector machines (SVMs), logistic regression, random forest, and decision tree classifiers, were trained and tested for classification accuracy. This study highlights the significant potential of ML techniques for ensuring transparency and reliability in AI-driven medical decision-making, especially in the area of heart disease diagnosis.
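The TF-IDF vectorization step named above can be sketched in pure Python; this is a minimal illustration of the weighting scheme, not the study's pipeline, and the toy token lists are hypothetical stand-ins for preprocessed abstracts.

```python
import math
from collections import Counter

def tfidf(docs):
    """Compute TF-IDF weights for pre-tokenized documents.
    TF = term count / document length; IDF = log(N / document frequency)."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))          # count each term once per document
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

# Hypothetical token lists standing in for preprocessed abstracts
docs = [["heart", "disease", "risk"],
        ["heart", "model", "accuracy"],
        ["language", "model", "abstract"]]
vecs = tfidf(docs)
```

A term shared across documents ("heart") receives a lower weight than a term unique to one document ("disease"), which is exactly the discriminative signal the downstream classifiers rely on.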
Soft set theory, as a mathematically rigorous and algebraically expressive formalism, offers a powerful framework for modeling uncertainty, vagueness, and parameter-driven variability. Within this landscape, the present study introduces the soft symmetric difference complement-intersection product, a novel binary operation defined over soft sets whose parameter domains are endowed with a group-theoretic structure. Developed within a strict axiomatic foundation, the operation is proven to satisfy fundamental algebraic properties—such as closure, associativity, commutativity, and idempotency—while maintaining consistency with generalized notions of soft equality and subsethood. Its behavior is thoroughly analyzed with respect to identity and absorbing elements, as well as interactions with null and absolute soft sets, all within the constraints of group-parameterized domains. The findings confirm that the proposed operation forms a coherent and structurally robust algebraic system, thereby enriching the algebraic architecture of soft set theory. Furthermore, this work provides a foundational step toward the formulation of a generalized soft group theory, in which soft sets indexed by group-based parameters emulate classical group behaviors through abstract soft operations. The operation’s full integrability within soft inclusion hierarchies and its alignment with generalized soft equalities highlight its theoretical depth and broaden its potential applications in formal decision-making and algebraic modeling under uncertainty.
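As background for readers unfamiliar with the formalism, the standard (Molodtsov-style) soft set definition and the classical symmetric difference that the proposed product generalizes can be stated as follows; the paper's novel operation itself is not reproduced here.

```latex
% A soft set over a universe $U$ with parameter set $E$ is a pair $(F, A)$:
\[
(F, A) \text{ is a soft set over } U
\iff A \subseteq E \text{ and } F : A \to \mathcal{P}(U),
\]
% i.e.\ each parameter $a \in A$ selects an approximate subset $F(a)$ of $U$.
% The classical symmetric difference, the set-theoretic ingredient
% underlying the proposed product:
\[
X \,\triangle\, Y = (X \setminus Y) \cup (Y \setminus X).
\]
```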
Graph theory is a branch of mathematics founded in the 18th century by the renowned mathematician Leonhard Euler, who solved the 'Seven Bridges of Königsberg' problem. Over time, graph theory has been used to solve problems in various scientific fields, introducing many new concepts into this domain. Many measures have been defined for various purposes in graph theory; among these are indices, two of which are the Wiener index and the degree distance. It is known that the Wiener index of a molecular graph correlates with certain physical and chemical properties of the molecule. Both indices are based on the concept of distance. In this article, results for the Wiener and degree distance indices on the basic graph types (path, star, cycle, wheel, complete, and complete bipartite graphs) are presented using inductive and deductive methods. Furthermore, the minimum and maximum values, as well as the bounds, of these indices are established. These results introduce a new approach to indices that have long been studied and widely applied.
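Both indices mentioned above are distance-based and can be computed directly from their standard definitions, W(G) = Σ d(u,v) and DD(G) = Σ (deg(u)+deg(v))·d(u,v) over unordered vertex pairs. A minimal BFS-based sketch on a path graph, using the known closed form W(P_n) = (n³ − n)/6 as a check:

```python
from collections import deque

def distances(adj, src):
    """BFS shortest-path distances from src in an unweighted graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def wiener_and_degree_distance(adj):
    """W(G): sum of d(u,v) over unordered pairs;
    DD(G): sum of (deg(u)+deg(v)) * d(u,v) over unordered pairs."""
    W = DD = 0
    nodes = sorted(adj)
    for i, u in enumerate(nodes):
        dist = distances(adj, u)
        for v in nodes[i + 1:]:
            W += dist[v]
            DD += (len(adj[u]) + len(adj[v])) * dist[v]
    return W, DD

# Path graph P4 (0-1-2-3); W(P_n) = (n^3 - n)/6 gives W(P4) = 10
p4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
W, DD = wiener_and_degree_distance(p4)  # W = 10, DD = 28
```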
Natural disasters, particularly forest fires, significantly impact societies by causing loss of life and property. Effective crisis management, encompassing disaster response and subsequent mitigation efforts, is critically important. This paper, drawing upon an Artificial Intelligence (AI) based Decision Support System (DSS) developed for natural disasters in Turkey, focuses specifically on forest fire management. The system utilizes historical fire data and machine learning (ML) techniques to predict the impacts of fires, enhance decision-making processes, and provide timely, accurate information to decision-makers. Data on forest fires in Turkey, primarily from the satellite-based NASA FIRMS dataset together with atmospheric reanalysis data from ECMWF ERA5, were analyzed and interpreted. Preprocessing steps, including data cleaning and feature extraction, were applied. An XGBoost classification model was developed and evaluated for fire risk prediction, demonstrating high performance in identifying fire-prone regions and their potential intensity. The developed AI-based system determines provincial risk scores, aiming for effective resource allocation for natural disasters. The model was evaluated using performance metrics such as accuracy, precision, and F1 score. The system culminates in a user-friendly prototype, the Turkey Disaster Management System (TDMS), offering risk-based resource allocation simulations and AI-supported reporting for proactive fire management.
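The evaluation metrics named above (accuracy, precision, F1) have simple closed forms; a minimal sketch of how they are computed from binary fire-risk labels, using hypothetical labels and predictions rather than the paper's XGBoost outputs:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def precision_recall_f1(y_true, y_pred, positive=1):
    """Precision = TP/(TP+FP), recall = TP/(TP+FN), F1 = harmonic mean."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical labels (1 = fire-prone province) and model predictions
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
acc = accuracy(y_true, y_pred)                      # 0.75
prec, rec, f1 = precision_recall_f1(y_true, y_pred)  # all 0.75 here
```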
This study introduces a new winding method for Helmholtz coils that solves two key problems: it maintains magnetic field uniformity even when the number of turns is increased, and it optimizes coil design under size constraints. Traditional coils lose uniformity as turns are added, because the additional windings shift the effective radius-to-separation ratio away from the ideal Helmholtz condition. Our method employs a truncated conical winding approach to keep this ratio optimal, allowing for more turns without sacrificing field quality. We also optimize coil thickness to maximize usable space while maintaining high field strength. Simulations show that our technique significantly improves field uniformity and intensity compared to traditional coils, making it well suited to applications requiring compact, high-performance magnetic fields, such as medical imaging and particle accelerators operating under spatial limitations.
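The uniformity claim rests on the classical Helmholtz condition: for two coaxial loops of radius R separated by distance d, the on-axis field is flattest when d = R. A minimal sketch of the on-axis Biot–Savart field (in units of μ₀I/2, single-turn idealization, not the paper's conical winding model) makes the flatness easy to verify numerically:

```python
def axial_field(z, R=1.0, d=1.0):
    """On-axis field (units of mu0*I/2) of two coaxial current loops of
    radius R, centered at z = -d/2 and z = +d/2.
    Helmholtz condition: d = R."""
    def loop(z0):
        return R**2 / (R**2 + (z - z0)**2) ** 1.5
    return loop(-d / 2) + loop(d / 2)

# With d = R, the field is extremely flat near the midpoint:
center = axial_field(0.0)
offset = axial_field(0.1)            # 10% of the radius off-center
ripple = abs(offset - center) / center
```

Even 10% of the radius away from the center, the relative field deviation is below 0.1%; increasing the ratio d/R away from 1 (as stacking turns effectively does) rapidly degrades this flatness.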
The rapid proliferation of electric and hybrid vehicles has become one of the fundamental components of Turkey's sustainable transportation policies. This trend necessitates reliable future projections for energy infrastructure planning, charging station placement, environmental impact reduction, and transformation within the automotive sector. In this study, time series of the numbers of electric vehicles (EVs) and hybrid vehicles (HVs) registered monthly in Turkey between 2020M01 and 2025M10 were analyzed, and a forecasting model was developed for the monthly periods of 2026. The Autoregressive Integrated Moving Average (ARIMA) model and machine learning algorithms such as Prophet, LSTM, and SVR were employed. The findings show that the machine-learning-based models produce lower prediction errors than ARIMA, with SVR and Prophet performing best. The monthly forecasts for 2026 indicate that EV registrations in Turkey will continue a strong and accelerating growth trend in the coming period, while the growth rate of HV registrations will slow significantly, showing a more limited increase or a stagnant trend.
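The forecasting workflow described above (fit a model on a monthly history, extrapolate forward) can be illustrated with a deliberately simple least-squares trend baseline; this is far cruder than ARIMA, Prophet, LSTM, or SVR, and the monthly counts below are hypothetical, not the study's registration data.

```python
def linear_trend_fit(y):
    """Ordinary least squares fit of y_t = a + b*t: a minimal trend
    baseline illustrating the fit-then-extrapolate forecasting step."""
    n = len(y)
    t_mean = (n - 1) / 2
    y_mean = sum(y) / n
    b = (sum((t - t_mean) * (v - y_mean) for t, v in enumerate(y))
         / sum((t - t_mean) ** 2 for t in range(n)))
    a = y_mean - b * t_mean
    return a, b

def forecast(y, horizon):
    """Extrapolate the fitted trend `horizon` steps past the history."""
    a, b = linear_trend_fit(y)
    return [a + b * (len(y) + h) for h in range(horizon)]

# Hypothetical monthly EV registration counts (illustrative only)
history = [120, 150, 175, 210, 240, 265, 300, 330]
next_quarter = forecast(history, 3)   # three months ahead
```

In practice, a model comparison like the study's would hold out the final months of the series, score each candidate model's forecasts against them (e.g. by RMSE), and select the lowest-error model for the 2026 projections.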