Detecting Defected Crops: Precision Agriculture Using Haar Classifiers and UAV

Altınbaş, M.D. | Serif, T.

Conference Object | 2019 | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11673 LNCS, pp. 27-40

According to recent studies, the world’s population has doubled since 1960. Furthermore, some projections indicate that the world’s population could exceed ten billion in the second half of this century. As the world becomes increasingly crowded, the need for resources is ever growing. It appears that natural resources will be depleted at three times the current rate by mid-century. People will not only consume more resources but will also need more agricultural produce for their everyday life. Hence, in order to meet the ever-increasing demand for farming products, yield should be maximized using top-end technologies. Precision agriculture is the application of technologies and methods to obtain data-driven crop management of farmland. In the mid-1980s, precision farming techniques were initially used for sensor-based soil analysis and have since evolved into advanced applications that make use of satellites, handheld devices and aerial vehicles. Drones, commonly referred to as unmanned aerial vehicles (UAVs), have been extensively adopted in precision farming. Consequently, in the last two decades, 80 to 90% of precision farming operations have employed UAVs. Accordingly, this paper proposes a prototype UAV-based solution, which can be used to hover over tomato fields, collect visual data and process it to establish meaningful information that can be used by farmers to maximize their crop. Furthermore, the findings showed that the proposed system was a viable solution and identified defected tomatoes with a success rate of 90%. © 2019, Springer Nature Switzerland AG
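
The abstract describes Haar-classifier detection on UAV imagery. The following is a minimal, illustrative sketch of how a Haar cascade is typically applied to a single aerial frame with OpenCV; the cascade file name and input frame are hypothetical placeholders, not the authors' model or data.

```python
# Minimal sketch of Haar-cascade detection on a UAV frame (not the authors' code).
# Assumes a cascade trained on defected-tomato samples exists at the hypothetical
# path "defected_tomato_cascade.xml" and that an aerial still image is on disk.
import cv2

cascade = cv2.CascadeClassifier("defected_tomato_cascade.xml")  # hypothetical model file
frame = cv2.imread("uav_frame.jpg")                             # hypothetical UAV still
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Multi-scale sliding-window detection, the standard way Haar cascades are applied.
detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Draw a box around each flagged region and save the annotated frame.
for (x, y, w, h) in detections:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)

cv2.imwrite("annotated_frame.jpg", frame)
print(f"{len(detections)} suspect regions flagged")
```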

An expert system to predict warfarin dosage in Turkish patients depending on genetic and non-genetic factors

Altay, O. | Ulas, M. | Ozer, M. | Genc, E.

Conference Object | 2019 | 7th International Symposium on Digital Forensics and Security, ISDFS 2019, pp. 27-40

Warfarin, a vitamin K antagonist, is one of the most widely used oral anticoagulants worldwide. Genetic factors affecting warfarin (CYP2C9, CYP4F2 and VKORC1) have been identified in different studies. Apart from genetic factors, the effects of age, height, weight and bleeding condition have also been demonstrated. Prescribing warfarin in the wrong dose leads to irreparable harm to patients. The amount of warfarin a patient has to take is determined with an INR machine, and this takes a lot of time. Since dose estimation takes a long time with conventional methods, the use of data mining algorithms has been proposed for prediction of the warfarin dose. In this paper, unlike previous studies, the amount of warfarin is determined by classification rather than numeric prediction, and better accuracy rates than previously reported were obtained. Using data obtained from Turkish patients, the dose range required for a patient's daily warfarin dose was classified with Bayesian and K-Nearest Neighbor (KNN) algorithms. The accuracy of this study was calculated as 59.01% with the Bayesian algorithm and 50.52% with the KNN algorithm. © 2019 IEEE
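
The abstract reports classifying the daily dose range with Bayesian and KNN algorithms. Below is a minimal sketch of KNN dose-range classification using scikit-learn; the feature encoding, class labels and sample values are illustrative assumptions, not the study's dataset or pipeline.

```python
# Minimal sketch of dose-range classification with KNN (scikit-learn), not the
# authors' pipeline. Feature columns follow the abstract (age, height, weight,
# genotypes); the values and labels below are illustrative placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Columns: age, height (cm), weight (kg), CYP2C9 genotype code, VKORC1 genotype code
X = np.array([
    [64, 172, 80, 1, 0],
    [55, 160, 65, 0, 1],
    [70, 168, 77, 2, 0],
    [48, 181, 90, 0, 0],
    [62, 158, 70, 1, 1],
    [59, 175, 85, 0, 2],
])
y = np.array(["low", "medium", "low", "high", "medium", "high"])  # dose-range classes

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
print("held-out accuracy:", knn.score(X_test, y_test))
```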

Application of machine learning techniques on prediction of future processor performance

Inal, G. | Kucuk, G.

Conference Object | 2018 | Proceedings - 2018 6th International Symposium on Computing and Networking Workshops, CANDARW 2018, pp. 190-195

Today, processors utilize many datapath resources with various sizes. In this study, we focus on single-thread microprocessors and apply machine learning techniques to predict a processor's future performance trend by collecting and processing processor statistics. This type of performance prediction can be useful for many ongoing computer architecture research topics. Today, these studies mostly rely on history- and threshold-based prediction schemes, which collect statistics and decide on new resource configurations depending on the results of those threshold conditions at runtime. The proposed offline training-based machine learning methodology is an orthogonal technique, which may further improve the performance of such existing algorithms. We show that our neural-network-based prediction mechanism achieves around 70% accuracy for predicting the performance trend (gain or loss in the near future) of applications. This is a noticeably better result compared to the accuracy obtained by naïve history-based prediction models. © 2018 IEEE
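
The abstract describes an offline-trained neural network that labels the near-future performance trend (gain or loss) from processor statistics. The following is a minimal sketch of that idea using a small scikit-learn MLP; the counter features and values are hypothetical placeholders, not the authors' training data or model.

```python
# Minimal sketch of an offline-trained neural network that predicts the near-future
# performance trend (gain vs. loss) from hardware-counter statistics; not the
# authors' model. Feature names and values below are illustrative placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical per-interval features: IPC, L2 miss rate, branch misprediction rate
X_train = np.array([
    [1.8, 0.02, 0.01],
    [0.9, 0.15, 0.06],
    [1.5, 0.05, 0.02],
    [0.7, 0.20, 0.08],
    [1.2, 0.08, 0.03],
    [0.6, 0.25, 0.09],
])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = performance gain expected, 0 = loss

model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# At runtime, the trained weights would be queried with freshly sampled counters.
next_interval = np.array([[1.1, 0.10, 0.04]])
print("predicted trend:", "gain" if model.predict(next_interval)[0] else "loss")
```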

A machine learning approach for a scalable, energy-efficient utility-based cache partitioning

Guney, I.A. | Yildiz, A. | Bayindir, I.U. | Serdaroglu, K.C. | Bayik, U. | Kucuk, G.

Conference Object | 2015 | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 9137 LNCS, pp. 409-421

In multi- and many-core processors, a shared Last Level Cache (LLC) is utilized to alleviate the performance problems resulting from long-latency memory instructions. However, an unmanaged LLC may become quite useless when the running threads have conflicting interests. At one extreme, a thread can benefit from every portion of the cache whereas, at the other end, another thread may just want to thrash the whole LLC. Recently, a variety of way-partitioning mechanisms have been introduced to improve cache performance. Today, almost all of these studies utilize the Utility-based Cache Partitioning (UCP) algorithm as their allocation policy. However, the UCP look-ahead algorithm, although it provides a better utility measure than its greedy counterpart, requires very complex hardware circuitry and dissipates a considerable amount of energy at the end of each decision period. In this study, we propose an offline supervised machine learning algorithm that replaces the UCP look-ahead circuitry with circuitry requiring almost negligible hardware and energy cost. Depending on the cache and processor configuration, our thorough analysis and simulation results show that the proposed mechanism reduces up to 5% of the overall transistor count and 5% of the overall processor energy without introducing any performance penalty. © Springer International Publishing Switzerland 2015
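
The abstract contrasts the UCP look-ahead policy with its greedy counterpart, both of which allocate cache ways from per-thread utility curves; the paper's contribution is a learned predictor that replaces the look-ahead circuit. For context only, below is a minimal sketch of the greedy utility-based way allocation in plain Python; the hit-count curves are illustrative and this is neither the authors' circuit nor their learned model.

```python
# Minimal sketch of greedy utility-based way allocation, the simpler counterpart
# of the UCP look-ahead policy mentioned in the abstract; not the authors' design.
def allocate_ways(hit_curves, total_ways):
    """hit_curves[i][w] = extra hits thread i gains from its (w+1)-th way."""
    alloc = [0] * len(hit_curves)
    for _ in range(total_ways):
        # Give the next way to the thread with the largest marginal utility.
        best = max(
            range(len(hit_curves)),
            key=lambda i: hit_curves[i][alloc[i]] if alloc[i] < len(hit_curves[i]) else -1,
        )
        alloc[best] += 1
    return alloc

# Two threads sharing an 8-way LLC: thread 0 saturates quickly, thread 1 keeps gaining.
curves = [
    [900, 300, 50, 10, 5, 2, 1, 0],             # thread 0 marginal hits per extra way
    [400, 380, 350, 300, 250, 200, 150, 100],   # thread 1 marginal hits per extra way
]
print(allocate_ways(curves, total_ways=8))  # -> [2, 6]
```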
