Research Article

A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD)

Year 2022, Volume 10, Issue 2, 49 - 56, 30.06.2022
https://doi.org/10.18100/ijamec.1080843

Abstract

The main motivation of this study is to prevent and optimize deviations in linear connections using calculations that take the preceding and following steps into account. This serves the more stable detection, and therefore segmentation, of object edge/corner regions in the image-processing and artificial-intelligence quality-control systems developed by the authors at Alpplas Industrial Investments Inc. The dataset used here was originally obtained from the edge approximations of plastic panels manufactured by Alpplas Inc., extracted from images taken by the AlpVision Quality Control Machine patented alongside this research; it consists entirely of the pixel values of the edge points. Dispersed numeric datasets have highly variable values, create high complexity, and require demanding correlation computations. In this study, dispersed numeric data are optimized by fitting them to a line. The LFLD (Linear Fitting on Locally Deflection) algorithm was developed to solve this linear-fitting problem: dispersed numeric data can be regularized and rendered linear, that is, smoothed from a curved line or fitted to a line within desired tolerance values. The LFLD algorithm organizes complex data into a regular, fitted linear line according to the chosen tolerance values.
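The paper's full LFLD procedure is not reproduced on this page. As a minimal sketch of the general idea of tolerance-based line fitting, assuming an ordinary least-squares fit in which points whose deflection from the fitted line exceeds a tolerance are snapped onto the line (the function name, snapping rule, and example data are illustrative, not the authors' published method):

```python
def fit_line_with_tolerance(points, tol=1.0):
    """Least-squares line fit with deflection snapping.

    Simplified illustration (not the published LFLD algorithm):
    points whose vertical deviation from the fitted line exceeds
    `tol` are projected onto the line; the rest stay unchanged.
    """
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    # Ordinary least squares: slope = cov(x, y) / var(x)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in points)
             / sum((x - mean_x) ** 2 for x, _ in points))
    intercept = mean_y - slope * mean_x
    fitted = []
    for x, y in points:
        y_line = slope * x + intercept
        # Snap only the points that deflect beyond the tolerance
        fitted.append((x, y_line if abs(y - y_line) > tol else y))
    return slope, intercept, fitted

# Example: a mostly linear pixel edge with one large deflection
edge = [(0, 0.0), (1, 1.1), (2, 1.9), (3, 9.0), (4, 4.1)]
m, b, cleaned = fit_line_with_tolerance(edge, tol=2.0)
```

In this sketch the outlier at x = 3 is pulled back onto the fitted line while near-linear points keep their original values, mirroring the abstract's idea of regularizing dispersed data by a desired tolerance.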

References

  • [1] E. Alpaydın, “Adaptive Computation and Machine Learning”. Introduction to Machine Learning, 3rd ed., MIT Press, Cambridge, MA, 2014, pp. 79.
  • [2] Y. W. Chang et al., “Training and testing low-degree polynomial data mappings via linear svm”, J. Mach. Learn. Res., 11, pp. 1471–1490, 2010.
  • [3] W. S. Cleveland, “Robust locally weighted regression and smoothing scatterplots”, Journal of the American Statistical Association, 74(368), pp. 829–836, 1979.
  • [4] W. S. Cleveland and S. J. Devlin, “Locally weighted regression: An approach to regression analysis by local fitting”, Journal of the American Statistical Association, 83(403), pp. 596–610, 1988.
  • [5] D. Freedman, “Statistical Models: Theory and Practice”, Cambridge University Press, August 2005.
  • [6] H. L. Seal, “Studies in the history of probability and statistics. xv: The historical development of the gauss linear model”, Biometrika, 54(1/2), pp. 1–24, 1967.
  • [7] R. Caruana and V. R. de Sa, “Benefitting from the Variables that Variable Selection Discards”, J. Mach. Learn. Res., 3, pp. 1245–1264, 2003.
  • [8] V. N. Vapnik, “Conditions for Consistency of Empirical Risk Minimization Principle”. Statistical Learning Theory, Wiley Interscience Publication, 1998, pp. 82.
  • [9] C. Chatfield, “Non-linear and non-stationary time series analysis”. M. B. Priestley, Academic Press, London, 1989, pp. 237.
  • [10] H. J. Seltman, “Simple Linear Regression”. Experimental Design and Analysis, Carnegie Mellon University, 2013, pp.227.
  • [11] Y. Dodge, “Simple Linear Regression”. The Concise Encyclopedia of Statistics, Springer New York, 2008, pp.491-497.
  • [12] D. M. Lane, “Introduction to Linear Regression”. Introduction to Statistics, David Lane Rich University, 2008, pp.462.
  • [13] K. H. Zou, K. Tuncali, and S. G. Silverman, “Correlation and simple linear regression”, Radiology, 227(3), pp. 617–622, 2003.
  • [14] N. Altman, M. Krzywinski, “Simple linear regression”, Nat Methods, 12(11), pp. 999-1000, 2015.
  • [15] I. Sereda et al., “Segmentation by Neural Networks: Errors and Correction”, Computing Research Repository (CoRR), 2018.
  • [16] W. S. Cleveland and S. J. Devlin, “Locally Weighted Regression: An Approach to Regression Analysis by Local Fitting”, Journal of the American Statistical Association, 83(403), pp. 596-610, 1988.
  • [17] J. Canny, “A Computational Approach to Edge Detection”, IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-8(6), pp. 679–698, 1986.
  • [18] Kaggle, Linear Regression: Signal and Noise, USA [Online]. Available: https://www.kaggle.com/competitions/inclass-signal-and-noise/data, Accessed on: May 23, 2022.
There are 18 citations in total.

Details

Primary Language English
Subjects Engineering
Journal Section Research Article
Authors

Mahmut Sami Yasak 0000-0003-4444-161X

Muhammed Said Bilgehan 0000-0002-8706-1943

Publication Date June 30, 2022
Published in Issue Year 2022

Cite

APA Yasak, M. S., & Bilgehan, M. S. (2022). A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD). International Journal of Applied Mathematics Electronics and Computers, 10(2), 49-56. https://doi.org/10.18100/ijamec.1080843