Machine learning and Artificial Intelligence (AI) are all the rage these days, but with all the buzzwords swirling around them, it's easy to get lost and fail to tell hype from reality. For example, just because an algorithm is used to calculate information doesn't mean the label "Machine Learning" or "Artificial Intelligence" should be applied.
Before we can even define AI or Machine Learning, though, I want to take a step back and define a concept that is at the core of both AI and machine learning: algorithm.
An algorithm is a set of rules to be followed when solving a problem. In machine learning, algorithms take in data and perform calculations to find an answer. Those calculations can be very simple or highly complex. An algorithm should deliver the correct answer in the most efficient manner. What good is an algorithm if it takes longer than a human would to analyze the data? What good is it if it provides incorrect information?
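To make that concrete, here is a minimal sketch (not from the article) of two algorithms that answer the same question, "is this value in a sorted list?", with very different efficiency. Both are correct; only one scales well:

```python
def linear_search(items, target):
    """Check every element in turn: correct, but needs up to n comparisons."""
    for item in items:
        if item == target:
            return True
    return False


def binary_search(items, target):
    """Halve the sorted search space each step: correct, in about log2(n) comparisons."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return True
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False


data = list(range(0, 1_000_000, 2))  # sorted even numbers
print(linear_search(data, 999_998))  # → True, after ~500,000 comparisons
print(binary_search(data, 999_998))  # → True, after ~20 comparisons
```

On a million-element list the linear version may walk the entire list, while the binary version finishes in about twenty steps, which is exactly the "most efficient manner" point above. Neither of these is machine learning; they are just algorithms.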
Machine learning algorithms need to be trained to learn how to classify and process information. The efficiency and accuracy of the algorithm depend on how well it was trained. Using an algorithm to calculate something does not automatically mean machine learning or AI is being used. All squares are rectangles, but not all rectangles are squares.
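The training point can be sketched in a few lines. Below is a toy 1-nearest-neighbour classifier (all data and names are invented for illustration): its "knowledge" is nothing more than the labelled examples it was trained on, so its accuracy is only as good as that training data.

```python
def predict(training_data, x):
    """Label x with the label of the closest training example.

    training_data is a list of (value, label) pairs; this is the entire
    "model" -- there is nothing learned beyond the examples themselves.
    """
    closest = min(training_data, key=lambda pair: abs(pair[0] - x))
    return closest[1]


# Training set: numbers labelled by a human as "small" or "large".
train = [(1, "small"), (2, "small"), (90, "large"), (100, "large")]

print(predict(train, 3))   # → small  (nearest example is 2)
print(predict(train, 95))  # → large  (nearest example is 90)
```

Feed it skewed or mislabelled training examples and its answers degrade accordingly, which is why how well an algorithm was trained matters as much as the algorithm itself.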
To read more go to: http://7wdata.be