Naive Bayes is a simple classification technique based on Bayes' theorem that assumes the predictors are conditionally independent given the class. It scales well to large datasets and is easy to build. Some key points:
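Concretely, that independence assumption lets the class-conditional likelihood factor into one term per predictor. A standard statement of the decision rule, using generic symbols $x_1, \dots, x_n$ for the predictors and $C_k$ for a class (the notation is illustrative, not taken from the original text):

$$
P(C_k \mid x_1, \dots, x_n) \;\propto\; P(C_k) \prod_{i=1}^{n} P(x_i \mid C_k),
\qquad
\hat{y} = \arg\max_{k} \; P(C_k) \prod_{i=1}^{n} P(x_i \mid C_k).
$$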
- It computes the posterior probability of each class from the class prior and the per-predictor likelihoods, then assigns the observation to the most probable class.
- It is commonly used for text classification tasks such as spam filtering because it is fast to train and accurate even on high-dimensional data (a short sketch follows this list).
- Variants include Gaussian Naive Bayes for continuous features, Multinomial Naive Bayes for count data such as word frequencies, and Bernoulli Naive Bayes for binary features.
- Limitations include the strong independence assumption, which rarely holds exactly in practice, and the small number of hyperparameters available for tuning, but it remains a popular first approach for classification problems.
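As a concrete illustration of the points above, here is a minimal sketch of a Multinomial Naive Bayes spam filter using scikit-learn. The tiny corpus, the labels, and the `alpha=1.0` smoothing value are assumptions made for the example, not details from the original text:

```python
# Minimal sketch: Multinomial Naive Bayes for text classification (toy data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical "spam vs. ham" messages, purely for illustration.
texts = [
    "win a free prize now", "limited offer click here",   # spam
    "meeting rescheduled to friday", "lunch tomorrow?",    # ham
]
labels = ["spam", "spam", "ham", "ham"]

# CountVectorizer turns each message into word counts;
# MultinomialNB models those counts per class with Laplace smoothing.
model = make_pipeline(CountVectorizer(), MultinomialNB(alpha=1.0))
model.fit(texts, labels)

print(model.predict(["free prize offer", "see you at the meeting"]))
# Expected on this toy data: ['spam' 'ham']
```

The pipeline mirrors the typical workflow: a count-based feature representation feeds the Multinomial variant, which is the one usually chosen for word-frequency data.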