Practical Machine Learning with R

You're reading from Practical Machine Learning with R: Define, build, and evaluate machine learning models for real-world applications

Product type: Paperback
Published: Aug 2019
Publisher: Packt
ISBN-13: 9781838550134
Length: 416 pages
Edition: 1st Edition
Authors (3): Jeyaraman, Olsen, Wambugu

Table of Contents (8)

About the Book
1. An Introduction to Machine Learning (Free Chapter)
2. Data Cleaning and Pre-processing
3. Feature Engineering
4. Introduction to neuralnet and Evaluation Methods
5. Linear and Logistic Regression Models
6. Unsupervised Learning
Appendix

Logistic Regression

In linear regression, we modeled continuous values, such as the price of a home. In (binomial) logistic regression, we apply a logistic sigmoid function to the output, resulting in a value between 0 and 1. This value can be interpreted as the probability that the observation belongs to class 1. By setting a cutoff/threshold (such as 0.5), we can use it as a classifier. This is the same approach we used with the neural networks in the previous chapter. The sigmoid function is σ(z) = 1 / (1 + e^(-z)), where z is the output from the linear regression:

Figure 5.21: A plot of the sigmoid function

Figure 5.21 shows the sigmoid function applied to the output z. The dashed line represents our cutoff of 0.5. If the predicted probability is above this line, the observation is predicted to be in class 1; otherwise, it's in class 0.
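
As a quick illustration (not taken from the book's own code), a minimal R sketch of the sigmoid and the 0.5 cutoff might look like this; the linear-predictor values in z are made up for the example:

```r
# Sigmoid (logistic) function: maps any real value z to the interval (0, 1)
sigmoid <- function(z) {
  1 / (1 + exp(-z))
}

# Example outputs from a linear model (illustrative values only)
z <- c(-2.5, -0.3, 0.1, 1.8)

# Probability that each observation belongs to class 1
p <- sigmoid(z)

# Apply the 0.5 cutoff to turn probabilities into class predictions
predicted_class <- ifelse(p > 0.5, 1, 0)

p               # approx. 0.076 0.426 0.525 0.858
predicted_class # 0 0 1 1
```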

For logistic regression, we use the generalized version of lm(), called glm(), which can be used for multiple types of regression. As we are performing binary...
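
For context, a hedged sketch of how glm() is typically called for binary logistic regression; it uses R's built-in mtcars data rather than the book's dataset, which is not shown in this excerpt:

```r
# family = binomial selects the logistic link for binary classification
model <- glm(am ~ hp + wt, data = mtcars, family = binomial)

# Predicted probabilities of class 1 (manual transmission in this example)
probs <- predict(model, type = "response")

# Apply the 0.5 cutoff to obtain class labels
preds <- ifelse(probs > 0.5, 1, 0)

summary(model)
table(predicted = preds, actual = mtcars$am)
```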
