I am having a lot of trouble understanding how the class_weight parameter in scikit-learn's Logistic Regression operates.
The Situation
I want to use logistic regression to do binary classification on a very unbalanced data set. The classes are labelled 0 (negative) and 1 (positive), and the observed data is in a ratio of about 19:1, with the majority of samples having a negative outcome.
First Attempt: Manually Preparing Training Data
I split the data I had into disjoint sets for training and testing (about 80/20). Then I randomly sampled the training data by hand to get training sets in different proportions than 19:1, ranging from 2:1 up to 16:1.
I then trained logistic regression on these different training data subsets and plotted recall (= TP/(TP+FN)) as a function of the different training proportions. Of course, the recall was computed on the disjoint TEST samples which had the observed proportions of 19:1. Note, although I trained the different models on different training data, I computed recall for all of them on the same (disjoint) test data.
The results were as expected: the recall was about 60% at 2:1 training proportions and fell off rather fast by the time it got to 16:1. There were several proportions 2:1 -> 6:1 where the recall was decently above 5%.
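Roughly, that first attempt looked like the sketch below; the synthetic data, the variable names, and the 4:1 ratio shown are illustrative stand-ins, not the actual data or code from the experiment:
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the ~19:1 imbalanced data set.
X, y = make_classification(n_samples=20000, weights=[0.95, 0.05], random_state=0)

# Disjoint 80/20 train/test split; the test set keeps the observed proportions.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

# Undersample negatives in the TRAINING data only, to a chosen n:1 ratio (here 4:1).
ratio = 4
pos_idx = np.flatnonzero(y_tr == 1)
neg_idx = np.flatnonzero(y_tr == 0)
rng = np.random.default_rng(0)
neg_keep = rng.choice(neg_idx, size=ratio * len(pos_idx), replace=False)
keep = np.concatenate([pos_idx, neg_keep])

clf = LogisticRegression(max_iter=1000).fit(X_tr[keep], y_tr[keep])

# Recall is always computed on the untouched test set (still ~19:1).
print(recall_score(y_te, clf.predict(X_te)))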
Second Attempt: Grid Search
Next, I wanted to test different regularization parameters, so I used GridSearchCV and made a grid of several values of the C parameter as well as the class_weight parameter. To translate my n:m proportions of negative:positive training samples into the dictionary language of class_weight, I thought I would just specify several dictionaries as follows:
{ 0: 0.67, 1: 0.33 }  # expected 2:1
{ 0: 0.75, 1: 0.25 }  # expected 3:1
{ 0: 0.8, 1: 0.2 }    # expected 4:1
and I also included None and 'auto'.
This time the results were totally whacked. All my recalls came out tiny (< 0.05) for every value of class_weight except 'auto', so I can only assume that my understanding of how to set the class_weight dictionary is wrong. Interestingly, the recall for the 'auto' value in the grid search was around 59% for all values of C, and I guessed it balances the classes to 1:1?
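For reference, a minimal sketch of the grid described above; the C values are placeholders, and recent scikit-learn releases spell the automatic option 'balanced' rather than 'auto':
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

param_grid = {
    "C": [0.01, 0.1, 1.0, 10.0],
    "class_weight": [
        {0: 0.67, 1: 0.33},  # intended 2:1
        {0: 0.75, 1: 0.25},  # intended 3:1
        {0: 0.8, 1: 0.2},    # intended 4:1
        None,
        "balanced",          # called 'auto' in older scikit-learn versions
    ],
}

# scoring="recall" is computed on each held-out CV fold, which keeps its
# natural class proportions; class_weight only changes how the model is fit.
grid = GridSearchCV(LogisticRegression(max_iter=1000), param_grid,
                    scoring="recall", cv=5)
# grid.fit(X_train, y_train)  # X_train/y_train are whatever training data you have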
My Questions
1. How do you properly use class_weight to achieve different balances in training data from what you actually give it? Specifically, what dictionary do I pass to class_weight to use n:m proportions of negative:positive training samples?
2. If you pass various class_weight dictionaries to GridSearchCV, during cross-validation will it rebalance the training fold data according to the dictionary but use the true given sample proportions for computing my scoring function on the test fold? This is critical, since any metric is only useful to me if it comes from data in the observed proportions.
3. What does the 'auto' value of class_weight do as far as proportions? I read the documentation, and I assume "balances the data inversely proportional to their frequency" just means it makes it 1:1. Is this correct? If not, can someone clarify?
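For what it's worth, the weights that the automatic option assigns can be inspected directly with scikit-learn's compute_class_weight helper; this sketch uses a made-up 19:1 label vector (current releases spell the option 'balanced', which weights each class by n_samples / (n_classes * class_count) rather than resampling the data to 1:1):
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

y = np.array([0] * 1900 + [1] * 100)  # stand-in labels at roughly 19:1
weights = compute_class_weight("balanced", classes=np.array([0, 1]), y=y)
print(dict(zip([0, 1], weights)))  # roughly {0: 0.53, 1: 10.0}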
Related Questions
How can I make a time delay in Python?
I would like to know how to put a time delay in a Python script.
Answer #1
import time
time.sleep(5) # Delays for 5 seconds. You can also use a float value.
Here is another example where something is run approximately once a minute:
import time
while True:
    print("This prints once a minute.")
    time.sleep(60)  # Delay for 1 minute (60 seconds).
Answer #2
You can use the sleep() function in the time module. It can take a float argument for sub-second resolution.
from time import sleep
sleep(0.1) # Time in seconds
How to delete a file or folder in Python?
How do I delete a file or folder in Python?
Answer #1
- os.remove() removes a file.
- os.rmdir() removes an empty directory.
- shutil.rmtree() deletes a directory and all its contents.
Path objects from the Python 3.4+ pathlib module also expose these instance methods:
- pathlib.Path.unlink() removes a file or symbolic link.
- pathlib.Path.rmdir() removes an empty directory.
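A small usage sketch combining the calls above; the paths are placeholders and must exist for the calls to succeed:
import os
import shutil
from pathlib import Path

os.remove("some_file.txt")          # delete a single file
os.rmdir("empty_dir")               # delete an empty directory
shutil.rmtree("some_dir")           # delete a directory and all its contents

Path("other_file.txt").unlink()     # delete a file or symbolic link
Path("other_empty_dir").rmdir()     # delete an empty directory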