sklearn.metrics.confusion_matrix

Compute confusion matrix to evaluate the accuracy of a classification.

By definition a confusion matrix C is such that C_{i,j} is equal to the number of observations known to be in group i and predicted to be in group j.

Thus in binary classification, the count of true negatives is C_{0,0}, false negatives is C_{1,0}, true positives is C_{1,1} and false positives is C_{0,1}.
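The definition above can be sketched directly in NumPy. The function name `confusion_matrix_naive` is hypothetical, used here only to illustrate the C[i, j] indexing convention; the real implementation in scikit-learn is more general and efficient.

```python
import numpy as np

def confusion_matrix_naive(y_true, y_pred, labels=None):
    # Illustration of the definition: C[i, j] counts samples known to be
    # in class labels[i] and predicted as class labels[j].
    if labels is None:
        labels = np.unique(np.concatenate([y_true, y_pred]))
    index = {lab: k for k, lab in enumerate(labels)}
    C = np.zeros((len(labels), len(labels)), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        C[index[t], index[p]] += 1
    return C

# Binary layout: C[0, 0]=TN, C[0, 1]=FP, C[1, 0]=FN, C[1, 1]=TP
cm = confusion_matrix_naive([0, 1, 0, 1], [1, 1, 1, 0])
```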

Read more in the User Guide.

Parameters:

y_true : array-like of shape (n_samples,)

Ground truth (correct) target values.

y_pred : array-like of shape (n_samples,)

Estimated targets as returned by a classifier.

labels : array-like of shape (n_classes), default=None

List of labels to index the matrix. This may be used to reorder or select a subset of labels. If None is given, those that appear at least once in y_true or y_pred are used in sorted order.

sample_weight : array-like of shape (n_samples,), default=None

Sample weights.

Added in version 0.18.
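With sample_weight, each sample contributes its weight to its cell rather than a count of 1. A minimal sketch:

```python
from sklearn.metrics import confusion_matrix

y_true = [0, 1, 0, 1]
y_pred = [1, 1, 1, 0]
# Both true-0 samples (weights 1 and 3) are misclassified as 1,
# so the false-positive cell C[0, 1] accumulates 1 + 3 = 4.
cm = confusion_matrix(y_true, y_pred, sample_weight=[1, 2, 3, 4])
```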

normalize : {'true', 'pred', 'all'}, default=None

Normalizes confusion matrix over the true (rows), predicted (columns) conditions or all the population. If None, confusion matrix will not be normalized.
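For example, normalize='true' divides each row by that class's number of true samples, so every row sums to 1 and the diagonal holds per-class recall:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = [2, 0, 2, 2, 0, 1]
y_pred = [0, 0, 2, 2, 0, 2]
# Row-normalized: entry (i, j) is the fraction of class-i samples
# that were predicted as class j.
cm = confusion_matrix(y_true, y_pred, normalize="true")
```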

Returns:

C : ndarray of shape (n_classes, n_classes)

Confusion matrix whose i-th row and j-th column entry indicates the number of samples with true label being i-th class and predicted label being j-th class.

See also

ConfusionMatrixDisplay.from_estimator

Plot the confusion matrix given an estimator, the data, and the label.

ConfusionMatrixDisplay.from_predictions

Plot the confusion matrix given the true and predicted labels.

ConfusionMatrixDisplay

Confusion Matrix visualization.

References

[1]

Wikipedia entry for the Confusion matrix (Wikipedia and other references may use a different convention for axes).

Examples

>>> from sklearn.metrics import confusion_matrix
>>> y_true = [2, 0, 2, 2, 0, 1]
>>> y_pred = [0, 0, 2, 2, 0, 2]
>>> confusion_matrix(y_true, y_pred)
array([[2, 0, 0],
       [0, 0, 1],
       [1, 0, 2]])
>>> y_true = ["cat", "ant", "cat", "cat", "ant", "bird"]
>>> y_pred = ["ant", "ant", "cat", "cat", "ant", "cat"]
>>> confusion_matrix(y_true, y_pred, labels=["ant", "bird", "cat"])
array([[2, 0, 0],
       [0, 0, 1],
       [1, 0, 2]])

In the binary case, we can extract true positives, etc. as follows:

>>> tn, fp, fn, tp = confusion_matrix([0, 1, 0, 1], [1, 1, 1, 0]).ravel()
>>> (tn, fp, fn, tp)
(np.int64(0), np.int64(2), np.int64(1), np.int64(1))
