
Getting accuracy from true and predicted values

  •  0
  • Neabfi  · Tech Community  · 6 years ago

    I have predicted_y and real_y .

    Is there a faster way to get the accuracy than the following:

    import keras
    from keras import backend as K
    
    accuracy_array = K.eval(keras.metrics.categorical_accuracy(real_y, predicted_y))
    
    print(sum(accuracy_array) / len(accuracy_array))
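    As an aside, K.eval already returns a NumPy array, so the final averaging step can be written as np.mean. A minimal sketch, using a hypothetical per-sample score array in place of the Keras result:

```python
import numpy as np

# K.eval(keras.metrics.categorical_accuracy(...)) yields a NumPy array of
# per-sample 0/1 scores; a hypothetical example array is used here:
accuracy_array = np.array([1.0, 0.0, 1.0, 1.0])

# np.mean is equivalent to sum(accuracy_array) / len(accuracy_array)
accuracy = np.mean(accuracy_array)
print(accuracy)  # 0.75
```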
    
    4 Answers  |  6 years ago
        1
  •  1
  •   Ioannis Nasios    6 years ago

    As I mentioned in the comments, I suggest using scikit-learn for your purpose.

    Example 1:

    from sklearn import metrics
    
    results = metrics.accuracy_score(real_y, predicted_y)
    

    You can also get a classification report, including precision , recall , and f1-scores .

    Example 2:

    from sklearn.metrics import classification_report
    
    y_true = [0, 1, 2, 2, 2]
    y_pred = [0, 0, 2, 2, 1]
    target_names = ['class 0', 'class 1', 'class 2']
    print(classification_report(y_true, y_pred, target_names=target_names))
    
                    precision    recall  f1-score   support
    
        class 0       0.50      1.00      0.67         1
        class 1       0.00      0.00      0.00         1
        class 2       1.00      0.67      0.80         3
    
    avg / total       0.70      0.60      0.61         5
    

    Finally, for the confusion matrix, use:

    Example 3:

    from sklearn.metrics import confusion_matrix
    
    y_true = [0, 1, 2, 2, 2]
    y_pred = [0, 0, 2, 2, 1]
    
    confusion_matrix(y_true, y_pred)
    
    array([[1, 0, 0],
           [1, 0, 0],
           [0, 1, 2]])
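    To relate this back to accuracy: rows of the matrix are true classes and columns are predictions, so the diagonal holds the correct predictions. A small sketch using the matrix above:

```python
import numpy as np

# Confusion matrix from the example: rows = true class, columns = predicted class
cm = np.array([[1, 0, 0],
               [1, 0, 0],
               [0, 1, 2]])

# Overall accuracy = correct predictions (the diagonal) over all samples
accuracy = np.trace(cm) / cm.sum()
print(accuracy)  # 3 correct out of 5 -> 0.6
```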
    
        2
  •  2
  •   Saurabh Agrawal    6 years ago

    Try accuracy_score from scikit-learn .

    import numpy as np
    from sklearn.metrics import accuracy_score
    y_pred = [0, 2, 1, 3]
    y_true = [0, 1, 2, 3]
    accuracy_score(y_true, y_pred)                    # 0.5 (fraction correct)
    
    accuracy_score(y_true, y_pred, normalize=False)   # 2 (count of correct predictions)
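    For intuition: with the default normalize=True, accuracy_score is the fraction of matching labels, and with normalize=False it is the raw count of matches. A sketch of the same computation in plain NumPy:

```python
import numpy as np

y_pred = np.array([0, 2, 1, 3])
y_true = np.array([0, 1, 2, 3])

matches = y_true == y_pred         # per-sample boolean hits
fraction_correct = matches.mean()  # like accuracy_score(..., normalize=True)
count_correct = matches.sum()      # like accuracy_score(..., normalize=False)
print(fraction_correct, count_correct)  # 0.5 2
```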
    
        3
  •  0
  •   Neabfi    6 years ago

    Thanks to seralouk, I found:

    from sklearn import metrics
    metrics.accuracy_score(real_y.argmax(axis=1), predicted_y.argmax(axis=1))
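    The argmax(axis=1) calls are needed here because real_y and predicted_y are assumed to be one-hot / probability matrices; argmax collapses each row to a single class index before comparison. A self-contained sketch with hypothetical data:

```python
import numpy as np

# Hypothetical one-hot ground truth and softmax-style predictions
real_y = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1]])
predicted_y = np.array([[0.8, 0.1, 0.1],
                        [0.3, 0.6, 0.1],
                        [0.2, 0.5, 0.3]])

true_labels = real_y.argmax(axis=1)       # [0, 1, 2]
pred_labels = predicted_y.argmax(axis=1)  # [0, 1, 1]
accuracy = (true_labels == pred_labels).mean()
print(accuracy)  # 2 of 3 rows agree
```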
    
        4
  •  0
  •   sepandhaghighi    6 years ago

    I wrote a Python library for confusion matrix analysis; you can use it for your purpose.

        >>> from pycm import *
        >>> y_actu = [2, 0, 2, 2, 0, 1, 1, 2, 2, 0, 1, 2] # or y_actu = numpy.array([2, 0, 2, 2, 0, 1, 1, 2, 2, 0, 1, 2])
        >>> y_pred = [0, 0, 2, 1, 0, 2, 1, 0, 2, 0, 2, 2] # or y_pred = numpy.array([0, 0, 2, 1, 0, 2, 1, 0, 2, 0, 2, 2])
        >>> cm = ConfusionMatrix(actual_vector=y_actu, predict_vector=y_pred) # Create CM From Data
        >>> cm.classes
        [0, 1, 2]
        >>> cm.table
        {0: {0: 3, 1: 0, 2: 0}, 1: {0: 0, 1: 1, 2: 2}, 2: {0: 2, 1: 1, 2: 3}}
        >>> print(cm)
        Predict          0        1        2        
        Actual
        0                3        0        0        
        1                0        1        2        
        2                2        1        3        
    
    
    
    
        Overall Statistics : 
    
        95% CI                                                           (0.30439,0.86228)
        Bennett_S                                                        0.375
        Chi-Squared                                                      6.6
        Chi-Squared DF                                                   4
        Conditional Entropy                                              0.95915
        Cramer_V                                                         0.5244
        Cross Entropy                                                    1.59352
        Gwet_AC1                                                         0.38931
        Joint Entropy                                                    2.45915
        KL Divergence                                                    0.09352
        Kappa                                                            0.35484
        Kappa 95% CI                                                     (-0.07708,0.78675)
        Kappa No Prevalence                                              0.16667
        Kappa Standard Error                                             0.22036
        Kappa Unbiased                                                   0.34426
        Lambda A                                                         0.16667
        Lambda B                                                         0.42857
        Mutual Information                                               0.52421
        Overall_ACC                                                      0.58333
        Overall_RACC                                                     0.35417
        Overall_RACCU                                                    0.36458
        PPV_Macro                                                        0.56667
        PPV_Micro                                                        0.58333
        Phi-Squared                                                      0.55
        Reference Entropy                                                1.5
        Response Entropy                                                 1.48336
        Scott_PI                                                         0.34426
        Standard Error                                                   0.14232
        Strength_Of_Agreement(Altman)                                    Fair
        Strength_Of_Agreement(Cicchetti)                                 Poor
        Strength_Of_Agreement(Fleiss)                                    Poor
        Strength_Of_Agreement(Landis and Koch)                           Fair
        TPR_Macro                                                        0.61111
        TPR_Micro                                                        0.58333
    
        Class Statistics :
    
        Classes                                                          0                       1                       2                       
        ACC(Accuracy)                                                    0.83333                 0.75                    0.58333                 
        BM(Informedness or bookmaker informedness)                       0.77778                 0.22222                 0.16667                 
        DOR(Diagnostic odds ratio)                                       None                    4.0                     2.0                     
        ERR(Error rate)                                                  0.16667                 0.25                    0.41667                 
        F0.5(F0.5 score)                                                 0.65217                 0.45455                 0.57692                 
        F1(F1 score - harmonic mean of precision and sensitivity)        0.75                    0.4                     0.54545                 
        F2(F2 score)                                                     0.88235                 0.35714                 0.51724                 
        FDR(False discovery rate)                                        0.4                     0.5                     0.4                     
        FN(False negative/miss/type 2 error)                             0                       2                       3                       
        FNR(Miss rate or false negative rate)                            0.0                     0.66667                 0.5                     
        FOR(False omission rate)                                         0.0                     0.2                     0.42857                 
        FP(False positive/type 1 error/false alarm)                      2                       1                       2                       
        FPR(Fall-out or false positive rate)                             0.22222                 0.11111                 0.33333                 
        G(G-measure geometric mean of precision and sensitivity)         0.7746                  0.40825                 0.54772                 
        LR+(Positive likelihood ratio)                                   4.5                     3.0                     1.5                     
        LR-(Negative likelihood ratio)                                   0.0                     0.75                    0.75                    
        MCC(Matthews correlation coefficient)                            0.68313                 0.2582                  0.16903                 
        MK(Markedness)                                                   0.6                     0.3                     0.17143                 
        N(Condition negative)                                            9                       9                       6                       
        NPV(Negative predictive value)                                   1.0                     0.8                     0.57143                 
        P(Condition positive)                                            3                       3                       6                       
        POP(Population)                                                  12                      12                      12                      
        PPV(Precision or positive predictive value)                      0.6                     0.5                     0.6                     
        PRE(Prevalence)                                                  0.25                    0.25                    0.5                     
        RACC(Random accuracy)                                            0.10417                 0.04167                 0.20833                 
        RACCU(Random accuracy unbiased)                                  0.11111                 0.0434                  0.21007                 
        TN(True negative/correct rejection)                              7                       8                       4                       
        TNR(Specificity or true negative rate)                           0.77778                 0.88889                 0.66667                 
        TON(Test outcome negative)                                       7                       10                      7                       
        TOP(Test outcome positive)                                       5                       2                       5                       
        TP(True positive/hit)                                            3                       1                       3                       
        TPR(Sensitivity, recall, hit rate, or true positive rate)        1.0                     0.33333                 0.5  
    
        >>> cm.matrix()
        Predict          0        1        2        
        Actual
        0                3        0        0        
        1                0        1        2        
        2                2        1        3        
    
        >>> cm.normalized_matrix()
        Predict          0              1              2              
        Actual
        0                1.0            0.0            0.0            
        1                0.0            0.33333        0.66667        
        2                0.33333        0.16667        0.5            
    
    

    Link: PyCM