elliot.evaluation.metrics.accuracy.AUC package

Submodules

elliot.evaluation.metrics.accuracy.AUC.auc module

This is the implementation of the global AUC metric. It is computed system-wise, over the entire set of predictions rather than per user.

class elliot.evaluation.metrics.accuracy.AUC.auc.AUC(recommendations, config, params, eval_objects)[source]

Bases: elliot.evaluation.metrics.base_metric.BaseMetric

Area Under the Curve

This class represents the implementation of the global AUC recommendation metric. Passing ‘AUC’ to the metrics list will enable the computation of the metric.

For further details, please refer to the AUC definition.

Note

This metric does not compute the group-based AUC, which averages AUC scores across users. Nor is it limited to a cutoff k: the score is calculated over the entire set of predictions, regardless of which user they belong to.

\[\mathrm {AUC} = \frac{\sum\limits_{i=1}^M rank_{i} - \frac {{M} \times {(M+1)}}{2}} {{{M} \times {N}}}\]

\(M\) is the number of positive samples.

\(N\) is the number of negative samples.

\(rank_i\) is the ascending rank of the i-th positive sample.
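As an illustration of the formula above, the following standalone sketch computes the global AUC from two score lists. The function name and sample data are hypothetical, not part of the Elliot API; it assumes all scores are distinct:

```python
def global_auc(positive_scores, negative_scores):
    """Rank-based AUC: (sum of ascending ranks of positives - M(M+1)/2) / (M*N).

    Hypothetical helper for illustration, not Elliot code; assumes distinct scores.
    """
    m, n = len(positive_scores), len(negative_scores)
    ranked = sorted(positive_scores + negative_scores)  # ascending order
    # 1-based ascending rank of each positive sample
    rank_sum = sum(ranked.index(s) + 1 for s in positive_scores)
    return (rank_sum - m * (m + 1) / 2) / (m * n)

# Perfect separation (every positive outranks every negative) yields 1.0
print(global_auc([0.9, 0.8], [0.2, 0.1]))  # -> 1.0
```

With perfect separation the rank sum is maximal and the metric is 1.0; if every negative outranks every positive, the numerator vanishes and the metric is 0.0.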

To compute the metric, add it to the config file adopting the following pattern:

simple_metrics: [AUC]
eval()[source]

Evaluation function.

Returns: the overall value of AUC

static name()[source]

Metric name getter.

Returns: the public name of the metric

static needs_full_recommendations()[source]

elliot.evaluation.metrics.accuracy.AUC.gauc module

This is the implementation of the GroupAUC metric. It proceeds from a user-wise computation and averages the AUC values over the users.

class elliot.evaluation.metrics.accuracy.AUC.gauc.GAUC(recommendations, config, params, eval_objects)[source]

Bases: elliot.evaluation.metrics.base_metric.BaseMetric

Group Area Under the Curve

This class represents the implementation of the GroupAUC recommendation metric. Passing ‘GAUC’ to the metrics list will enable the computation of the metric.

“Deep Interest Network for Click-Through Rate Prediction” KDD ’18 by Zhou et al.

For further details, please refer to the paper

Note

It calculates the AUC score for each user and obtains GAUC by weighting the per-user AUC values. It is likewise not limited to a cutoff k. Because scores_tensor is padded with -np.inf in RankEvaluator, the padding values would influence the ranks of the original items; therefore a descending sort is used together with an identity transformation of the AUC formula, as shown in the auc_ function. For readability, the formula is not simplified in the code.

\[\mathrm {GAUC} = \frac {{{M} \times {(M+N+1)} - \frac{M \times (M+1)}{2}} - \sum\limits_{i=1}^M rank_{i}} {{M} \times {N}}\]

\(M\) is the number of positive samples.

\(N\) is the number of negative samples.

\(rank_i\) is the descending rank of the i-th positive sample.
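The descending-rank formula above is an identity transformation of the ascending-rank AUC (substituting the descending rank \(M+N+1-rank_i\) for the ascending rank \(rank_i\)). The sketch below illustrates the per-user computation and an unweighted average over users; all names are hypothetical and do not mirror Elliot's internal API:

```python
def user_auc_descending(positive_scores, negative_scores):
    """Per-user AUC via the descending-rank identity used by GAUC.

    Hypothetical helper for illustration, not Elliot code; assumes distinct scores.
    """
    m, n = len(positive_scores), len(negative_scores)
    ranked = sorted(positive_scores + negative_scores, reverse=True)  # descending
    # 1-based descending rank of each positive sample
    rank_sum = sum(ranked.index(s) + 1 for s in positive_scores)
    return (m * (m + n + 1) - m * (m + 1) / 2 - rank_sum) / (m * n)

def gauc(per_user_samples):
    """Average per-user AUC with uniform weights (illustrative simplification)."""
    values = [user_auc_descending(pos, neg)
              for pos, neg in per_user_samples.values()]
    return sum(values) / len(values)
```

With uniform weights this reduces to a plain mean of user-level AUCs; the GAUC described above instead weights each user's AUC, e.g. by that user's number of interactions.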

To compute the metric, add it to the config file adopting the following pattern:

simple_metrics: [GAUC]
eval()[source]

Evaluation function.

Returns: the overall averaged value of AUC

eval_user_metric()[source]

Evaluation function.

Returns: the value of AUC computed for each user

static name()[source]

Metric name getter.

Returns: the public name of the metric

static needs_full_recommendations()[source]

elliot.evaluation.metrics.accuracy.AUC.lauc module

This is the implementation of the Limited AUC metric. It proceeds from a user-wise computation and averages the values over the users.

class elliot.evaluation.metrics.accuracy.AUC.lauc.LAUC(recommendations, config, params, eval_objects)[source]

Bases: elliot.evaluation.metrics.base_metric.BaseMetric

Limited Area Under the Curve

This class represents the implementation of the Limited AUC recommendation metric. Passing ‘LAUC’ to the metrics list will enable the computation of the metric.

“Setting Goals and Choosing Metrics for Recommender System Evaluations” by Gunnar Schröder, et al.

For further details, please refer to the paper

To compute the metric, add it to the config file adopting the following pattern:

simple_metrics: [LAUC]
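All three metrics in this package can be requested together in a single metrics list. A minimal configuration fragment might look as follows (assuming the usual Elliot experiment layout; adapt the surrounding keys to your own config file):

```yaml
experiment:
  evaluation:
    simple_metrics: [AUC, GAUC, LAUC]
```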
eval_user_metric()[source]

Evaluation function.

Returns: the value of LAUC computed for each user

static name()[source]

Metric name getter.

Returns: the public name of the metric

Module contents