org.lenskit.eval.traintest

## Class SimpleEvaluator

• public class SimpleEvaluator
extends Object

Simplified Java API to train-test evaluation. The train-test evaluator is somewhat difficult to use directly from Java; this class is intended to make it easier.
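Since the methods below return the evaluator for chaining, a typical Java driver might look like the following sketch. The data-file path, the item-mean baseline configuration, and the RMSEPredictMetric metric class are illustrative assumptions, not part of this class's API:

```java
import java.nio.file.Paths;

import org.lenskit.LenskitConfiguration;
import org.lenskit.api.ItemScorer;
import org.lenskit.baseline.ItemMeanRatingItemScorer;
import org.lenskit.data.dao.file.StaticDataSource;
import org.lenskit.eval.traintest.SimpleEvaluator;
import org.lenskit.eval.traintest.predict.RMSEPredictMetric;
import org.lenskit.util.table.Table;

public class EvalExample {
    public static void main(String[] args) {
        // Configure an algorithm: here, a simple item-mean baseline scorer.
        LenskitConfiguration config = new LenskitConfiguration();
        config.bind(ItemScorer.class).to(ItemMeanRatingItemScorer.class);

        // Load a rating data source from a manifest file (path is illustrative).
        StaticDataSource source = StaticDataSource.load(Paths.get("data/ratings.yml"));

        SimpleEvaluator eval = new SimpleEvaluator();
        eval.setWorkDir(Paths.get("build/eval"))
            .addDataSet(source, 5, 0.2)        // 5-fold crossfold with a 20% holdout
            .addAlgorithm("ItemMean", config)
            .addMetric(new RMSEPredictMetric())
            .setOutput(Paths.get("eval-results.csv"));

        // Runs crossfolding, training, and testing; returns the aggregate results.
        Table results = eval.execute();
        System.out.println(results);
    }
}
```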

• ### Constructor Summary

Constructors
Constructor and Description
SimpleEvaluator()
Create a simple evaluator with a default configuration.
• ### Method Summary

All Methods
Modifier and Type Method and Description
SimpleEvaluator addAlgorithm(AlgorithmInstance algo)
Add an algorithm to the experiment being built.
SimpleEvaluator addAlgorithm(String name, LenskitConfiguration config)
Add an algorithm instance constructed from a name and LensKit configuration.
SimpleEvaluator addDataSet(Crossfolder cross)
Adds a crossfolder’s results to the experiment.
SimpleEvaluator addDataSet(DataSet data)
Add a data set to the experiment.
SimpleEvaluator addDataSet(StaticDataSource source, int partitions)
Add a new data set to be cross-folded.
SimpleEvaluator addDataSet(StaticDataSource source, int partitions, double holdout)
Add a new data set to be cross-folded.
SimpleEvaluator addDataSet(StaticDataSource train, StaticDataSource test)
Add a data set to the experiment.
SimpleEvaluator addDataSet(String name, StaticDataSource source, int partitions)
Add a new data set to be cross-folded.
SimpleEvaluator addDataSet(String name, StaticDataSource source, int partitions, double holdout)
Add a new data set to be cross-folded.
SimpleEvaluator addMetric(PredictMetric<?> metric)
Add a metric to the experiment.
Table execute()
Run the evaluation; if this is called more than once, it will run all of these commands again and will most likely throw an exception.
TrainTestExperiment getExperiment()
Get the underlying train-test experiment being configured.
Path getWorkDir()
Get the working directory for the evaluator.
SimpleEvaluator setOutput(Path file)
Set an output file for aggregate metrics.
SimpleEvaluator setUserOutput(Path file)
Set an output file for per-user evaluation metrics.
SimpleEvaluator setWorkDir(Path dir)
Set the working directory for the evaluator.
• ### Methods inherited from class java.lang.Object

clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
• ### Constructor Detail

• #### SimpleEvaluator

public SimpleEvaluator()

Create a simple evaluator with a default configuration.

• ### Method Detail

• #### getWorkDir

public Path getWorkDir()

Get the working directory for the evaluator.

Returns:
The directory in which the evaluator will save its working files.
• #### setWorkDir

public SimpleEvaluator setWorkDir(Path dir)

Set the working directory for the evaluator.

Parameters:
dir - The directory in which the evaluator will save its output and temporary files.
Returns:
The evaluator (for chaining).

• #### addAlgorithm

public SimpleEvaluator addAlgorithm(AlgorithmInstance algo)

Add an algorithm to the experiment being built.

If any exception is thrown while the algorithm is added, it is rethrown as a runtime exception.

Parameters:
algo - The algorithm to add to the experiment.
Returns:
The evaluator (for chaining).

public SimpleEvaluator addAlgorithm(String name,
LenskitConfiguration config)

Add an algorithm instance constructed from a name and LensKit configuration.

Parameters:
name - The name of the algorithm.
config - The LensKit configuration for the algorithm.
Returns:
The evaluator (for chaining).

• #### addDataSet

public SimpleEvaluator addDataSet(Crossfolder cross)

Adds a crossfolder’s results to the experiment.

Parameters:
cross - The crossfold task.
Returns:
The simple evaluator (for chaining).

public SimpleEvaluator addDataSet(String name,
StaticDataSource source,
int partitions,
double holdout)

Add a new data set to be cross-folded. This method creates a new Crossfolder and passes it to addDataSet(Crossfolder). All crossfold parameters that are not taken as arguments by this method are left at their defaults.

Parameters:
name - The name of the crossfold
source - The source for the crossfold
partitions - The number of partitions
holdout - The holdout fraction
Returns:
Itself for chaining.

public SimpleEvaluator addDataSet(StaticDataSource source,
int partitions,
double holdout)

Add a new data set to be cross-folded. This method creates a new Crossfolder and passes it to addDataSet(Crossfolder). All crossfold parameters that are not taken as arguments by this method are left at their defaults.

Parameters:
source - The source for the crossfold
partitions - The number of partitions
holdout - The holdout fraction
Returns:
Itself for chaining.

public SimpleEvaluator addDataSet(String name,
StaticDataSource source,
int partitions)

Add a new data set to be cross-folded. This method creates a new Crossfolder and passes it to addDataSet(Crossfolder). All crossfold parameters that are not taken as arguments by this method are left at their defaults.

Note: Prior to LensKit 2.2, this method used a holdout fraction of 0.2. In LensKit 2.2, it was changed to use the Crossfolder’s default holdout.

Parameters:
name - The name of the crossfold
source - The source for the crossfold
partitions - The number of partitions
Returns:
Itself for chaining.

public SimpleEvaluator addDataSet(StaticDataSource source,
int partitions)

Add a new data set to be cross-folded. This method creates a new Crossfolder and passes it to addDataSet(Crossfolder). All crossfold parameters that are not taken as arguments by this method are left at their defaults.

Note: Prior to LensKit 2.2, this method used a holdout fraction of 0.2. In LensKit 2.2, it was changed to use the Crossfolder’s default holdout.

Parameters:
source - The source for the crossfold
partitions - The number of partitions
Returns:
Itself for chaining.
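These convenience overloads create a Crossfolder internally and hand it to addDataSet(Crossfolder). When you need crossfold parameters the shorthand leaves at their defaults, you can build the Crossfolder yourself; the setter names shown here (setSource, setPartitionCount) and the data-file path are assumptions about the Crossfolder API and should be checked against its documentation:

```java
import java.nio.file.Paths;

import org.lenskit.data.dao.file.StaticDataSource;
import org.lenskit.eval.crossfold.Crossfolder;
import org.lenskit.eval.traintest.SimpleEvaluator;

public class CrossfoldExample {
    public static void main(String[] args) {
        StaticDataSource source = StaticDataSource.load(Paths.get("data/ratings.yml"));

        // Configure the crossfold manually instead of using the shorthand overloads.
        Crossfolder cf = new Crossfolder("ml-data");   // the name is illustrative
        cf.setSource(source);
        cf.setPartitionCount(5);
        // ...further Crossfolder configuration the shorthand would leave at defaults...

        SimpleEvaluator eval = new SimpleEvaluator();
        eval.addDataSet(cf);   // same effect as eval.addDataSet("ml-data", source, 5)
    }
}
```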

public SimpleEvaluator addDataSet(DataSet data)

Add a data set to the experiment.

Parameters:
data - The data set to be added to the command.
Returns:
The simple evaluator (for chaining)

public SimpleEvaluator addDataSet(StaticDataSource train,
StaticDataSource test)

Add a data set to the experiment.

The name for the data source will default to ‘generic-data-source’. Because of this, be careful when calling this method more than once: the resulting data sets will share a name.

Parameters:
train - The source of training data.
test - The source of test data.
Returns:
The evaluator (for chaining).

• #### addMetric

public SimpleEvaluator addMetric(PredictMetric<?> metric)

Add a metric to the experiment.

Parameters:
metric - The metric to be added.
Returns:
The evaluator (for chaining).
• #### setOutput

public SimpleEvaluator setOutput(Path file)

Set an output file for aggregate metrics.

Parameters:
file - An output file for aggregate metrics.
Returns:
The evaluator (for chaining).
• #### setUserOutput

public SimpleEvaluator setUserOutput(Path file)

Set an output file for per-user evaluation metrics.

Parameters:
file - A file to receive per-user evaluation metrics.
Returns:
The evaluator (for chaining).
• #### getExperiment

public TrainTestExperiment getExperiment()

Get the underlying train-test experiment being configured.

Returns:
The underlying experiment.

• #### execute

public Table execute()

Run the evaluation. If this is called more than once, it will run all of these commands again and will most likely throw an exception.

Returns:
The table of evaluation results.