An Experiment samples a test a number of times. It takes the result of each sample and records it in a map from result to count, incrementing the count for each distinct result. The actions to run are permuted on each iteration, to help remove any bias from which action is loaded behind the spingate first.
void Experiment::run(size_t count) {
    using Actions = decltype(std::declval<Test>().actions());
    auto getters = tupleutil::tuple_getters<Actions>();
    for (size_t i = 0; i < count; ++i) {
        Sample<Test> sample;
        sample.run(getters);
        resultMap_[sample.result_]++; // tally this distinct result
        // reorder the getters so the next sample loads the actions
        // behind the spingate in a different order
        std::next_permutation(getters.begin(), getters.end());
    }
}
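For context, the surrounding class looks roughly like this. This is a minimal sketch that assumes the member names used above; the map type and the rest of the interface are my guesses, not the actual class:

#include <cstddef>
#include <map>

// Sketch of the enclosing Experiment, assuming the members used in run().
template <class Test> class Experiment {
  public:
    void run(std::size_t count); // as shown above

    // Tally of each distinct result to the number of samples that produced it.
    // Assumes Test::Result is ordered; an unordered_map would need a hash.
    std::map<typename Test::Result, std::size_t> resultMap_;
};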
tupleutil::tuple_getters returns an array of getters, each of which returns a std::variant<Types...> instantiated over the same parameter pack as the tuple.
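The implementation isn't shown in this post, but the idea can be sketched like this, assuming the tuple elements are copyable. The names here are illustrative, not the actual tupleutil code:

#include <array>
#include <cstddef>
#include <tuple>
#include <utility>
#include <variant>

namespace tupleutil_sketch {
template <typename Tuple> struct getter_traits;

template <typename... Ts> struct getter_traits<std::tuple<Ts...>> {
    using Variant = std::variant<Ts...>;
    using Getter  = Variant (*)(std::tuple<Ts...> const&);

    template <std::size_t... Is>
    static auto make(std::index_sequence<Is...>) {
        // One captureless lambda per tuple element; each copies its element
        // into the variant at the matching alternative index.
        return std::array<Getter, sizeof...(Ts)>{
            +[](std::tuple<Ts...> const& t) -> Variant {
                return Variant{std::in_place_index<Is>, std::get<Is>(t)};
            }...};
    }
};

template <typename Tuple> auto tuple_getters() {
    return getter_traits<Tuple>::make(
        std::make_index_sequence<std::tuple_size_v<Tuple>>{});
}
} // namespace tupleutil_sketch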
Sample runs all of the actions in a batch that locks them behind a spingate, and collects the results for each action.
template <class Test> class Sample {
  public:
    Batch batch_;
    Test test_;
    typename Test::Result result_;

    template <typename V, size_t I>
    void run(std::array<V, I> const& getters) {
        auto const& actions = test_.actions();
        add(actions, getters); // queue each action into the batch
        batch_.run();          // run the whole batch at once
    }
};
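Before looking at add, it helps to spell out what Sample expects of a Test: a Result type and an actions() member returning a tuple of callables, some nullary and some taking the result. A purely hypothetical example, not one of the tests from this series:

#include <atomic>
#include <tuple>

// Hypothetical test, for illustration only: one action bumps a counter,
// another records the final value into the result.
struct CounterTest {
    using Result = int;

    std::atomic<int> counter{0};

    auto actions() {
        return std::make_tuple(
            [this] { counter.fetch_add(1); },          // invocable with no args
            [this](Result& r) { r = counter.load(); }  // needs the result
        );
    }
};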
add is a templated member function that loops over the array of getters, uses each getter to pull a function out of the tuple of actions, and visits it with a lambda that adds to the batch either the function by itself, if it is invocable with no arguments, or the function together with a reference to the result.
template <typename Tuple, typename Variant, size_t I>
void add(Tuple const& actions, std::array<Variant, I> const& getters) {
    auto adder = [this](auto&& f) {
        using F = std::remove_cv_t<std::remove_reference_t<decltype(f)>>;
        if constexpr (std::is_invocable_v<F>) {
            batch_.add(f);
        } else {
            batch_.add(f, std::ref(result_));
        }
    };
    for (auto&& get_n : getters) {
        std::visit(adder, get_n(actions));
    }
}
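Batch and the spingate come from earlier posts and aren't shown here. From the two call shapes above, Batch::add evidently takes a callable plus optional bound arguments and defers running it until the gate opens. A rough stand-in, with names and details that are mine rather than the real class:

#include <atomic>
#include <thread>
#include <vector>

// Stand-in for the real Batch: each added callable (with any bound
// arguments) runs on its own thread once the gate is opened by run().
class BatchSketch {
  public:
    template <typename Func, typename... Args>
    void add(Func func, Args... args) {
        threads_.emplace_back([this, func, args...]() mutable {
            while (!open_.load(std::memory_order_acquire)) {
                // spin until the gate opens
            }
            func(args...); // reference_wrapper arguments unwrap to references
        });
    }

    void run() {
        open_.store(true, std::memory_order_release);
        for (auto& t : threads_) t.join();
        threads_.clear();
    }

  private:
    std::atomic<bool> open_{false};
    std::vector<std::thread> threads_;
};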
I am a bit dissatisfied that the else case isn't another if constexpr followed by a static_assert in a final else, but getting the condition right didn't work the obvious way, so I punted. There will still be a compiler error if f(result_) can't actually be called by the batch, just one that surfaces from deeper inside the instantiation.
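For what it's worth, one way to recover the diagnostic, sketched here with a guessed condition (testing invocability with a Result&, which may not be exactly the right test), is a dependent-false helper so the static_assert only fires for the instantiation that reaches the final else:

// Dependent false: the assert only triggers if this branch is instantiated.
template <typename> inline constexpr bool always_false_v = false;

// Inside add, the adder lambda could then become:
auto adder = [this](auto&& f) {
    using F = std::remove_cv_t<std::remove_reference_t<decltype(f)>>;
    if constexpr (std::is_invocable_v<F>) {
        batch_.add(f);
    } else if constexpr (std::is_invocable_v<F, typename Test::Result&>) {
        batch_.add(f, std::ref(result_));
    } else {
        static_assert(always_false_v<F>,
                      "action must be callable with no arguments or with Result&");
    }
};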