Your task: extending the demo
At this point we assume that you have a basic understanding of the demo's internals. If not, we recommend reviewing the demo itself before continuing with this section.
Recall that in the Whatsthis function of the function tier, we made a bad assumption: we baked in the idea that the user would only ever query a model with a single tag, even though the code looks awfully close to supporting multiple tags:
if(parsed_args.tags.size() > 1) {
    /**
     * TODO: implement support for multiple tags.
     * Hint: the else branch of this conditional looks like it'd make a good body for a for-loop.
     * Why not loop over the parsed_args.tags array?
     **/
    reply->set_desc("Multiple tags support to be implemented.");
    return Status::OK;
} else {
    std::size_t tag_index = 0;
    // 2 - map the tag onto a shard of the categorizer tier and pick a target node in it.
    node_id_t target = shards[parsed_args.tags[tag_index] % shards.size()][0];
    // TODO: add randomness (e.g., choose a random shard member instead of member [0]) for load-balancing.
    debug_target_valid(categorizer_tier_handler, target);  // (just for debugging)
    // 3 - post it to the categorizer tier
    responses.emplace_back(
            categorizer_tier_handler.p2p_send<RPC_NAME(inference)>(
                    target, Photo{parsed_args.tags[tag_index],
                                  parsed_args.photo_data,
                                  parsed_args.photo_size}),
            target);
#ifndef NDEBUG
    std::cout << "p2p_send for inference returned." << std::endl;
    std::cout.flush();
#endif
}
Your task is to augment this code so that it can handle multiple tags. In particular, we suggest that you remove the then branch of the if block and convert the else branch into a for loop of some kind. Since Derecho’s RPC requests are asynchronous, it’s fine to issue many of them in a row before checking the results.
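To make that suggestion concrete, here is a minimal sketch of what the converted branch might look like. It simply wraps the body of the old else branch in a loop over parsed_args.tags and reuses the variables from the snippet above (parsed_args, shards, categorizer_tier_handler, responses); treat it as one possible shape for the solution rather than the definitive one.

for(std::size_t tag_index = 0; tag_index < parsed_args.tags.size(); ++tag_index) {
    // Map this tag onto a shard of the categorizer tier and pick a target node in it.
    node_id_t target = shards[parsed_args.tags[tag_index] % shards.size()][0];
    debug_target_valid(categorizer_tier_handler, target);  // (just for debugging)
    // Post the photo to the categorizer tier. p2p_send is asynchronous, so all of the
    // requests can be issued back-to-back before any of the results are collected.
    responses.emplace_back(
            categorizer_tier_handler.p2p_send<RPC_NAME(inference)>(
                    target, Photo{parsed_args.tags[tag_index],
                                  parsed_args.photo_data,
                                  parsed_args.photo_size}),
            target);
}

Each response is queued in responses together with its target node, so collecting the results afterwards should work the same way as in the single-tag case, assuming the collection code iterates over the whole responses list.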
You can test your newly enhanced inference functionality via the client:
$ ./sospdemo client 127.0.0.1:28000 inference 1,2,3,4 pet-model/pet-2.jpg
Happy hacking!