Abstract: The performance of big data workflows depends on both the workflow mapping scheme, which determines task assignment and container allocation in Hadoop, and the on-node scheduling policy, ...
This setup is designed to replicate the Spark benchmark used in the paper [Benchmarking Weak Memory Models](https://kar.kent.ac.uk/51638/). It runs Apache Spark ...
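Since the description of the actual benchmark workload is elided above, the following is only a minimal sketch of launching a Spark job of this kind. The object name `BenchmarkSketch`, the local-mode master URL, and the stand-in workload are illustrative assumptions, not the paper's benchmark; it assumes Spark (e.g. `spark-sql`) is on the classpath.

```scala
// Minimal, hypothetical sketch of a Spark job; the real benchmark workload
// from the paper is not reproduced here.
import org.apache.spark.sql.SparkSession

object BenchmarkSketch {
  def main(args: Array[String]): Unit = {
    // Local-mode session for illustration; a real benchmark run would
    // target a cluster master URL instead of local[*].
    val spark = SparkSession.builder()
      .appName("spark-benchmark-sketch")
      .master("local[*]")
      .getOrCreate()

    // Simple parallel aggregation as a stand-in workload.
    val n = 1000000L
    val sum = spark.sparkContext.range(0L, n).sum()
    println(s"sum of [0, $n) = $sum")

    spark.stop()
  }
}
```

Packaged with sbt or Maven, a job like this would typically be submitted to a cluster with `spark-submit` rather than run in local mode.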