Spark for the Impatient: It ain’t R2D2, Obi-Wan

Today’s installment is about programming with Spark’s most fundamental construct, the Resilient Distributed Dataset (RDD). But before we get into that, I’m going to do something I said at the outset I wouldn’t, namely write about installing Spark. I decided to discuss it briefly because so many folks get confused by the simplicity. Yeah, I know: […]