Apache Spark function in Python

Function example in Spark Python

Shows an example of the mapPartitions transformation in PySpark, which applies a function once per partition rather than once per element.

# my_func receives an iterator over one partition and yields that partition's sum
def my_func(iterator):
    yield sum(iterator)

nums = range(1, 10)                  # the numbers 1 through 9 (renamed from "list", which shadows the built-in)
parallel = sc.parallelize(nums, 5)   # distribute across 5 partitions
parallel.mapPartitions(my_func).collect()
  [1, 5, 9, 13, 17]
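To see why the result has five entries, it helps to know how Spark slices the input. The sketch below is a plain-Python approximation (no Spark required): the `split` helper is a hypothetical function that mimics Spark's default positional slicing for `parallelize`, and the list comprehension plays the role of `mapPartitions` by feeding each slice to `my_func` as an iterator.

```python
# Hypothetical helper mimicking how parallelize(nums, 5) slices the data;
# the index arithmetic is an assumption about Spark's default behavior.
def split(data, num_slices):
    n = len(data)
    return [data[i * n // num_slices:(i + 1) * n // num_slices]
            for i in range(num_slices)]

def my_func(iterator):
    yield sum(iterator)

nums = list(range(1, 10))
partitions = split(nums, 5)
print(partitions)   # [[1], [2, 3], [4, 5], [6, 7], [8, 9]]

# Apply my_func to each partition, as mapPartitions would, then flatten.
result = [x for part in partitions for x in my_func(iter(part))]
print(result)       # [1, 5, 9, 13, 17]
```

Each of the five sums comes from one partition, which is why the collected output has five elements instead of nine.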