Can you increase the Spark shell output character limit?
When running code in the Spark shell, it is convenient to have small outputs or a sample printed directly in the shell rather than writing the output to a file. By default, the shell truncates such output after a (fairly small) number of characters. Is there a way this character limit can be increased? I'm running Spark 1.2.
Thanks for reading.
What do you mean by "output"?
If you want to print n lines of an RDD, use take():
myrdd.take(n).foreach(println)
According to the Spark Programming Guide 1.2.0, this function will "return an array with the first n elements of the dataset. Note that this is currently not executed in parallel. Instead, the driver program computes all the elements."
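For context, here is a minimal sketch of how this looks in the spark-shell. The file path, RDD name, and sample fraction are illustrative assumptions, not part of the original answer:

// Build an RDD from a text file (path is an example only).
val myrdd = sc.textFile("hdfs:///tmp/sample.txt")

// Bring the first 5 elements back to the driver and print them line by line.
// take(n) collects at most n elements on the driver, so keep n small.
myrdd.take(5).foreach(println)

// Alternatively, print a few elements from a random ~1% sample of the RDD
// instead of always showing the first lines of the file.
myrdd.sample(withReplacement = false, fraction = 0.01).take(5).foreach(println)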