Send the result of a SQL statement to a for loop using PySpark?

We are trying to feed the result of a SQL statement into a for loop, but we do not know how to do it.
Can someone please suggest an approach?

from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext()
hc = HiveContext(sc)

mydata = hc.sql("select * from mydb.mytable")

# Expose the table's schema as a queryable temp table.
hc.sql("describe mydb.mytable").registerTempTable("schema_def")
temp_data = hc.sql("select * from schema_def")
mdata = hc.sql("select col_name from schema_def where data_type <> 'string'")
