Pyspark: Select all columns except particular columns

Question:

I have a large number of columns in a PySpark DataFrame, say 200. I want to select all the columns except, say, 3-4 of them. How do I select these columns without having to manually type the names of all the columns I want to keep?

Asked By: Tshilidzi Mudau


Answers:

In the end, I settled for the following:

  • Drop:

    df.drop('column_1', 'column_2', 'column_3')

  • Select:

    df.select([c for c in df.columns if c not in {'column_1', 'column_2', 'column_3'}])

Answered By: Tshilidzi Mudau
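The select approach above filters `df.columns` in plain Python before anything reaches Spark, so the filtering step can be sketched without a SparkSession (the column names here are hypothetical):

```python
# Hypothetical stand-in for df.columns on a wide DataFrame
all_columns = ["id", "name", "value", "column_1", "column_2", "column_3"]

# A set makes the exclusion check O(1) per column
to_drop = {"column_1", "column_2", "column_3"}

# Same list comprehension as in the answer; the result is what
# you would pass to df.select(...)
kept = [c for c in all_columns if c not in to_drop]
print(kept)  # → ['id', 'name', 'value']
```

Using a set for the excluded names matters mostly at this scale: with 200 columns the comprehension runs 200 membership checks, and set lookup keeps that cheap.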
    df.drop(*cols_to_drop)

Useful if the list of columns to drop is huge, or if the list can be derived programmatically.

Answered By: martand
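The `*` in this answer unpacks the list into separate positional arguments, which is how `DataFrame.drop` accepts multiple column names. The unpacking itself can be shown with a stand-in function rather than a real DataFrame:

```python
def drop(*cols):
    """Stand-in mimicking the *cols signature of DataFrame.drop."""
    return list(cols)

cols_to_drop = ["column_1", "column_2", "column_3"]

# drop(*cols_to_drop) is equivalent to
# drop("column_1", "column_2", "column_3")
result = drop(*cols_to_drop)
print(result)  # → ['column_1', 'column_2', 'column_3']
```

Without the `*`, the whole list would be passed as a single argument, which is not what `drop` expects when given column names.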