Remove row with null value from pandas data frame

Question:

I’m trying to remove a row from my data frame in which one of the columns has a value of null. Most of the help I can find relates to removing NaN values which hasn’t worked for me so far.

Here I’ve created the data frame:

# successfully created data frame
df1 = ut.get_data(symbols, dates)  # column heads are 'SPY', 'BBD'

# can't get rid of the row containing the null value in column 'BBD'
# tried each of these with the others commented out, but I always got an
# error, or sometimes a new column of boolean values,
# when I just want to drop the row
df1 = pd.notnull(df1['BBD'])  # supposed to drop rows with a null val, not working
df1 = df1.drop(2010-05-04, axis=0)
df1 = df1[df1.'BBD' != null]
df1 = df1.dropna(subset=['BBD'])
df1 = pd.notnull(df1.BBD)


# I know the date to drop but still wasn't able to drop the row
df1.drop([2015-10-30])
df1.drop(['2015-10-30'])
df1.drop([2015-10-30], axis=0)
df1.drop(['2015-10-30'], axis=0)


with pd.option_context('display.max_rows', None):
    print(df1)

Here is my output:

[screenshot of the printed DataFrame in the original post]

Can someone please tell me how I can drop this row, preferably both by identifying the row by the null value and how to drop by date?

I haven’t been working with pandas very long and I’ve been stuck on this for an hour. Any advice would be much appreciated.

Asked By: ryan pickles


Answers:

This should do the job:

df = df.dropna(how='any', axis=0)

It will drop every row (axis=0) that contains at least one NaN value (how='any').

EXAMPLE:

import numpy as np
import pandas as pd

# Recreate a random DataFrame with NaN values
df = pd.DataFrame(index=pd.date_range('2017-01-01', '2017-01-10', freq='1d'))
# Average speed in miles per hour
df['A'] = np.random.randint(low=198, high=205, size=len(df.index))
df['B'] = np.random.random(size=len(df.index)) * 2

# Create dummy NaN values in 2 cells
df.iloc[2, 1] = None
df.iloc[5, 0] = None

print(df)
                A         B
2017-01-01  203.0  1.175224
2017-01-02  199.0  1.338474
2017-01-03  198.0       NaN
2017-01-04  198.0  0.652318
2017-01-05  199.0  1.577577
2017-01-06    NaN  0.234882
2017-01-07  203.0  1.732908
2017-01-08  204.0  1.473146
2017-01-09  198.0  1.109261
2017-01-10  202.0  1.745309

# Delete the rows containing the dummy NaN values
df = df.dropna(how='any', axis=0)

print(df)

                A         B
2017-01-01  203.0  1.175224
2017-01-02  199.0  1.338474
2017-01-04  198.0  0.652318
2017-01-05  199.0  1.577577
2017-01-07  203.0  1.732908
2017-01-08  204.0  1.473146
2017-01-09  198.0  1.109261
2017-01-10  202.0  1.745309

See the documentation for DataFrame.dropna for further detail.

If everything is OK with your DataFrame, dropping NaNs should be as easy as that. If this is still not working, make sure you have the proper datatypes defined for your column (pd.to_numeric comes to mind…)
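
For instance, if the column came in as strings, the coercion step might look like this (a minimal sketch; only the 'BBD' column name comes from the question, the sample values are made up):

import pandas as pd

# Hypothetical frame where 'BBD' was read in as strings,
# so dropna() finds nothing to drop
df1 = pd.DataFrame({'SPY': [101.2, 102.5, 99.8],
                    'BBD': ['14.1', 'null', '13.7']})

# errors='coerce' turns anything that can't be parsed as a number into NaN
df1['BBD'] = pd.to_numeric(df1['BBD'], errors='coerce')

# now the usual dropna works as expected
df1 = df1.dropna(subset=['BBD'])
print(df1)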

Answered By: Marjan Moderc

It appears that the value in your column is the literal string "null" and not a true NaN, which is what dropna is meant for. So I would try:

df[df.BBD != 'null']

or, if the value is actually a NaN:

df[pd.notnull(df.BBD)]
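
If the string marker appears in more than one column, one way to normalize everything first is replace (a sketch; the 'null' marker is taken from this answer, the frame itself is illustrative):

import numpy as np
import pandas as pd

# illustrative frame where missing values arrived as the literal string 'null'
df = pd.DataFrame({'SPY': ['201.9', 'null', '202.3'],
                   'BBD': ['14.1', '13.9', 'null']})

# swap the string marker for a real NaN, then drop as usual
df = df.replace('null', np.nan).dropna()
print(df)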

Answered By: joe-cormier

---- To clear nulls across all columns ----

df = df.dropna(how='any', axis=0)

---- If you want to clear nulls based on one column ----

df[~df['B'].isnull()]

                A         B
2017-01-01  203.0  1.175224
2017-01-02  199.0  1.338474
2017-01-04  198.0  0.652318
2017-01-05  199.0  1.577577
2017-01-06    NaN  0.234882
2017-01-07  203.0  1.732908
2017-01-08  204.0  1.473146
2017-01-09  198.0  1.109261
2017-01-10  202.0  1.745309

The 2017-01-03 row (NaN in column B) is cleaned out, while the 2017-01-06 row (NaN in column A) is kept, because only column B is checked.
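
The single-column filter above can also be written with dropna itself (a small sketch rebuilding a slice of the example frame, so the two spellings can be compared):

import numpy as np
import pandas as pd

# a slice of the example frame above
df = pd.DataFrame({'A': [203.0, 198.0, np.nan],
                   'B': [1.175224, np.nan, 0.234882]},
                  index=pd.to_datetime(['2017-01-01', '2017-01-03', '2017-01-06']))

# equivalent to df[~df['B'].isnull()]: drop only the rows where 'B' is NaN
print(df.dropna(subset=['B']))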

Please forgive any mistakes.

Answered By: Narets

To remove all rows with null values, the dropna() method will be helpful:

df.dropna(inplace=True)

To remove rows which contain a null value in a particular column, use this code:

df.dropna(subset=['column_name_to_remove'], inplace=True)

I recommend giving one of these two lines a try:

# keep only the rows where 'BBD' is not null
df_clean = df1[df1['BBD'].isnull() == False]
df_clean = df1[df1['BBD'].isna() == False]
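
A more idiomatic spelling of the same mask, for what it's worth (a minimal sketch reusing the question's df1 and its 'BBD' column):

# notna() gives the boolean mask directly, no comparison against False needed
df_clean = df1[df1['BBD'].notna()]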

Answered By: Danny