pandas - AttributeError for df.apply() when trying to subtract the column mean and divide by the column standard deviation for each column in a dataframe

I have a DataFrame with roughly 26 columns. For 14 of these columns (all floats) I want to determine the mean and standard deviation of each column, then standardize every value by subtracting the mean of the column it belongs to and dividing by that column's standard deviation.

I can do this separately for each column like so:

import numpy as np

chla_array = df['Chla'].to_numpy()
mean_chla = np.nanmean(chla_array)  # column mean, ignoring NaNs
std_chla = np.nanstd(chla_array)    # column standard deviation, ignoring NaNs
df['Chla_standardized'] = (df['Chla'] - mean_chla) / std_chla

Because I have 14 columns to do this for, I am looking for a more concise way of coding this rather than copying and pasting the above code thirteen more times and changing the column headers. I was thinking of using df.apply(), but I can't get it to work. Here is what I have:

df = df.iloc[:, [11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24]]
df_standardized = df.apply((df - df.mean(skipna=True)) / df.std(skipna=True, ddof=0))

The error I encounter is this:

AttributeError: 'Canyon_dist' is not a valid function for 'Series' object

Where 'Canyon_dist' is the header for the first column the code encounters.

I'm not sure that df.apply is appropriate for what I am trying to achieve, so if there is a more appropriate way of doing this please let me know (perhaps using a for loop?).
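
For reference, this is roughly the kind of loop I had in mind (an untested sketch; it assumes the same 14 columns selected by the iloc slice above, and ignores NaNs the same way nanmean/nanstd do in my single-column version):

import numpy as np

# loop over the 14 selected columns and standardize each one in turn
for col in df.columns[11:25]:
    arr = df[col].to_numpy()
    df[col + '_standardized'] = (df[col] - np.nanmean(arr)) / np.nanstd(arr)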

I am open to all suggestions and thank you.



1 Answer

Awaiting an expert's reply.
