withColumn {SparkR}                                        R Documentation
Description

Return a new SparkDataFrame by adding a column or replacing the existing column that has the same name.
Usage

withColumn(x, colName, col)

## S4 method for signature 'SparkDataFrame,character'
withColumn(x, colName, col)
Arguments

x        a SparkDataFrame.

colName  a column name.

col      a Column expression, or an atomic vector of length 1 used as a literal value.
Value

A SparkDataFrame with the new column added or the existing column replaced.
Note

withColumn since 1.4.0
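The sketch below is illustrative only and is not part of the official examples; it assumes a running Spark session and a SparkDataFrame built from R's built-in faithful data set, and the column names eruptions_sec and source are placeholders chosen for this illustration. It shows the two accepted forms of col and how reusing an existing column name replaces that column.

## Not run: 
sparkR.session()
df <- createDataFrame(faithful)                  # columns: eruptions, waiting
# Column expression: derive a new column from an existing one
df2 <- withColumn(df, "eruptions_sec", df$eruptions * 60)
# Atomic vector of length 1: the value becomes a literal on every row
df3 <- withColumn(df2, "source", "faithful")
# Reusing an existing column name replaces that column
df4 <- withColumn(df3, "eruptions_sec", cast(df3$eruptions_sec, "integer"))
## End(Not run)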
See Also

Other SparkDataFrame functions: SparkDataFrame-class, agg, arrange, as.data.frame,
attach,SparkDataFrame-method, cache, checkpoint, coalesce, collect, colnames, coltypes,
createOrReplaceTempView, crossJoin, dapplyCollect, dapply, describe, dim, distinct,
dropDuplicates, dropna, drop, dtypes, except, explain, filter, first, gapplyCollect,
gapply, getNumPartitions, group_by, head, hint, histogram, insertInto, intersect,
isLocal, isStreaming, join, limit, merge, mutate, ncol, nrow, persist, printSchema,
randomSplit, rbind, registerTempTable, rename, repartition, sample, saveAsTable,
schema, selectExpr, select, showDF, show, storageLevel, str, subset, take, toJSON,
union, unpersist, with, write.df, write.jdbc, write.json, write.orc, write.parquet,
write.stream, write.text
Examples

## Not run: 
sparkR.session()
path <- "path/to/file.json"
df <- read.json(path)
newDF <- withColumn(df, "newCol", df$col1 * 5)
# Replace an existing column
newDF2 <- withColumn(newDF, "newCol", newDF$col1)
newDF3 <- withColumn(newDF, "newCol", 42)
# Use extract operator to set an existing or new column
df[["age"]] <- 23
df[[2]] <- df$col1
df[[2]] <- NULL # drop column
## End(Not run)