pyspark.sql.functions.hours
Partition transform function: A transform for timestamps to partition data into hours.
New in version 3.1.0.
Changed in version 3.4.0: Supports Spark Connect.
Parameters
col : Column
    target date or timestamp column to work on.
Returns
Column
    data partitioned by hours.
Notes
This function can be used only in combination with the partitionedBy() method of DataFrameWriterV2.
Examples
>>> df.writeTo("catalog.db.table").partitionedBy(
...     hours("ts")
... ).createOrReplace()
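The transform assigns each row to a bucket covering one hour of its timestamp; rows whose timestamps fall in the same hour land in the same partition. A minimal sketch of that bucketing idea in plain Python (illustrative only — Spark computes the transform internally, and this is not the pyspark API):

```python
from datetime import datetime, timezone

def hour_bucket(ts: datetime) -> int:
    """Whole hours since the Unix epoch -- the per-row bucketing
    idea behind the hours() partition transform (illustrative)."""
    utc = ts.replace(tzinfo=timezone.utc)
    return int(utc.timestamp() // 3600)

# hypothetical timestamps for a "ts" column
rows = [
    datetime(2023, 1, 1, 0, 15),
    datetime(2023, 1, 1, 0, 45),
    datetime(2023, 1, 1, 1, 5),
]
buckets = [hour_bucket(ts) for ts in rows]
# the first two rows share an hour bucket; the third falls in the next one
```

Because partitioning is derived from the timestamp column itself, readers filtering on `ts` can benefit from partition pruning without a separate hour column in the data.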