Hi @Lotus88
Thanks for reaching out and for sharing the details of your notebook setup.
You have correctly set the session time zone using:

```python
spark.conf.set("spark.sql.session.timeZone", "Asia/Singapore")
```
However, `current_timestamp()` can still come through as UTC regardless of this setting, particularly in PySpark contexts: the setting mainly affects how Spark SQL renders timestamps in query output, not the values the PySpark API functions carry internally.
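To make the distinction concrete: Spark stores `TimestampType` values as an absolute instant (UTC-normalized internally), and a time zone only changes how that instant is rendered as wall-clock text. A minimal plain-Python sketch of the same idea, using the standard `zoneinfo` module rather than Spark itself:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# One absolute instant, stored in UTC.
instant = datetime(2024, 1, 1, 16, 0, tzinfo=timezone.utc)

# The same moment rendered in two zones: identical instant,
# different wall-clock text (Asia/Singapore is UTC+8).
as_utc = instant.isoformat()
as_sgt = instant.astimezone(ZoneInfo("Asia/Singapore")).isoformat()

print(as_utc)  # 2024-01-01T16:00:00+00:00
print(as_sgt)  # 2024-01-02T00:00:00+08:00
```

If your destination reads the raw stored value rather than a rendered string, it sees the UTC side of this picture, which is why the explicit conversion below is needed.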
To ensure the timestamp reflects the correct local time (UTC+8), you can explicitly convert it using:
```python
from pyspark.sql.functions import current_timestamp, from_utc_timestamp

result_df = select_df.withColumn(
    "sys_update_dt", from_utc_timestamp(current_timestamp(), "Asia/Singapore")
)
```
This ensures that the timestamp column reflects UTC+8 accurately when written to your destination.
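For reference, the semantics of `from_utc_timestamp` can be sketched in plain Python: it treats a time-zone-naive timestamp as UTC, shifts it to the target zone, and returns the naive local wall-clock time. The helper name below mirrors the Spark function but is just an illustration, not Spark's implementation:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def from_utc_timestamp(ts_naive: datetime, tz_name: str) -> datetime:
    # Interpret the naive timestamp as UTC, convert it to the target
    # zone, then drop the tzinfo again -- mirroring the shape of
    # Spark's from_utc_timestamp result.
    return (
        ts_naive.replace(tzinfo=timezone.utc)
        .astimezone(ZoneInfo(tz_name))
        .replace(tzinfo=None)
    )

# 16:00 UTC shifted by +8 hours rolls over to midnight the next day.
local = from_utc_timestamp(datetime(2024, 1, 1, 16, 0, 0), "Asia/Singapore")
print(local)  # 2024-01-02 00:00:00
```

Note one caveat with this approach: if the destination column is itself time-zone-aware, applying the shift on top of a later conversion can double-offset the value, so it is worth checking how the destination interprets the written timestamp.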
Let me know if you would like help validating this with your specific destination system. I’ve also attached screenshots showing the behavior and the fix in action for reference.
Hope this helps! If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". If you have any further questions, do let us know.