Hello,
I want to create a Spark session and use the GCS connector via this Hadoop configuration:
spark = SparkSession.builder \
    .appName('test') \
    .config("spark.hadoop.google.cloud.auth.service.account.json.keyfile", path_to_credentials) \
    .config("spark.hadoop.fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem") \
    .config("spark.hadoop.fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS") \
    .config("spark.jars", "/Users/og/Downloads/gcs/gcs-connector-hadoop2-2.2.15-shaded.jar") \
    .getOrCreate()
Is there a way to use a Prefect Google Cloud credentials block to supply the path to my credentials, or is there another approach?
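In case it helps, here's the kind of thing I'm imagining: load the credentials from a Prefect `GcpCredentials` block and write them to a temporary keyfile that the Spark config can point at. The block name "gcs-creds" and the helper function below are hypothetical, and I'm assuming the block stores the service-account info as a dict:

```python
import json
import tempfile


def credentials_dict_to_keyfile(info: dict) -> str:
    """Write a service-account dict to a temp JSON file and return its path."""
    f = tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False)
    json.dump(info, f)
    f.close()
    return f.name


# Sketch of how this might plug into the Spark config above
# (assumes prefect_gcp is installed and a GcpCredentials block named
# "gcs-creds" exists; service_account_info is a SecretDict on that block):
#
#   from prefect_gcp import GcpCredentials
#
#   creds = GcpCredentials.load("gcs-creds")
#   info = creds.service_account_info.get_secret_value()
#   path_to_credentials = credentials_dict_to_keyfile(info)
#
# and then pass path_to_credentials to
# .config("spark.hadoop.google.cloud.auth.service.account.json.keyfile", ...)
```

Not sure if that's the intended pattern, or whether the block can hand back a file path directly.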
Any advice is appreciated! Thank you.