One day you may face this error:
AccessControlException: Permission denied: user=root, access=WRITE, inode="/user"
The full exception text is:
org.apache.spark.SparkContext - Error initializing SparkContext.
org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
First, check that the user your Spark job runs as has a home folder under /user, for example /user/John_Doe.
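You can check for it, and create it if it is missing, like this (a sketch, assuming a stock HDFS install where the superuser is hdfs; John_Doe is a placeholder for your actual user, and the group name may differ on your cluster):

hdfs dfs -ls /user
sudo -u hdfs hdfs dfs -mkdir -p /user/John_Doe
sudo -u hdfs hdfs dfs -chown John_Doe:John_Doe /user/John_Doe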
If that folder already exists, try granting write access to the /user directory.
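One way is to open up /user the same way /tmp is usually configured (a sketch, assuming you can act as the hdfs superuser; the sticky bit lets everyone create entries while preventing users from removing each other's directories):

sudo -u hdfs hdfs dfs -chmod 1777 /user

Whether that is acceptable depends on your security policy.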
If you cannot grant access to that directory, or that is undesirable in your environment, you can point Spark at different directories instead:
- Add this to your run command:
--conf spark.yarn.stagingDir=hdfs://yourcluster/another/hdfs/path
- After creating the JavaSparkContext, set:
javaSparkContext.setCheckpointDir("/tmp")
or a similar path (see the sketch after this list).
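Putting the two alternatives together in Java (a minimal sketch; yourcluster and the paths are the same placeholders as above, and spark.yarn.stagingDir only applies when running on YARN):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class StagingDirExample {
    public static void main(String[] args) {
        // Redirect the YARN staging directory away from the default
        // location under the submitting user's HDFS home folder.
        SparkConf conf = new SparkConf()
                .setAppName("staging-dir-example")
                .set("spark.yarn.stagingDir", "hdfs://yourcluster/another/hdfs/path");

        JavaSparkContext javaSparkContext = new JavaSparkContext(conf);

        // Checkpoint data goes here instead of the user's home folder.
        javaSparkContext.setCheckpointDir("/tmp");

        // ... your job ...

        javaSparkContext.stop();
    }
}

Note that in yarn-cluster mode the staging directory is used before your driver code runs, so passing the setting on the run command, as shown above, is the more reliable option.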

If you still have any questions, feel free to ask me in the comments under this article or write me at promark33@gmail.com.
If I saved your day, you can support me 🤝