amazon web services - How to run a Spark jar file from AWS Console without Spark-Shell


I'm trying to run a Spark application from the AWS EMR console (Amazon Web Services). The Scala script, compiled into a jar, takes its SparkConf settings as parameters or strings:

val sparkConf = new SparkConf()
  .setAppName("WikipediaGraphXPageRank")
  .setMaster(args(1))
  .set("spark.executor.memory", "1g")
  .registerKryoClasses(Array(classOf[PRVertex], classOf[PRMessage]))

However, I don't know how to pass the master-URL parameter (or the other parameters) to the jar once it's uploaded and the cluster is set up. To be clear: I'm aware of how this would work if I were running spark-shell, but I'm a Windows user, and given my current set-up and the work I've done so far, it would be very useful to have a way to pass the master URL to the EMR cluster as part of a 'step'.
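One way to sidestep the problem is not to hard-code the master at all: set it from the arguments only when one is supplied, and otherwise defer to whatever the submission environment (spark-submit on the cluster) provides. A minimal sketch, assuming the PRVertex and PRMessage classes from the snippet above; the object name is just illustrative:

import org.apache.spark.{SparkConf, SparkContext}

object WikipediaGraphXPageRank {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("WikipediaGraphXPageRank")
      .set("spark.executor.memory", "1g")
      .registerKryoClasses(Array(classOf[PRVertex], classOf[PRMessage]))

    // Only force a master if one was passed explicitly; otherwise leave it
    // to spark-submit's --master flag (as used by the EMR step answer below).
    if (args.length > 1) conf.setMaster(args(1))

    val sc = new SparkContext(conf)
    // ... load the Wikipedia graph and run PageRank here ...
    sc.stop()
  }
}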

I don't want to use spark-shell: I have a tight deadline, I have everything set up this way, and it feels like the small matter of passing the master URL as a parameter should be possible, considering that AWS provides a guide for running stand-alone Spark applications on EMR.

Any help is appreciated!

Here are instructions on using spark-submit via an EMR step: https://github.com/awslabs/emr-bootstrap-actions/blob/master/spark/examples/spark-submit-via-step.md
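With a main method like the sketch above, the step from that guide boils down to running spark-submit on the master node, and everything after the application jar on that command line arrives in your program as args. The invocation would look something like this (the bucket, jar, and input paths are placeholders):

spark-submit --master yarn-cluster --class WikipediaGraphXPageRank s3://your-bucket/pagerank.jar s3://your-bucket/input/

Because --master is supplied by spark-submit here, the conditional in the sketch leaves the master alone, and args(0) is simply the input path.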

