How should we address local dependencies in sbt files for Spark


Problem description


I have this sbt file:

offline := true
name := "hello"
version := "1.0"
scalaVersion := "2.11.7-local"
scalaHome := Some(file("/home/ubuntu/software/scala-2.11.7"))
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

How can I tell it to use this local path for Spark rather than downloading from the web?

/home/ubuntu/software/spark-1.5.0-bin-hadoop2.6

It just tries to connect to the internet for the Spark dependencies, and my VM has no internet access for security reasons.
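One common way to keep sbt off the network (not part of the original question or answer, but a standard sbt technique) is to add the jars shipped inside the local Spark distribution as unmanaged dependencies. The sketch below assumes the distribution path from the question and sbt 0.13-era syntax:

```scala
// build.sbt -- a sketch, assuming the Spark 1.5.0 binary distribution sits
// at the path mentioned in the question. unmanagedJars puts the local jars
// on the compile classpath, so sbt never needs to resolve spark-core from
// a remote repository.
offline := true

unmanagedJars in Compile ++= {
  val sparkLib = file("/home/ubuntu/software/spark-1.5.0-bin-hadoop2.6/lib")
  (sparkLib ** "*.jar").classpath
}
```

With this in place the `libraryDependencies += "org.apache.spark" %% "spark-core" ...` line can be removed, since the assembly jar under `lib/` already contains the spark-core classes.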

I eventually want to run this simple code:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._
import org.apache.spark.api.java._
import org.apache.spark.api.java.function.Function
import org.apache.spark.graphx._
import org.apache.spark.graphx.lib._
import org.apache.spark.graphx.PartitionStrategy._
import org.apache.spark.sql.SQLContext

//class PartBQ1{

object PartBQ1{
val conf = new SparkConf().setMaster("spark://10.0.1.31:7077")
             .setAppName("CS-838-Assignment2-Question2")
             .set("spark.driver.memory", "1g")
             .set("spark.eventLog.enabled", "true")
             .set("spark.eventLog.dir", "/home/ubuntu/storage/logs")
             .set("spark.executor.memory", "21g")
             .set("spark.executor.cores", "4")
             .set("spark.cores.max", "4")
             .set("spark.task.cpus", "1")

val sc = new SparkContext(conf)
val sql_ctx = new SQLContext(sc)
val graph = GraphLoader.edgeListFile(sc, "data2.txt")
}

Solution

These are the steps I took to resolve the issue with a new MVC 2 project and Spark 1.1:

1. Compile against MVC 2.0 - I double-checked the references to make sure I was linking to MVC 2 and not MVC 1. Since this was a new project, this was not an issue.

2. Added System.Web.Mvc.Html - I added System.Web.Mvc.Html to the Spark configuration, to make sure that namespace was added to all views.

In Global.asax.cs, in Application_Start:

var settings = new SparkSettings()
    .SetDebug(true)
    .SetAutomaticEncoding(true)
    .AddAssembly("Web")
    .AddNamespace("Web.Model")
    .AddNamespace("System.Collections.Generic")
    .AddNamespace("System.Linq")
    .AddNamespace("System.Web.Mvc")
    .AddNamespace("System.Web.Mvc.Html");

This can also be done in web.config, in the Spark View Engine block.

3. Add the typed model - Make sure you type the Spark view model. In aspx this is done with Inherits in the page declaration, like this:

<%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master"
Inherits="System.Web.Mvc.ViewPage<MyModelType>" %>

in Spark:

<viewdata model="MyModelType" />
