Building a Spark Local Development Environment

Add Date: 2018-11-21
         
         
         
Many of the guides online for setting up a Spark development environment are based on IntelliJ IDEA; out of personal habit, this article uses Eclipse instead. As preparation, download Scala IDE for Eclipse from http://scala-ide.org/

Scala Project Version

This approach is much like creating an ordinary Java project.

Create a new Scala project.

Remove the Scala library that comes with the project by default, to avoid version conflicts with the Scala classes bundled in the Spark assembly jar.

Add the Spark assembly library to the build path: spark-assembly-1.1.0-cdh5.2.0-hadoop2.5.0-cdh5.2.0.jar

Modify the project's Scala compiler version to match Spark:

Right-click the project -> Scala -> Set the Scala Installation

Alternatively:

Right-click the project -> Properties -> Scala Compiler -> Use Project Settings, then select the Scala version corresponding to your Spark build; here, select the Latest 2.10 bundle.

Write the code in Scala (a minimal example is sketched below).

Export the project as a jar package.
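The original stops at these steps without showing any code. As a minimal sketch of what such a project might contain, here is a classic word count against the SparkContext API of the Spark 1.x line used above; the object name, app name, and input path are illustrative placeholders, not from the original article.

    import org.apache.spark.{SparkConf, SparkContext}
    // Brings in the implicit pair-RDD operations (reduceByKey etc.)
    // that are required before Spark 1.3.
    import org.apache.spark.SparkContext._

    object WordCount {
      def main(args: Array[String]): Unit = {
        // "local[2]" runs Spark inside the IDE with two worker threads,
        // which is exactly what a local development environment is for.
        val conf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
        val sc = new SparkContext(conf)

        val lines = sc.textFile("input.txt") // hypothetical input file
        val counts = lines
          .flatMap(_.split("\\s+")) // split each line into words
          .map(word => (word, 1))   // pair each word with a count of 1
          .reduceByKey(_ + _)       // sum the counts per word

        counts.collect().foreach(println)
        sc.stop()
      }
    }

When the exported jar is submitted to a real cluster with spark-submit, drop the setMaster call and let the submission command choose the master instead.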

Maven Project Version

Most community projects are now managed with Maven, so learning to develop with Maven is well worth the effort.

Before creating a Spark project, let's first learn how to create a Scala project under Maven.

Scala Maven Project

Create a Maven project and modify pom.xml (a sketch of the typical additions follows).
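The article does not show the pom.xml changes themselves. The snippet below is a minimal sketch of the usual additions, assuming Spark 1.1.0 on Scala 2.10 to match the assembly jar named earlier; all version numbers and the choice of scala-maven-plugin are illustrative assumptions.

    <properties>
      <!-- Scala version matching the Spark build (assumed 2.10.x) -->
      <scala.version>2.10.4</scala.version>
    </properties>

    <dependencies>
      <!-- Scala standard library -->
      <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
      </dependency>
      <!-- Spark core; scope "provided" because the cluster supplies it at runtime -->
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.1.0</version>
        <scope>provided</scope>
      </dependency>
    </dependencies>

    <build>
      <plugins>
        <!-- Compiles the Scala sources as part of the Maven build -->
        <plugin>
          <groupId>net.alchim31.maven</groupId>
          <artifactId>scala-maven-plugin</artifactId>
          <version>3.2.2</version>
          <executions>
            <execution>
              <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>

With this in place, mvn package builds the jar that spark-submit can run.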