
Building a Spark local development environment
Add Date: 2018-11-21
Many of the guides online build the Spark development environment on IntelliJ IDEA; I am personally used to Eclipse, so this article builds the environment with Eclipse instead. As preparation, download Scala IDE for Eclipse from http://scala-ide.org/

Scala project version

The procedure is similar to creating an ordinary Java project.

Create a new Scala project

Remove the Scala library that the new project includes by default

Add the Spark assembly library spark-assembly-1.1.0-cdh5.2.0-hadoop2.5.0-cdh5.2.0.jar to the build path

Change the project's Scala compiler version:

Right-click the project -> Scala -> Set the Scala Installation

Alternatively:

Right-click the project -> Properties -> Scala Compiler -> Use Project Settings, then select the Scala version matching your Spark build; here, choose the Latest 2.10 bundle

Write the code in Scala
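As a first program, the classic example is a word count. The sketch below is runnable plain Scala (so it does not need the Spark assembly jar on the classpath); it uses the same transformation chain — flatMap, map to (word, 1) pairs, then a per-key sum — that you would apply to an RDD obtained from `sc.textFile(...)` in the actual Spark project. The object and input values are illustrative, not from the original article.

```scala
// Word count expressed on a plain Scala collection. The pipeline shape
// (flatMap -> map -> reduce per key) mirrors the Spark RDD API; in the
// real project the source would be sc.textFile(...) instead of a Seq.
object WordCountSketch {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))           // split each line into words
      .filter(_.nonEmpty)                 // drop empty tokens
      .map(word => (word, 1))             // pair each word with a count of 1
      .groupBy(_._1)                      // group the pairs by word
      .map { case (w, ps) => (w, ps.map(_._2).sum) } // sum counts per word

  def main(args: Array[String]): Unit =
    println(wordCount(Seq("spark local dev", "local spark")))
}
```

In the Spark version, the final per-key sum would be `reduceByKey(_ + _)` on the RDD, and the result would be written out with `saveAsTextFile`.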

Export the JAR package

Maven project version

These days most community projects are managed with Maven, so learning to develop the Maven way is well worth it.

Before creating a Spark project, let's first learn how to create a Scala project under Maven.

Scala Maven project

Create a Maven project and modify pom.xml.
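A minimal pom.xml for a Scala project that will host Spark code typically declares the Scala library, the Spark core dependency, and the scala-maven-plugin so that `mvn package` compiles sources under src/main/scala. The coordinates below are an illustrative sketch — the group/artifact IDs are placeholders, and the Scala and Spark versions should match your cluster (2.10 / 1.1.0 here, to line up with the assembly jar used above):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>          <!-- placeholder coordinates -->
  <artifactId>spark-demo</artifactId>
  <version>1.0-SNAPSHOT</version>

  <dependencies>
    <!-- Scala runtime; version must match the Spark build -->
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.10.4</version>
    </dependency>
    <!-- Spark core; scope "provided" because the cluster supplies it -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.1.0</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- Compiles src/main/scala during the Maven build -->
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.2</version>
        <executions>
          <execution>
            <goals><goal>compile</goal></goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
```

Marking Spark as `provided` keeps it out of the packaged JAR, which matters once the JAR is submitted to a cluster that already ships Spark.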