Building a Spark Local Development Environment

Add Date: 2018-11-21
Many online guides for setting up a Spark development environment are based on IntelliJ IDEA, but I am personally used to Eclipse, so this article builds the environment with Eclipse. As preparation, download the Scala IDE for Eclipse from http://scala-ide.org/

The Scala project approach

This approach is much like creating an ordinary Java project.

Create a new Scala project.

Remove the Scala library that the project includes by default.

Add the Spark assembly library, spark-assembly-1.1.0-cdh5.2.0-hadoop2.5.0-cdh5.2.0.jar.

Modify the project's Scala compiler version:

Right-click the project -> Scala -> Set the Scala Installation.

Alternatively:

Right-click the project -> Properties -> Scala Compiler -> Use Project Settings, then pick the Scala version that matches your Spark build; here, choose the Latest 2.10 bundle.

Write your code in Scala (see the sketch after these steps).

Export the JAR package.
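As a reference for the coding step, here is a minimal word-count program. This is a sketch of my own, not from the original article: it assumes a Spark 1.1.0 / Scala 2.10 setup matching the assembly jar above, and uses input.txt as a placeholder input path.

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark inside the IDE using all available cores,
    // so no cluster is needed for local development
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // input.txt is a placeholder; point this at any text file
    val counts = sc.textFile("input.txt")
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}

Once this runs inside Eclipse, the exported JAR can later be submitted to a real cluster with spark-submit.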

The Maven project approach

Most open-source projects these days are managed with Maven, so learning the Maven-based development workflow is well worth the effort.

Before creating a Spark project, let's first learn how to create a Scala project under Maven.

Scala Maven project

Create a Maven project and modify pom.xml, as sketched below.
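The original article breaks off here, so the following pom.xml is a minimal sketch of my own. It assumes Scala 2.10.4, the net.alchim31.maven scala-maven-plugin, and a Spark version matching the 1.1.0 assembly mentioned earlier; the com.example coordinates are placeholders, and all versions may need adjusting to your setup.

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <!-- placeholder coordinates; use your own groupId/artifactId -->
  <groupId>com.example</groupId>
  <artifactId>spark-demo</artifactId>
  <version>1.0-SNAPSHOT</version>

  <dependencies>
    <!-- Scala standard library, matching the Scala version of the Spark build -->
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.10.4</version>
    </dependency>
    <!-- Spark core, built for Scala 2.10 -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.1.0</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- scala-maven-plugin compiles Scala sources under src/main/scala -->
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.0</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

With this in place, mvn package compiles the Scala sources and produces the project JAR.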