  Spark local development environment to build
     
  Add Date : 2018-11-21      
         
         
         
Many online guides build a Spark development environment on top of IntelliJ IDEA, but I am personally used to Eclipse, so this article uses Eclipse to set up the environment. As preparation, download Scala IDE for Eclipse from http://scala-ide.org/

Scala project approach

This approach is similar to creating an ordinary Java project.

Create a new Scala project.

Remove the Scala library container that comes with the project.

Add the Spark library spark-assembly-1.1.0-cdh5.2.0-hadoop2.5.0-cdh5.2.0.jar.

Change the project's Scala compiler version:

Right-click the project -> Scala -> Set the Scala Installation

Alternatively:

Right-click the project -> Properties -> Scala Compiler -> Use Project Settings, then choose the Scala version matching your Spark build; here, select the Latest 2.10 bundle.

Write your code in Scala.

Export the jar package.
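With the project set up as above, you can write and run a minimal Spark job. Here is a word-count sketch against the Spark 1.x RDD API; the object name and the input path are illustrative, not from the original article:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal word count using the Spark 1.x RDD API.
// setMaster("local[*]") runs Spark inside the IDE for testing;
// remove it and pass --master when submitting the exported jar.
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val lines = sc.textFile("input.txt")   // illustrative input path
    val counts = lines
      .flatMap(_.split("\\s+"))            // split each line into words
      .map(word => (word, 1))              // pair each word with a count of 1
      .reduceByKey(_ + _)                  // sum counts per word

    counts.collect().foreach { case (w, n) => println(s"$w: $n") }
    sc.stop()
  }
}
```

Running this in Eclipse is a quick way to verify that the Scala compiler version and the Spark assembly jar are wired up correctly before exporting the jar.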

Maven project approach

Most community projects are now managed with Maven, so learning to develop this way is well worth the effort.

Before creating a Spark project, let's first learn how to create a Scala project under Maven.

Scala Maven project

Create a Maven project and modify pom.xml.
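A minimal pom.xml sketch for such a project, assuming the widely used scala-maven-plugin and the Scala/Spark versions from the Eclipse setup above (the groupId, artifactId, and versions are placeholders you should adapt):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>        <!-- placeholder -->
  <artifactId>spark-demo</artifactId>   <!-- placeholder -->
  <version>1.0-SNAPSHOT</version>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.10.4</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.1.0</version>
      <!-- provided: the cluster supplies Spark at runtime -->
      <scope>provided</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- Compiles Scala sources during the Maven build -->
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.2</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
```

With this in place, `mvn package` compiles the Scala sources and produces the project jar.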