Building a Spark local development environment
     
  Add Date : 2018-11-21      
         
         
         
Many online guides to setting up a Spark development environment are based on IntelliJ IDEA; I am personally used to Eclipse, so this article uses Eclipse to build the development environment. Preparation: download Scala IDE for Eclipse from http://scala-ide.org/

Scala project approach

This approach is similar to creating an ordinary Java project.

Create a new Scala project.

Remove the Scala library that the new project includes by default.

Add the Spark library spark-assembly-1.1.0-cdh5.2.0-Hadoop2.5.0-cdh5.2.0.jar to the project's build path.

Modify the project's Scala compiler version:

Right-click the project -> Scala -> Set the Scala Installation

Alternatively:

Right-click the project -> Properties -> Scala Compiler -> Use Project Settings, then select the Scala version matching your Spark build; here we choose the Latest 2.10 bundle.

Write the code in Scala (a minimal example is sketched after these steps).

Export the JAR package.
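
As a quick check that the Eclipse project compiles and runs against the Spark library, here is a minimal word-count sketch using the Spark 1.x core API. The object name, application name, master setting, and input path are placeholder assumptions, not values from the original article.

import org.apache.spark.{SparkConf, SparkContext}

// Minimal word count against the Spark 1.x core API.
// "local[2]" runs Spark locally with two worker threads.
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
    val sc = new SparkContext(conf)

    val counts = sc.textFile("data/input.txt")   // placeholder input path
      .flatMap(_.split("\\s+"))                  // split each line into words
      .map(word => (word, 1))                    // pair each word with a count of 1
      .reduceByKey(_ + _)                        // sum the counts per word

    counts.collect().foreach(println)
    sc.stop()
  }
}

The same object can then be exported as a JAR from Eclipse and run on a cluster with spark-submit.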

Maven project approach

Most open source projects are now managed with Maven, so learning to develop with Maven is well worth the effort.

Before creating a Spark project, let's first learn how to create a Scala project with Maven.

Scala Maven project

Create a Maven project and modify pom.xml (a sketch of the relevant pom.xml sections is shown below).
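
The article does not show the pom.xml contents, so the following fragment is only a sketch of what the Scala-related pieces commonly look like, assuming the Scala 2.10 line chosen above and the scala-maven-plugin; the groupId, artifactId, and version values are placeholders.

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>          <!-- placeholder -->
  <artifactId>scala-demo</artifactId>     <!-- placeholder -->
  <version>1.0-SNAPSHOT</version>

  <dependencies>
    <!-- Scala standard library, matching the 2.10 line used above -->
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.10.4</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- Compiles Scala sources during the Maven build -->
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.2</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>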