  How to write a new Git protocol
     
  Add Date : 2016-05-13      
         
       
         
A while back, I ran into some trouble keeping track of my own files. I could never remember whether a given file was saved on my desktop, my laptop, my phone, or somewhere in the cloud. Worse, for really sensitive data like passwords and Bitcoin keys, emailing myself plain text made me nervous.

What I wanted was to store my data in a git repository, and to keep that repository in one place. I could browse earlier versions without worrying that data had been deleted, and, best of all, I already knew how to use git to move files between machines.

But, as I said, I did not want to simply upload my keys and passwords to GitHub or Bitbucket, even to a private repository.

A cool idea formed in my head: write a tool that encrypts my repository and then pushes it to a backup location. Unfortunately, that rules out the usual git push; instead we would need a command like this:

$ encrypted-git push http://example.com/

At least, that was what I thought before I found git remote helpers.

Git remote helpers

I found the git remote helpers documentation online.

It turns out that if you run the commands

$ git remote add origin asdf://example.com/repo
$ git push --all origin

git first checks whether it has a built-in asdf protocol. Finding none, it checks whether git-remote-asdf is on your PATH; if it is, git runs git-remote-asdf origin asdf://example.com/repo to handle the session.

Similarly, you can run

$ git clone asdf::http://example.com/repo

Unfortunately, I found the documentation vague on the details of writing a real helper, and details were exactly what I needed. But then I found a script called git-remote-testgit.sh in the Git source code, which implements a helper for testgit, a protocol used to test the remote helper system. It basically pushes to and fetches from a repository on the local file system, so that

git clone testgit::/existing-repository

and

git clone /existing-repository

do the same thing.

Similarly, you can push to or fetch from a local repository through the testgit protocol.

In this article, we will walk through the git-remote-testgit source and port it to Go as a new helper: git-remote-go. Along the way, I will explain what the source means, along with the various things I learned implementing my own remote helper (git-remote-grave).

Basics

To give the following sections some context, let's learn some terminology and basic mechanics.

When we run

$ git remote add myremote go::http://example.com/repo
$ git push myremote master

git spawns a new helper process with the command

git-remote-go myremote http://example.com/repo

Note: the first argument is the remote name, the second is the URL.

When you run

$ git clone go::http://example.com/repo

the helper is spawned as

git-remote-go origin http://example.com/repo

because clone automatically creates a remote named origin for the cloned repository.

When git spawns the helper process, it opens pipes to the helper's stdin, stdout and stderr. Commands are delivered to the helper over stdin, and the helper responds over stdout. Anything the helper writes to stderr is redirected to git's stderr (which is probably a terminal).

The last thing I need to explain is how I distinguish the local and remote repositories. Usually (but not always), the local repository is the one we run git in, and the remote repository is the one we are connecting to.

So when we push, we send changes from the local repository to the remote one. When we fetch, we retrieve changes from the remote repository into the local one. And when we clone, we copy the remote repository to the local machine.

When git runs the helper, it sets the GIT_DIR environment variable to the local repository's git directory (for example, local/.git).
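To see the GIT_DIR mechanism in isolation, here is a small sketch you can run in a shell, assuming git is installed (the temporary directory path is whatever mktemp picks):

```shell
# Create a throwaway repository to stand in for the "other" repository.
demo=$(mktemp -d)
git init --quiet "$demo"

# With GIT_DIR set for the command, git operates on that repository
# even though we never cd into it.
GIT_DIR="$demo/.git" git rev-parse --git-dir
```

git rev-parse --git-dir prints the directory git has resolved as its .git directory, which here is the one we pointed GIT_DIR at rather than anything in the current working directory.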

Starting the project

In this article, I assume you have Go installed and the $GOPATH environment variable pointing at a go directory.

Let's start by creating the directory go/src/git-remote-go. That way we can install our helper by running go install (assuming go/bin is on your PATH).

With that in place, we can write the first few lines of go/src/git-remote-go/main.go.

package main

import (
    "fmt"
    "log"
    "os"
)

func Main() (err error) {
    if len(os.Args) < 3 {
        return fmt.Errorf("Usage: git-remote-go remote-name url")
    }

    remoteName := os.Args[1]
    url := os.Args[2]

    // Placeholder until later sections use these variables.
    _, _ = remoteName, url

    return nil
}

func main() {
    if err := Main(); err != nil {
        log.Fatal(err)
    }
}

I split Main() out of main() because error handling becomes much easier when we can simply return errors. It also lets us use defer, since log.Fatal calls os.Exit, which does not run deferred functions.

Now, let's look at the very top of the git-remote-testgit file to see what needs to happen next.

#!/bin/sh
# Copyright (c) 2012 Felipe Contreras

alias=$1
url=$2

dir="$GIT_DIR/testgit/$alias"
prefix="refs/testgit/$alias"

default_refspec="refs/heads/*:${prefix}/heads/*"

refspec="${GIT_REMOTE_TESTGIT_REFSPEC-$default_refspec}"

test -z "$refspec" && prefix="refs"

GIT_DIR="$url/.git"
export GIT_DIR

force=

mkdir -p "$dir"

if test -z "$GIT_REMOTE_TESTGIT_NO_MARKS"
then
    gitmarks="$dir/git.marks"
    testgitmarks="$dir/testgit.marks"
    test -e "$gitmarks" || >"$gitmarks"
    test -e "$testgitmarks" || >"$testgitmarks"
fi

What they call alias is what we call remoteName, and url means the same thing.

The next statement is:

dir="$GIT_DIR/testgit/$alias"

This carves out a namespace in the git directory for the testgit protocol and the remote name we are using. That way, files for the origin remote under testgit stay separate from files for, say, a backup remote.

Further down, we see this statement:

mkdir -p "$dir"

which ensures the local directory exists, creating it if necessary.

Let's add the local directory creation to our Go program.

// Add "path" to the import list.
localdir := path.Join(os.Getenv("GIT_DIR"), "go", remoteName)

if err := os.MkdirAll(localdir, 0755); err != nil {
    return err
}
Back up in the script, we have these lines:

prefix="refs/testgit/$alias"
default_refspec="refs/heads/*:${prefix}/heads/*"
refspec="${GIT_REMOTE_TESTGIT_REFSPEC-$default_refspec}"
test -z "$refspec" && prefix="refs"

Let's quickly talk about refs.

In git, refs are stored under .git/refs:

.git
└── refs
    ├── heads
    │   └── master
    ├── remotes
    │   ├── gravy
    │   └── origin
    │       └── master
    └── tags
In the tree above, remotes/origin/master holds the object name of the most recent commit on the remote origin's master branch, and heads/master holds the object name of the most recent commit on your local master branch. A ref is, in effect, a pointer to a commit.

A refspec lets us map local refs to remote refs. In the code above, prefix is the directory where the remote refs will be kept. If the remote name is origin, the remote master branch is tracked by .git/refs/testgit/origin/heads/master. This creates a namespace for remote branches, keyed by protocol and remote name.

The next line defines the refspec itself. This line

default_refspec="refs/heads/*:${prefix}/heads/*"

expands to

default_refspec="refs/heads/*:refs/testgit/$alias/heads/*"

This means refs matching refs/heads/* (where * matches any text) map to refs/testgit/$alias/heads/* (where * is replaced by whatever the * on the left matched). For example, refs/heads/master maps to refs/testgit/origin/heads/master.
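To make the wildcard mapping concrete, here is a minimal sketch in Go of how a single-wildcard refspec rewrites a ref name. mapRef is our own illustrative helper, not part of git or of the plugin we are building:

```go
package main

import (
	"fmt"
	"strings"
)

// mapRef applies a one-wildcard refspec such as
// "refs/heads/*:refs/testgit/origin/heads/*" to a ref name.
// It returns the mapped name and whether the ref matched the source side.
func mapRef(refspec, ref string) (string, bool) {
	parts := strings.SplitN(refspec, ":", 2)
	src, dst := parts[0], parts[1]

	srcPrefix := strings.TrimSuffix(src, "*")
	if !strings.HasPrefix(ref, srcPrefix) {
		return "", false
	}

	// The text matched by * on the left replaces * on the right.
	return strings.TrimSuffix(dst, "*") + strings.TrimPrefix(ref, srcPrefix), true
}

func main() {
	mapped, _ := mapRef("refs/heads/*:refs/testgit/origin/heads/*", "refs/heads/master")
	fmt.Println(mapped) // refs/testgit/origin/heads/master
}
```

Real refspec handling in git supports more forms (forced updates with +, non-wildcard specs), but this captures the substitution rule described above.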

Essentially, the refspec lets testgit add a new branch namespace to your tree, like this:

.git
└── refs
    ├── heads
    │   └── master
    ├── remotes
    │   └── origin
    │       └── master
    ├── testgit
    │   └── origin
    │       └── heads
    │           └── master
    └── tags
The next line

refspec="${GIT_REMOTE_TESTGIT_REFSPEC-$default_refspec}"

sets $refspec to $GIT_REMOTE_TESTGIT_REFSPEC, falling back to $default_refspec if that variable is unset. This lets the test suite exercise other refspecs through testgit. We will assume the default refspec is in effect.

Finally, the next line,

test -z "$refspec" && prefix="refs"

says: if $GIT_REMOTE_TESTGIT_REFSPEC exists but is empty, set $prefix to refs.
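The distinction between an unset variable and a set-but-empty one is exactly what the - form of parameter expansion is for, and it is worth seeing in isolation. This is plain POSIX shell, independent of git:

```shell
default_refspec="refs/heads/*:refs/testgit/origin/heads/*"

# Unset variable: the - expansion falls back to the default.
unset GIT_REMOTE_TESTGIT_REFSPEC
refspec="${GIT_REMOTE_TESTGIT_REFSPEC-$default_refspec}"
echo "$refspec"

# Set-but-empty variable: the - expansion keeps the empty value,
# which is the case that test -z "$refspec" then detects.
GIT_REMOTE_TESTGIT_REFSPEC=""
refspec="${GIT_REMOTE_TESTGIT_REFSPEC-$default_refspec}"
test -z "$refspec" && echo "refspec is empty"
```

Had the script used the :- form instead, an empty environment variable would also have fallen back to the default, and the prefix override would never trigger.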

Our helper needs its own refspec, so we add this line:

refspec := fmt.Sprintf("refs/heads/*:refs/go/%s/*", remoteName)
Following along in the script, we see

GIT_DIR="$url/.git"
export GIT_DIR

Another relevant fact about $GIT_DIR: if it is set in the environment, git uses that directory as its .git directory instead of the local one. This line causes every git command the helper runs from here on to execute in the context of the remote repository.

We translate that into:

if err := os.Setenv("GIT_DIR", path.Join(url, ".git")); err != nil {
    return err
}

Remember, of course, that $dir and our localdir variable still point at subdirectories of the local repository we are fetching into or pushing from.

There is one more piece of code in the main block:

if test -z "$GIT_REMOTE_TESTGIT_NO_MARKS"
then
    gitmarks="$dir/git.marks"
    testgitmarks="$dir/testgit.marks"
    test -e "$gitmarks" || >"$gitmarks"
    test -e "$testgitmarks" || >"$testgitmarks"
fi

whose body executes if $GIT_REMOTE_TESTGIT_NO_MARKS is unset or empty.

Marks files record the identities of refs and blobs as they pass through git fast-export and git fast-import. It is important that these marks stay consistent across invocations of the helper, which is why they are stored in localdir.

Here, $gitmarks holds the marks written by git on our local repository's side, and $testgitmarks holds the marks written on the remote side.
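For reference, the marks files written by git fast-export and git fast-import are plain text, one mark per line in the form :<mark-number> <objectname>. A minimal sketch of reading such a line in Go; parseMarkLine is our own illustrative helper, and the SHA-1 in the example is made up:

```go
package main

import (
	"fmt"
	"strings"
)

// parseMarkLine splits one line of a fast-export/fast-import marks file,
// which has the form ":<mark> <objectname>".
func parseMarkLine(line string) (mark, objectname string, ok bool) {
	fields := strings.Fields(line)
	if len(fields) != 2 || !strings.HasPrefix(fields[0], ":") {
		return "", "", false
	}
	return strings.TrimPrefix(fields[0], ":"), fields[1], true
}

func main() {
	// Example line with a made-up object name.
	mark, objectname, _ := parseMarkLine(":1 50d3a8362af44d509b0d867bf38d43cb7f989601")
	fmt.Println(mark, objectname)
}
```

Our helper never parses these files itself; git reads and writes them on our behalf via the --import-marks and --export-marks options, but knowing the format helps when debugging a broken run.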

These two lines act a bit like touch, creating each marks file as an empty file if it does not already exist.

test -e "$gitmarks" || >"$gitmarks"
test -e "$testgitmarks" || >"$testgitmarks"

Our own program needs these files too, so let's write a Touch function.

// Touch creates path as an empty file if it does not exist, and otherwise
// does nothing. This works by opening the file in exclusive mode: if it
// already exists, an error is returned rather than truncating it.
func Touch(path string) error {
    file, err := os.OpenFile(path, os.O_WRONLY|os.O_CREATE|os.O_EXCL, 0666)
    if os.IsExist(err) {
        return nil
    } else if err != nil {
        return err
    }
    return file.Close()
}
Now we can create the marks files.

gitmarks := path.Join(localdir, "git.marks")
gomarks := path.Join(localdir, "go.marks")

if err := Touch(gitmarks); err != nil {
    return err
}

if err := Touch(gomarks); err != nil {
    return err
}
One problem I ran into: if the helper fails partway for some reason, the marks files can be left in an invalid state. To guard against that, we can save their original contents up front and restore them if Main() returns an error.

// Add "io/ioutil" to the import list.
originalGitmarks, err := ioutil.ReadFile(gitmarks)
if err != nil {
    return err
}

originalGomarks, err := ioutil.ReadFile(gomarks)
if err != nil {
    return err
}

defer func() {
    if err != nil {
        ioutil.WriteFile(gitmarks, originalGitmarks, 0666)
        ioutil.WriteFile(gomarks, originalGomarks, 0666)
    }
}()
Finally, we can get to the commands themselves.

Commands are passed to the helper over stdin, one per line, each terminated by a newline. The helper responds to commands over stdout; stderr is piped through to the end user.

Here is the dispatch loop for the commands we will implement.

// Add "bufio" and "strings" to the import list.
stdinReader := bufio.NewReader(os.Stdin)

for {
    // Note that command will include the trailing newline.
    command, err := stdinReader.ReadString('\n')
    if err != nil {
        return err
    }

    switch {
    case command == "capabilities\n":
        // ...
    case command == "\n":
        return nil
    default:
        return fmt.Errorf("Received unknown command %q", command)
    }
}
 

The capabilities command

The first command to implement is capabilities. The helper replies with the capabilities and options it supports, one per line, terminated by a blank line.

echo 'import'
echo 'export'
test -n "$refspec" && echo "refspec $refspec"
if test -n "$gitmarks"
then
    echo "*import-marks $gitmarks"
    echo "*export-marks $gitmarks"
fi
test -n "$GIT_REMOTE_TESTGIT_SIGNED_TAGS" && echo "signed-tags"
test -n "$GIT_REMOTE_TESTGIT_NO_PRIVATE_UPDATE" && echo "no-private-update"
echo 'option'
echo
This output declares that the helper supports the import, export and option commands. The option command lets git adjust settings such as the helper's verbosity.

signed-tags means that when the helper produces a stream for the export command, it will pass --signed-tags=verbatim to git fast-export.

no-private-update tells git not to update the private ref after a successful push. I have not seen a need for this feature.

refspec $refspec tells git which refspec we are using.

*import-marks $gitmarks and *export-marks $gitmarks mean that git should save the marks it generates to the gitmarks file. The * means that if git does not recognize these lines, it must fail rather than ignore them; the helper depends on saved marks and cannot work with a git version that does not support them.

We will ignore signed-tags, no-private-update and option, since they are used by git-remote-testgit's tests and we do not need them here. The rest is easy to implement:

case command == "capabilities\n":
    fmt.Printf("import\n")
    fmt.Printf("export\n")
    fmt.Printf("refspec %s\n", refspec)
    fmt.Printf("*import-marks %s\n", gitmarks)
    fmt.Printf("*export-marks %s\n", gitmarks)
    fmt.Printf("\n")
 

The list command

Next up is the list command. It is not declared in the capabilities output, because helpers are generally required to support it.

When the helper receives the list command, it should print the refs in the remote repository, one per line in the format $objectname $refname, followed by a blank line. $refname is the name of the ref, and $objectname describes what the ref points to: a commit hash, @$refname to indicate a symbolic ref pointing at another ref, or ? to indicate that the ref's value is not available.

git-remote-testgit implements it as follows.

git for-each-ref --format='? %(refname)' 'refs/heads/'
head=$(git symbolic-ref HEAD)
echo "@$head HEAD"
echo

Remember, $GIT_DIR causes git for-each-ref to run against the remote repository. It prints ? $refname for each branch, followed by @$head HEAD, where $head names the ref that the remote repository's HEAD points to.

A typical repository has two branches, a master branch and a development branch, so the output might look like this:

? refs/heads/master
? refs/heads/development
@refs/heads/master HEAD

Now let's write our own. We'll put the ref listing in a GitListRefs() function, since we will need it again later.

// Add "os/exec" and "bytes" to the import list.

// GitListRefs returns a map of refnames to objectnames.
func GitListRefs() (map[string]string, error) {
    out, err := exec.Command(
        "git", "for-each-ref", "--format=%(objectname) %(refname)",
        "refs/heads/",
    ).Output()
    if err != nil {
        return nil, err
    }

    lines := bytes.Split(out, []byte{'\n'})
    refs := make(map[string]string, len(lines))

    for _, line := range lines {
        fields := bytes.Split(line, []byte{' '})
        if len(fields) < 2 {
            break
        }
        refs[string(fields[1])] = string(fields[0])
    }

    return refs, nil
}
Now write GitSymbolicRef().

func GitSymbolicRef(name string) (string, error) {
    out, err := exec.Command("git", "symbolic-ref", name).Output()
    if err != nil {
        return "", fmt.Errorf(
            "GitSymbolicRef: git symbolic-ref %s: %v", name, err)
    }
    return string(bytes.TrimSpace(out)), nil
}
The list command can then be implemented like this.

case command == "list\n":
    refs, err := GitListRefs()
    if err != nil {
        return fmt.Errorf("command list: %v", err)
    }

    head, err := GitSymbolicRef("HEAD")
    if err != nil {
        return fmt.Errorf("command list: %v", err)
    }

    for refname := range refs {
        fmt.Printf("? %s\n", refname)
    }

    fmt.Printf("@%s HEAD\n", head)
    fmt.Printf("\n")

The import command

Next is import, which is used by git clone and git fetch. The command actually arrives in a batch: a series of import $refname lines, terminated by a blank line. When this batch is sent to the helper, git spawns git fast-import with its stdin connected to the helper's stdout. In other words, the helper is expected to write a git fast-export stream to its stdout.

Let's look at git-remote-testgit's implementation.

# read all import lines
while true
do
    ref="${line#* }"
    refs="$refs $ref"
    read line
    test "${line%% *}" != "import" && break
done

if test -n "$gitmarks"
then
    echo "feature import-marks=$gitmarks"
    echo "feature export-marks=$gitmarks"
fi

if test -n "$GIT_REMOTE_TESTGIT_FAILURE"
then
    echo "feature done"
    exit 1
fi

echo "feature done"

git fast-export \
    ${testgitmarks:+"--import-marks=$testgitmarks"} \
    ${testgitmarks:+"--export-marks=$testgitmarks"} \
    $refs |
sed -e "s#refs/heads/#${prefix}/heads/#g"

echo "done"
The loop at the top aggregates the whole batch of import $refname commands into the single variable $refs, a space-separated list.

Next, if the script is using gitmarks files (and we will assume it is), it outputs feature import-marks=$gitmarks and feature export-marks=$gitmarks. This tells git to pass --import-marks=$gitmarks and --export-marks=$gitmarks to git fast-import.

The next lines make the helper fail when $GIT_REMOTE_TESTGIT_FAILURE is set, again for testing purposes.

After that, feature done is output, signaling that the remainder of the output will be the export stream itself.

Then git fast-export runs against the remote repository (thanks to $GIT_DIR), with $testgitmarks recording the marks on the remote side, and is handed the list of refs we were asked to import.

The output of git fast-export is piped through sed, which rewrites refs/heads/ to refs/testgit/$alias/heads/. That way the exported refs match the refspec we handed to git.

After the export stream, done is output.

Now let's try it ourselves.

case strings.HasPrefix(command, "import"):
    refs := make([]string, 0)

    for {
        // Have to make sure to trim the trailing newline.
        ref := strings.TrimSpace(strings.TrimPrefix(command, "import"))
        refs = append(refs, ref)

        command, err = stdinReader.ReadString('\n')
        if err != nil {
            return err
        }

        if !strings.HasPrefix(command, "import") {
            break
        }
    }

    fmt.Printf("feature import-marks=%s\n", gitmarks)
    fmt.Printf("feature export-marks=%s\n", gitmarks)
    fmt.Printf("feature done\n")

    args := []string{
        "fast-export",
        "--import-marks", gomarks,
        "--export-marks", gomarks,
        "--refspec", refspec}
    args = append(args, refs...)

    cmd := exec.Command("git", args...)
    cmd.Stderr = os.Stderr
    cmd.Stdout = os.Stdout

    if err := cmd.Run(); err != nil {
        return fmt.Errorf("command import: git fast-export: %v", err)
    }

    fmt.Printf("done\n")
 

The export command

Next is the export command. Once it is done, our helper will be complete.

Git issues the export command when we push to the remote repository. After sending the command over stdin, git follows it with a git fast-export stream, which the helper is expected to feed into a git fast-import running against the remote repository.

if test -n "$GIT_REMOTE_TESTGIT_FAILURE"
then
    # consume input so fast-export does not get SIGPIPE;
    # git would also notice that case, but we want
    # to make sure we are exercising the later
    # error checks
    while read line; do
        test "done" = "$line" && break
    done
    exit 1
fi

before=$(git for-each-ref --format='%(refname) %(objectname)')

git fast-import \
    ${force:+--force} \
    ${testgitmarks:+"--import-marks=$testgitmarks"} \
    ${testgitmarks:+"--export-marks=$testgitmarks"} \
    --quiet

# figure out which refs were updated
git for-each-ref --format='%(refname) %(objectname)' |
while read ref a
do
    case "$before" in
    *"$ref $a"*)
        continue ;; # unchanged
    esac
    if test -z "$GIT_REMOTE_TESTGIT_PUSH_ERROR"
    then
        echo "ok $ref"
    else
        echo "error $ref $GIT_REMOTE_TESTGIT_PUSH_ERROR"
    fi
done

echo
The if statement at the top is, as before, only for testing.

The next line is more interesting. It records a space-separated list of $refname $objectname pairs before the import, so we can determine afterwards which refs were updated.

The command after that is fairly self-explanatory: git fast-import consumes our stdin, with --force specified conditionally, --quiet, and the remote marks file.

The loop below runs git for-each-ref again to detect changes. For each ref, it checks whether the $refname $objectname pair appears in the $before list. If it does, nothing changed and we continue to the next ref. If the pair is not in the $before list, the helper prints ok $refname to tell git the ref was updated successfully. Printing error $refname $message instead would tell git the ref failed to import on the remote end.

Finally, a blank line indicates that the export is complete.

Now we can write it ourselves, reusing the GitListRefs() function we defined earlier.

case command == "export\n":
    beforeRefs, err := GitListRefs()
    if err != nil {
        return fmt.Errorf("command export: collecting before refs: %v", err)
    }

    cmd := exec.Command("git", "fast-import", "--quiet",
        "--import-marks="+gomarks,
        "--export-marks="+gomarks)
    cmd.Stderr = os.Stderr
    cmd.Stdin = os.Stdin

    if err := cmd.Run(); err != nil {
        return fmt.Errorf("command export: git fast-import: %v", err)
    }

    afterRefs, err := GitListRefs()
    if err != nil {
        return fmt.Errorf("command export: collecting after refs: %v", err)
    }

    for refname, objectname := range afterRefs {
        if beforeRefs[refname] != objectname {
            fmt.Printf("ok %s\n", refname)
        }
    }

    fmt.Printf("\n")
 

Trying it out

Run go install, and git-remote-go should build and install into go/bin.

You can verify it with this test: create two empty git repositories, make a commit in testlocal, and push it to testremote through our new helper.

$ cd $HOME
$ git init testremote
Initialized empty Git repository in $HOME/testremote/.git/
$ git init testlocal
Initialized empty Git repository in $HOME/testlocal/.git/
$ cd testlocal
$ echo 'Hello, world!' > hello.txt
$ git add hello.txt
$ git commit -m "First commit."
[master (root-commit) 50d3a83] First commit.
 1 file changed, 1 insertion(+)
 create mode 100644 hello.txt
$ git remote add origin go::$HOME/testremote
$ git push --all origin
To go::$HOME/testremote
 * [new branch]      master -> master
$ cd ../testremote
$ git checkout master
$ ls
hello.txt
$ cat hello.txt
Hello, world!
 

Uses for git remote helpers

With this interface implemented, git remote helpers can be used to interoperate with other source control systems (such as felipec/git-remote-hg), to push code to CouchDBs (peritus/git-remote-couch), and so on. You can imagine plenty of other possibilities.

For my original motivation, I wrote the remote helper git-remote-grave. You can use it to push to and fetch from encrypted archives, either on your file system or over HTTP/HTTPS.

$ git remote add usb grave::/media/usb/backup.grave
$ git push --all usb

It applies two compression techniques that usually shrink the archive to about 22% of the repository's original size.

If you want a convenient place to store your encrypted git repositories, check out the site I created: filegrave.com.

Parts of this article are being discussed on Hacker News and /r/programming.
     
         
       
         