# download a package and its dependencies as source archives
pip download [package] -d /tmp --no-binary :all:
# extract all packages
gunzip *.gz
# extract tar
tar xvf <filename>.tar
# install package
cd <folder>
python setup.py install
sudo -H pip install --ignore-installed -U numpy
pip install --ignore-installed six
pip install --user <package>
(the last one installs everything under ~/Library/Python/2.7/)
Ref: http://apple.stackexchange.com/questions/209572/how-to-use-pip-after-the-os-x-el-capitan-upgrade comment by Yuri
If you’re getting bombarded with brute-force login attempts, the steps below install DenyHosts as a daemon that, with default settings, scans /var/log/secure for failed login attempts. The threshold is initially 5 failed attempts, after which the offending IP ends up in the hosts.deny file. You should take a good long look at the .cfg file to understand the full capabilities (for example, running it against Apache logs for web attacks).
wget http://downloads.sourceforge.net/project/denyhosts/denyhosts/2.6/DenyHosts-2.6.tar.gz
tar -zxvf DenyHosts-2.6.tar.gz
cd DenyHosts-2.6
python setup.py install
cp /usr/share/denyhosts/daemon-control-dist /usr/share/denyhosts/daemon-control
cp /usr/share/denyhosts/denyhosts.cfg-dist /usr/share/denyhosts/denyhosts.cfg
ln -s /usr/share/denyhosts/daemon-control /etc/init.d/denyhosts
chkconfig --add denyhosts
service denyhosts start
tail -f /etc/hosts.deny /var/log/secure
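As a starting point for that look at the .cfg file, the settings most worth reviewing in denyhosts.cfg are roughly the following. This is a hedged sketch, not the full shipped file: the option names come from denyhosts.cfg-dist, but the paths and values shown here are illustrative and defaults may differ on your distribution.

```ini
# log to watch for failed logins (Debian/Ubuntu use /var/log/auth.log instead)
SECURE_LOG = /var/log/secure
# file that blocked IPs get written to
HOSTS_DENY = /etc/hosts.deny
# which service to block for a denied host (e.g. sshd, or ALL)
BLOCK_SERVICE = sshd
# failed attempts allowed before an IP is denied
DENY_THRESHOLD_INVALID = 5
DENY_THRESHOLD_VALID = 10
DENY_THRESHOLD_ROOT = 1
# how long entries stay in hosts.deny before being purged
PURGE_DENY = 4w
```

Point SECURE_LOG at an Apache error log (with matching regexes) if you want to experiment with the web-attack use case mentioned above.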
Our open-source tools enable designers to automate boring production challenges, visualize large sets of data, and access the raw power of the computer without thinking in ones and zeroes. The tools integrate with traditional design applications and run on many platforms.
Pattern is a web mining module for the Python programming language.
It bundles tools for data mining (Google + Twitter + Wikipedia API, web crawler, HTML DOM parser), natural language processing (part-of-speech taggers, n-gram search, sentiment analysis, WordNet), machine learning (vector space model, k-means, k-NN, SVM) and network analysis (graph centrality & visualization).
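To give a flavor of the machine-learning side (the vector space model), here is a minimal pure-Python sketch of bag-of-words vectors compared by cosine similarity. This is not Pattern's actual API; the documents and query are made up for illustration.

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(v1, v2):
    """Cosine similarity between two sparse term vectors."""
    dot = sum(v1[t] * v2[t] for t in v1)
    norm1 = math.sqrt(sum(c * c for c in v1.values()))
    norm2 = math.sqrt(sum(c * c for c in v2.values()))
    return dot / (norm1 * norm2) if norm1 and norm2 else 0.0

docs = {
    "web": "web crawler parses html pages",
    "nlp": "tagger tags parts of speech in text",
}
query = vectorize("crawl html web pages")
# rank documents by similarity to the query
best = max(docs, key=lambda name: cosine(query, vectorize(docs[name])))
print(best)  # → web
```

Pattern's actual vector space model adds tf-idf weighting and feature selection on top of this basic idea.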
NLTK is a leading platform for building Python programs to work with human language data. It provides easy-to-use interfaces to over 50 corpora and lexical resources such as WordNet, along with a suite of text processing libraries for classification, tokenization, stemming, tagging, parsing, and semantic reasoning.
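NLTK's own tokenizers and stemmers are far more sophisticated, but as a flavor of what tokenization and stemming mean, here is a self-contained toy sketch. The regex and suffix list are ad hoc stand-ins, not NLTK's algorithms.

```python
import re

def tokenize(text):
    """Split text into word tokens (crude stand-in for a real tokenizer)."""
    return re.findall(r"[A-Za-z]+(?:'[A-Za-z]+)?", text)

def stem(word):
    """Very crude suffix stripping -- nothing like the real Porter stemmer."""
    for suffix in ("ing", "ies", "es", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[:-len(suffix)]
    return word

tokens = tokenize("The taggers were tagging sentences.")
stems = [stem(t.lower()) for t in tokens]
print(stems)
```

With NLTK installed, the equivalent would use nltk.word_tokenize and nltk.stem.PorterStemmer, which handle punctuation, clitics, and the full Porter rule set correctly.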