Administration Documentation
These instructions cover running the KnowledgeFinder - Dataimporter on a server. See Setup for the installation instructions.
ℹ️ Solr Security: First and foremost, Solr does not concern itself with security, either at the document level or the communication level. It is strongly recommended that the application server containing Solr be firewalled such that the only clients with access to Solr are your own. A default/example installation of Solr allows any client with access to it to add, update, and delete documents (and of course search/read too), including access to the Solr configuration and schema files and the administrative user interface. Besides limiting port access to the Solr server, standard Java web security can be added by tuning the container and the Solr web application configuration itself via web.xml. For example, all /update URLs could require HTTP authentication. If there is a need to provide query access to a Solr server from the open internet, it is highly recommended to put a proxy in front of it.
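As one way of limiting port access, the Solr port can be restricted at the firewall. A minimal sketch, assuming a host with ufw and Solr on the default port 8983; the client address 10.0.0.5 is a placeholder for your own application server:

# Allow only the trusted client to reach Solr, then deny the port for everyone else.
# ufw evaluates rules in the order they were added, so add the allow rule first.
sudo ufw allow from 10.0.0.5 to any port 8983 proto tcp
sudo ufw deny 8983/tcp
sudo ufw enable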
Solr is shipped with an init.d service installation script.
ℹ️ For further options and explanations see Taking Solr to Production in the Apache Solr online documentation.
Extract install_solr_service.sh from the Solr distribution archive (in this case solr-7.0.0.tgz) and run it:
tar xzf solr-7.0.0.tgz solr-7.0.0/bin/install_solr_service.sh --strip-components=2
sudo bash ./install_solr_service.sh solr-7.0.0.tgz -d /path/to/UNZIPPED_SOLR_INDEX_FOLDER/PROJECT_NAME -n
This script automatically extracts the distribution archive to /opt/.
You can use the installed service with:
sudo service solr status | start | stop | restart
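To verify that Solr is reachable after starting the service, you can query its system info endpoint (assuming Solr runs on the default port 8983 on the same host):

curl "http://localhost:8983/solr/admin/info/system?wt=json"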
Adapt the following service configuration to your system and save it under /etc/systemd/system/solr.service.
You can enable the service with
systemctl enable solr
and afterwards control it with
systemctl status | start | stop solr
systemd service file
[Unit]
Description=Starts and stops the Solr server.
After=network.target
[Service]
Environment="SOLR_HOME=/PATH/TO/SOLR.XML"
ExecStart=/opt/solr/bin/solr start -s ${SOLR_HOME} -p 8983
ExecStop=/opt/solr/bin/solr stop -p 8983
User=solr
Group=solr
RemainAfterExit=yes
[Install]
WantedBy=multi-user.target
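After saving the unit file, make systemd pick it up and start the service, for example:

sudo systemctl daemon-reload
sudo systemctl enable solr
sudo systemctl start solr
sudo systemctl status solr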
The CONFIGURATION-NAME used below is the name of the Maven build artifact, e.g. config.example-1.0.0-SNAPSHOT.tar.gz. The following script checks the SVN data repository for updates, backs up the current Solr index, and triggers a full data import:
#!/bin/bash

function log() {
    echo "$(date): $1";
}

function check() {
    if [ $? -ne 0 ]; then
        log "$1";
        exit 1;
    fi;
}

# check if there is an update in the data
log "Checking SVN status";
REMOTE_REPO="https://REMOTE-SVN-HOST/SVN/PATH"
LOCAL_REPO="/local/svn/path"
USER_REPO="USER_NAME"
PASSW_REPO="****"
REMOTE_REV=$(svn info --username $USER_REPO --password $PASSW_REPO $REMOTE_REPO --no-auth-cache | grep '^Revision:' | awk '{print $2}')
LOCAL_REV=$(svn info $LOCAL_REPO | grep '^Revision:' | awk '{print $2}')

#if [ $REMOTE_REV -eq $LOCAL_REV ]; then
#    log "No updated data found. Done.";
#else

TODAY=$(date +"%Y-%m-%d-%H-%M-%S");
BACKUP="/srv/PROJECT_NAME/solrIndex/backup";

# update data
#log "Updating SVN";
#svn update --username $USER_REPO --password $PASSW_REPO $LOCAL_REPO --no-auth-cache;
#check "ERROR: SVN update failed";

# archive current index
log "Backup current index";
mkdir "$BACKUP/$TODAY";
cp /srv/PROJECT_NAME/solrIndex/CONFIGURATION-NAME/PROJECT_NAME/data/index/* "$BACKUP/$TODAY/";
check "ERROR: Backup failed";

# remove old backups, keeping only the six most recent ones
find "$BACKUP" -mindepth 1 -maxdepth 1 -type d -printf '%T@ %i\n' | sort -n | head -n -6 | while read tstamp inode
do
    find "$BACKUP" -inum "$inode" -exec rm -rf {} \;
done

# build new index
log "Building new index";
curl "http://localhost:8983/solr/PROJECT_NAME/dataimport?command=full-import";
# todo check if failed;
log "Done.";
#fi;
#EOF
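To run the update regularly, the script can be scheduled via cron (a sketch; the script location /srv/PROJECT_NAME/bin/update-index.sh is a placeholder and has to match wherever you saved the script):

# Run the index update every night at 02:00 and append the output to a log file.
# Add this entry with `crontab -e` for a user that may write to the index and backup directories.
0 2 * * * /srv/PROJECT_NAME/bin/update-index.sh >> /var/log/PROJECT_NAME-index-update.log 2>&1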