Starting from Solr 9, Solr no longer includes the Data Import Handler (DIH) package. Furthermore, as of July 2022, no one has committed to maintaining and updating it regularly. Yet it had a large user base, which is now looking for alternatives.
One of the main goals of DIH was to index databases easily into Solr.
In this page, we explain how to do it with Datafari Community Edition, the open source version of Datafari Enterprise Search. Thanks to several years of work, Datafari seamlessly integrates much more than what you need to replace DIH, but here we focus on simply replacing DIH in a scenario with Solr. Out of the box, and focusing on our DIH replacement scenario, Datafari 5.2 contains:
Apache ManifoldCF, a framework of connectors (ManifoldCF is written in Java, and the output connection to Solr is configured out of the box in Datafari in SolrCloud mode, so it acts as a smart client able to index millions of documents into Solr very quickly)
Solr 8, to index the data crawled by MCF
the Datafari webapp, with a user interface to search the data (a bonus, since it is not needed in the DIH replacement scenario)
There are three main steps :
Download and install Datafari
(optional - not needed for PostgreSQL) Add the JDBC driver that corresponds to your database. We are not allowed to ship it with Datafari due to licensing issues (for MariaDB or MySQL, for example); the PostgreSQL JDBC driver, on the other hand, is already included.
NB: the next version of Datafari (5.3) will include by default the JDBC drivers for Microsoft SQL Server and Oracle.
Create your crawl job
Right after these steps, you can begin to search into your data !
Let’s explain these steps :
For this tutorial, we use two instances on Scaleway :
GPS1 8x86 64 bit, 32 GB RAM, 300 GB NVMe, with Debian 11
IP Datafari server : 51.158.69.126
IP MySQL server : 163.172.184.196
We will install Datafari on one instance, and a MySQL server 8 on the other one. Note that the installation of MySQL and the ingestion of a dataset into it are out of scope of this tutorial, and as such they are detailed in an annex.
1. Download and install Datafari
Connect to your server via SSH
Download the latest stable version of Datafari:
wget https://www.datafari.com/files/debian/datafari.deb
To install the dependencies of Datafari, download our convenience script and let it install them automatically:
wget https://www.datafari.com/files/scripts_init_datafari/init_server_datafari_5_debian_10_plus.sh
Now execute the init script :
source init_server_datafari_5_debian_10_plus.sh
Install Datafari :
dpkg -i datafari.deb
Init Datafari:
cd /opt/datafari/bin
bash init-datafari.sh
And Voilà! Datafari is installed and functional. You can connect to https://$IP_OF_YOUR_DATAFARI_SERVER/datafariui
In our example : https://51.158.69.126/datafariui
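As a quick command-line check that Datafari is answering, you can probe the UI endpoint with curl. This is a minimal sketch using the IP of our example; the -k flag is needed because Datafari ships with a self-signed certificate:

```shell
# Print the HTTP status code returned by DatafariUI (200, or a redirect code).
# -k skips certificate verification (self-signed certificate).
curl -sk -o /dev/null -w "%{http_code}\n" https://51.158.69.126/datafariui
```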
For more information see this page : Install Datafari - Community Edition
2. Install the JDBC driver into Datafari
In our example, we want to crawl a database hosted in MySQL version 8.0.29.
We download the JDBC driver from the Maven repository. The latest stable version at the time of writing is MySQL Connector/J 8.0.29: https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.29/mysql-connector-java-8.0.29.jar
cd /opt/datafari/mcf/mcf_home/connector-lib-proprietary
wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.29/mysql-connector-java-8.0.29.jar
chmod 775 /opt/datafari/mcf/mcf_home/connector-lib-proprietary/mysql*
chown datafari /opt/datafari/mcf/mcf_home/connector-lib-proprietary/mysql*
cp /opt/datafari/mcf/mcf_home/connector-lib-proprietary/mysql* /opt/datafari/tomcat-mcf/lib/
chmod 775 /opt/datafari/tomcat-mcf/lib/mysql*
chown datafari /opt/datafari/tomcat-mcf/lib/mysql*
Edit the file /opt/datafari/mcf/mcf_home/options.env.unix
nano /opt/datafari/mcf/mcf_home/options.env.unix
Add the path to the new lib in the -cp parameter line :
connector-lib-proprietary/mysql-connector-java-8.0.29.jar
The entire file (in Datafari 5.2) then looks like this:
-Xms512m -Xmx512m -Dorg.apache.manifoldcf.configfile=./properties.xml -Duser.timezone=UTC -Djute.maxbuffer=10000000 -Dorg.apache.manifoldcf.processid=A -Djava.io.tmpdir=/opt/datafari/tmp -cp .:lib/commons-codec-1.10.jar:connector-lib-proprietary/mysql-connector-java-8.0.29.jar:lib/commons-collections-3.2.2.jar:lib/mariadb-java-client-3.0.6.jar:/lib/commons-collections4-4.4.jar:lib/commons-el-1.0.jar:lib/commons-exec-1.3.jar:lib/commons-fileupload-1.3.3.jar:lib/commons-io-2.11.0.jar:lib/commons-lang-2.6.jar:lib/commons-lang3-3.12.0.jar:lib/commons-logging-1.2.jar:lib/ecj-4.3.1.jar:lib/gson-2.8.0.jar:lib/guava-25.1-jre.jar:lib/hsqldb-2.3.2.jar:lib/httpclient-4.5.13.jar:lib/httpcore-4.4.15.jar:lib/jasper-6.0.35.jar:lib/jasper-el-6.0.35.jar:lib/javax.activation-1.2.0.jar:lib/javax.activation-api-1.2.0.jar:lib/javax.mail-1.6.2.jar:lib/javax.servlet-api-3.1.0.jar:lib/jetty-client-9.4.25.v20191220.jar:lib/jetty-continuation-9.4.25.v20191220.jar:lib/jetty-http-9.4.25.v20191220.jar:lib/jetty-io-9.4.25.v20191220.jar:lib/jetty-jndi-9.4.25.v20191220.jar:lib/jetty-jsp-9.2.30.v20200428.jar:lib/jetty-jsp-jdt-2.3.3.jar:lib/jetty-plus-9.4.25.v20191220.jar:lib/jetty-schemas-3.1.M0.jar:lib/jetty-security-9.4.25.v20191220.jar:lib/jetty-server-9.4.25.v20191220.jar:lib/jetty-servlet-9.4.25.v20191220.jar:lib/jetty-util-9.4.25.v20191220.jar:lib/jetty-webapp-9.4.25.v20191220.jar:lib/jetty-xml-9.4.25.v20191220.jar:lib/jna-5.10.0.jar:lib/jna-platform-5.10.0.jar:lib/json-simple-1.1.1.jar:lib/jsp-api-2.1-glassfish-2.1.v20091210.jar:lib/juli-6.0.35.jar:lib/log4j-1.2-api-2.17.1.jar:lib/log4j-api-2.17.1.jar:lib/log4j-core-2.17.1.jar:lib/log4j-jcl-2.17.1.jar:lib/log4j-jul-2.17.1.jar:lib/log4j-slf4j-impl-2.17.1.jar:lib/mcf-agents.jar:lib/mcf-core.jar:lib/mcf-jetty-runner.jar:lib/mcf-pull-agent.jar:lib/mcf-script-engine.jar:lib/mcf-ui-core.jar:lib/postgresql-42.2.5.jar:lib/serializer-2.7.1.jar:lib/slf4j-api-1.7.36.jar:lib/velocity-engine-core-2.3.jar:lib/xalan-2.7.1.jar:lib/xercesImpl-2.10.0.jar:lib/xml-apis-1.4.01.jar:lib/zookeeper-3.4.10.jar
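Instead of editing the file by hand, the classpath entry can also be appended with sed. This is only a sketch, assuming the file contains a single `-cp .:` segment and that the jar name matches the one you downloaded:

```shell
# Insert the MySQL driver right after "-cp .:" in options.env.unix,
# unless it is already listed there.
OPTS=/opt/datafari/mcf/mcf_home/options.env.unix
JAR=connector-lib-proprietary/mysql-connector-java-8.0.29.jar
grep -q "$JAR" "$OPTS" || sed -i "s|-cp \.:|-cp .:$JAR:|" "$OPTS"
```

Re-running the command is harmless: the grep guard skips the sed when the jar is already in the classpath.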
For more information, see this page : Connector - Add a JDBC connector (MySQL, Oracle)
Restart Datafari. You have two options to do it:
Option 1 - Via the Datafari admin UI: go to the main page of the Datafari admin UI, then click on Services Administration and Restart.
Option 2 - By restarting Datafari via SSH :
cd /opt/datafari/bin
bash stop-datafari.sh
bash start-datafari.sh
For more information about restarting Datafari, see this page : /wiki/spaces/DATAFARI/pages/111903130
The final step is to configure the crawling job.
3. Configure the job
Go to the Datafari admin UI, then to Connectors → Data Crawler Simplified mode, and select Create Database job.
You will need to fill in all the mandatory parameters to configure the job.
To get some information about the different fields see this page : Database crawling simplified configuration
In our example, we crawl a dump of Wikipedia pages.
This is the structure :
And some data :
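If you want to inspect the structure and data yourself, you can query MySQL directly. This is a sketch assuming the standard MediaWiki `page` table, in which the columns of interest here are `page_id` and `page_title`:

```shell
# Show the columns of the page table, then a few sample rows.
mysql -u root -p wiki -e "DESCRIBE page;"
mysql -u root -p wiki -e "SELECT page_id, page_namespace, page_title FROM page LIMIT 5;"
```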
The configuration is this one :
Database type : MySQL
Database host : 163.172.184.196
Database name : wiki
User : datafari (the user created for remote access in the annex)
Password : admin
Seeding query :
SELECT page_id AS $(IDCOLUMN) FROM page
Version query :
SELECT page_id AS $(IDCOLUMN), page_id AS $(VERSIONCOLUMN) FROM page WHERE page_id IN $(IDLIST)
Data query :
SELECT page_id AS $(IDCOLUMN), page_id AS $(URLCOLUMN), page_title AS $(DATACOLUMN) FROM page WHERE page_id IN $(IDLIST)
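Before saving the job, you can sanity-check the three queries directly against MySQL, replacing the $(…) placeholders by hand. The alias and the literal id list below are just for testing; ManifoldCF substitutes the placeholders itself at crawl time:

```shell
# Seeding query, with $(IDCOLUMN) replaced by a test alias.
mysql -u root -p wiki -e "SELECT page_id AS id FROM page LIMIT 5;"
# Version/data queries, with $(IDLIST) replaced by a literal list of ids.
mysql -u root -p wiki -e "SELECT page_id, page_title FROM page WHERE page_id IN (10, 12);"
```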
Source name : db
Repository name : mysqlrepo
Start the job once created : check the box
Finally, click on the Save button
Then you can check that everything is OK in your MCF UI: https://$IP_OF_YOUR_DATAFARI_SERVER/datafari-mcf-crawler-ui/
In our example : https://51.158.69.126/datafari-mcf-crawler-ui/
Depending on the number of rows in your database, the job will first be in "starting up" mode, while it gathers all the ids of the documents to crawl. Only then will the "real" crawling start, and the job will switch to "running" mode.
If you need to modify your job, simply edit it then relaunch the job.
Finally, go to Datafari and search your data (optional: you do not need DatafariUI, and you can keep doing what you were doing before when combining DIH and Solr):
Go to https://$IP_OF_DATAFARI_SERVER/datafariui
In our example : https://51.158.69.126/datafariui
You now have a complete search solution as an alternative to DIH, based on Apache ManifoldCF and Apache Solr.
Annexes
Here we detail the steps we followed to install the MySQL server on the Scaleway instance.
Connect to the instance via SSH
Get MySQL Server 8 :
apt update
wget https://dev.mysql.com/get/mysql-apt-config_0.8.22-1_all.deb
apt install ./mysql-apt-config_0.8.22-1_all.deb
apt update
apt install mysql-server
Check that MySQL started correctly:
service mysql status
Get the SQL dump of English Wikipedia pages :
wget http://dumps.wikimedia.org/enwiki/latest/enwiki-latest-page.sql.gz
Uncompress it :
gzip -d enwiki-latest-page.sql.gz
Create the database and change encoding :
mysql -u root -p
CREATE DATABASE wiki;
USE wiki;
ALTER DATABASE wiki CHARACTER SET = utf8mb4 COLLATE = utf8mb4_unicode_ci;
Import the data into the database :
mysql -u root -p wiki < enwiki-latest-page.sql
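The import can take a long time, since the English Wikipedia page table contains tens of millions of rows. Once it is done, a quick count confirms the data landed; a sketch:

```shell
# The count should be in the tens of millions for the English Wikipedia dump.
mysql -u root -p wiki -e "SELECT COUNT(*) AS nb_pages FROM page;"
```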
Change the MySQL configuration to allow remote connections. To do so, edit the file /etc/mysql/mysql.conf.d/mysqld.cnf:
nano /etc/mysql/mysql.conf.d/mysqld.cnf
And add this line at the end of the file :
bind-address = 0.0.0.0
i.e. here we allow any IP to connect to the MySQL server; this is for demo purposes only!
Create a new user into the database :
mysql -u root -p
CREATE USER 'datafari'@'51.158.69.126' IDENTIFIED BY 'admin';
GRANT ALL PRIVILEGES ON *.* TO 'datafari'@'51.158.69.126';
In this example, the name of the user is datafari and the password is admin. We allow the datafari user to connect to the MySQL database from the location of our Datafari server: 51.158.69.126. We granted all privileges to the datafari user; once again, this is just for demo purposes.
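From the Datafari server (51.158.69.126 in our example), you can then check that the remote account actually works before configuring the crawl job. This is a sketch, assuming a MySQL client is installed on that machine:

```shell
# Run this from the Datafari server; it should print the server version
# if bind-address and the user grants are set up correctly.
mysql -h 163.172.184.196 -u datafari -padmin -e "SELECT VERSION();"
```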