Start a search engine (Docker)
Start a smart contract search engine for your own DApp
This documentation details how you can start, and host, your own smart contract search engine using Docker. If you would like to build from scratch from a fresh Ubuntu install, please refer to this document.
Prerequisite
We start from a fresh install of Ubuntu 18.04. You should first follow the instructions here to install Docker.
Next install the Python pip and AWS CLI utilities as follows. The AWS CLI is required to access AWS ElasticSearch services.
$ sudo apt update
$ sudo apt install python-pip
$ sudo apt install awscli
ElasticSearch
We use the AWS ElasticSearch services to run the search engine. You should create a new ES cluster here. For now, a single machine development cluster would suffice. In the Access Policy section, please select IAM users. You will need an IAM user already set up to access AWS ES services. Here is an example.
arn:aws:iam::522901590065:user/secondstatesearch
Once the ElasticSearch service is up and running, you should have an ES endpoint like the following.
search-smart-contract-search-engine-3paomceha6u4qzchkmbgsjdcqa.us-east-1.es.amazonaws.com
Docker
Now, go back to the Ubuntu 18.04 machine.
AWS credentials
Configure AWS CLI to access the ElasticSearch engine.
$ aws configure
It requires four pieces of information. The access keys can be found in the IAM user console for the user you configured to access the ElasticSearch engine you just created.
AWS Access Key ID [None]: [IAM user console]
AWS Secret Access Key [None]: [IAM user console]
Default region name [None]: [Region for ES instance. eg us-east-1]
Default output format [None]: json
After configuration, the AWS config and credentials are placed in ~/.aws/.
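The resulting files follow the standard AWS CLI layout. The values below are placeholders (the access keys shown are AWS's documented example keys, not real credentials):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

# ~/.aws/config
[default]
region = us-east-1
output = json
```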
Configure search engine
Next, get the source code for the search engine.
$ git clone https://github.com/second-state/smart-contract-search-engine.git
$ cd smart-contract-search-engine
Fill in the following configuration options.
ServerName in the Apache config file config/site.conf. This could be your public IP address for now.
The blockchain, elasticsearch, and initial ABI configs in python/config.ini.
publicIp in js/secondStateJS.js. This could be your IP address for now.
Check here for details about configurations.
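For example, the ServerName edit in config/site.conf is a standard Apache directive. The sketch below is illustrative only; the IP address is a placeholder, and the surrounding VirtualHost block in the repository's file may differ:

```apacheconf
# config/site.conf (sketch) — point ServerName at your host's public IP
<VirtualHost *:80>
    ServerName 123.45.67.89
</VirtualHost>
```

The same IP address should then be used for publicIp in js/secondStateJS.js, and your ES endpoint goes into python/config.ini.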
Build Docker image
$ docker build -f docker/Dockerfile -t search-engine .
Run Docker container
$ docker run -d -it --rm -p 80:80 -v $HOME/.aws:/root/.aws search-engine
Now you can visit http://<your_host> to check your smart contract search engine. Be patient, as it may take hours before the results show up on that page.
Upload more ABIs
Your search engine starts with a single ABI to index, taken from the config.ini file. You can add more ABIs to the index by executing the following script from inside the Docker container.
You can find the container_id of your Docker container on the host OS by running
$ docker container ls
Next, log into your Docker container using the container_id.
$ docker exec -it container_id bash
Once logged in, create a file named upload_abi.py in the /app directory with the following contents.
import json
import time
import requests

from harvest import Harvest

harvester = Harvest()

# Fetch the ABI (a raw text file containing only the ABI's JSON text)
abiUrl1 = "http://A_raw_text_file_which_contains_only_an_abi's_text"
abiData1 = requests.get(abiUrl1).content
abiData1JSON = json.loads(abiData1)

# Hash the ABI deterministically, so the same ABI always maps to the same document id
theDeterministicHash1 = harvester.shaAnAbi(abiData1JSON)

# Canonicalize the ABI into the text form that the search engine indexes
cleanedAndOrderedAbiText1 = harvester.cleanAndConvertAbiToText(abiData1JSON)

data1 = {}
data1['indexInProgress'] = "false"
data1['epochOfLastUpdate'] = int(time.time())
data1['abi'] = cleanedAndOrderedAbiText1

# Index the ABI document into the ABI index on ElasticSearch
harvester.es.index(index=harvester.abiIndex, id=theDeterministicHash1, body=data1)
Then run
$ python3.6 upload_abi.py
Once all of this is done, exit the Docker container and restart it.
$ docker restart container_id
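The deterministic hash used above is what lets the same ABI be re-uploaded without creating duplicate documents: identical ABIs always map to the same ElasticSearch document id. The snippet below is a minimal illustration of that idea, not the repository's actual shaAnAbi implementation — it canonicalizes the JSON (sorted keys, fixed separators) before hashing, so key order does not change the result:

```python
import hashlib
import json

def deterministic_abi_hash(abi):
    # Serialize with sorted keys and fixed separators so that logically
    # identical ABIs always produce identical bytes, then hash those bytes
    canonical = json.dumps(abi, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# The same ABI written with different key orderings hashes identically
abi_a = [{"name": "transfer", "type": "function", "inputs": []}]
abi_b = [{"inputs": [], "type": "function", "name": "transfer"}]

assert deterministic_abi_hash(abi_a) == deterministic_abi_hash(abi_b)
```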