diff --git a/README.md b/README.md
index e5e6468a..8c647be7 100644
--- a/README.md
+++ b/README.md
@@ -2,7 +2,8 @@

 **⚠️ This service is currently in /v0. Breaking changes are coming in /v1 (also possible, but not likely, within /v0 releases). Please use with caution.**

-This is an API aimed at retrieving data from blockchain quickly and conveniently. We support public APIs for:
+This is an API aimed at retrieving data from the blockchain quickly and conveniently. We support public APIs for:
+
 - Mainnet - [https://api.wavesplatform.com/v0/](https://api.wavesplatform.com/v0/)
 - Testnet
@@ -10,7 +11,6 @@ This is an API aimed at retrieving data from blockchain quickly and conveniently

 Visit `/docs` for Swagger documentation.

-
 ## Data service on-premise

 It is possible to create your own instance of this service. To do so, follow the guide below.

@@ -19,26 +19,26 @@ It is possible to create your own instance of this service. To do so, follow the

 1. PostgreSQL 11 database with a table structure found in [wavesplatform/blockchain-postgres-sync](https://github.com/wavesplatform/blockchain-postgres-sync)
 2. Downloaded and continuously updated blockchain data in the database
-2. NodeJS or Docker for either running the service directly, or in a container
+3. NodeJS or Docker for either running the service directly, or in a container

 #### Installation and start

 The service uses the following environment variables:

-|Env variable|Default|Required|Description|
-|------------|-------|--------|-----------|
-|`PORT`|3000|NO|HTTP service port|
-|`PGHOST`||YES|Postgres host address|
-|`PGPORT`|`5432`|NO|Postgres port|
-|`PGDATABASE`||YES|Postgres database name|
-|`PGUSER`||YES|Postgres user name|
-|`PGPASSWORD`||YES|Postgres password|
-|`PGPOOLSIZE`|`20`|NO|Postgres pool size|
-|`PGSTATEMENTTIMEOUT`|false|NO|Postgres `statement_timeout` number in ms. 0 disables timeout, false — use server settings; at this moment used only as default `STATEMENT_TIMEOUT`|
-|`LOG_LEVEL`|`info`|NO|Log level `['info','warn','error']`|
-|`DEFAULT_MATCHER`||YES|Default matcher public address|
-|`MATCHER_SETTINGS_URL`||NO|Default matcher URL for getting settings|
-|`DEFAULT_TIMEOUT`|30000|NO|Default timeout in ms; at this moment used only as `PG STATEMENT_TIMEOUT`|
+| Env variable           | Default | Required | Description                                                                                                                                          |
+| ---------------------- | ------- | -------- | ---------------------------------------------------------------------------------------------------------------------------------------------------- |
+| `PORT`                 | 3000    | NO       | HTTP service port                                                                                                                                    |
+| `PGHOST`               |         | YES      | Postgres host address                                                                                                                                |
+| `PGPORT`               | `5432`  | NO       | Postgres port                                                                                                                                        |
+| `PGDATABASE`           |         | YES      | Postgres database name                                                                                                                               |
+| `PGUSER`               |         | YES      | Postgres user name                                                                                                                                   |
+| `PGPASSWORD`           |         | YES      | Postgres password                                                                                                                                    |
+| `PGPOOLSIZE`           | `20`    | NO       | Postgres pool size                                                                                                                                   |
+| `PGSTATEMENTTIMEOUT`   | false   | NO       | Postgres `statement_timeout` number in ms. 0 disables timeout, false — use server settings; at this moment used only as default `STATEMENT_TIMEOUT`  |
+| `LOG_LEVEL`            | `info`  | NO       | Log level `['info','warn','error']`                                                                                                                  |
+| `DEFAULT_MATCHER`      |         | YES      | Default matcher public address                                                                                                                       |
+| `MATCHER_SETTINGS_URL` |         | NO       | Default matcher URL for getting settings                                                                                                             |
+| `DEFAULT_TIMEOUT`      | 30000   | NO       | Default timeout in ms; at this moment used only as `PG STATEMENT_TIMEOUT`                                                                            |

 `PGPOOLSIZE` is used by the `pg-pool` library to determine Postgres connection pool size per NodeJS process instance. A good value depends on your server and db configuration and can be found empirically. You can leave it at the default value to start with.
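For reference, a minimal `variables.env` (described just below) might look like the following sketch. Every value here is a placeholder for a hypothetical local setup and must be adjusted before use:

```bash
# Hypothetical example values — replace all of them with your own
PORT=3000
PGHOST=localhost
PGPORT=5432
PGDATABASE=mainnet
PGUSER=dataservice
PGPASSWORD=<password>
PGPOOLSIZE=20
LOG_LEVEL=info
DEFAULT_MATCHER=<matcher public key>
```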
@@ -47,23 +47,27 @@ Set those variables to a `variables.env` file in the root of the project for convenience.

 If you would like to use some other way of setting environment variables, just replace relevant commands below with custom alternatives.

 ##### Docker
+
 If you wish to build the data-service image locally, run this command from the project root
-  ```bash
-  docker build -t wavesplatform/data-service .
-  ```
+
+```bash
+docker build -t wavesplatform/data-service .
+```

 Otherwise, you can use our public image from https://hub.docker.com/r/wavesplatform/data-service

 Run the container using this command:
-  ```bash
-  docker run -p=<port>:3000 --env-file=variables.env wavesplatform/data-service
-  ```
+
+```bash
+docker run -p=<port>:3000 --env-file=variables.env wavesplatform/data-service
+```

 A server will start at `localhost:<port>` (`<port>` is the value used in the `docker run` command). Logs will be handled by Docker. Use any other Docker options if necessary.

 When using the container in production, we recommend establishing a Docker logging and restart policy.

 ##### NodeJS
+
 1. Install dependencies
    ```bash
    npm install # or `yarn install`, if you prefer
    ```
@@ -81,28 +85,30 @@ Server will start at `localhost:PORT` (defaults to 3000). Logs will be directed

 If you decide to use NodeJS directly (without Docker), we recommend using a process manager, such as `pm2`.

-
 #### Daemons

-To add candles and pairs functionality the following Docker daemons must be used:
-- Candles — calculate candles for exchange transactions (see [description](https://hub.docker.com/r/wavesplatform/data-service-candles/))
-- Pairs — calculate last pairs for 24h exchange transactions (see [description](https://hub.docker.com/r/wavesplatform/data-service-pairs/))
+To add pairs functionality, the following Docker daemon must be used:
+
+- Pairs — calculates pair data for the last 24h of exchange transactions (see [description](https://hub.docker.com/r/wavesplatform/data-service-pairs/))

 #### Documentation
+
 You can run your own instance of Swagger online documentation. To do this, you have to:
+
 1. Build the Docker image from the docs/ directory:
-   ```bash
-   docker build -t wavesplatform/data-service-docs docs/
-   ```
+   ```bash
+   docker build -t wavesplatform/data-service-docs docs/
+   ```
 2. Run the container
-   ```bash
-   docker run --rm -d -p 8080:8080 -e SWAGGER_JSON=/app/openapi.json wavesplatform/data-service-docs
-   ```
+   ```bash
+   docker run --rm -d -p 8080:8080 -e SWAGGER_JSON=/app/openapi.json wavesplatform/data-service-docs
+   ```

 It will start the documentation server at `localhost:8080`. Enjoy!

 #### General recommendations
+
 - Set up a dedicated web server such as Nginx in front of data-service backends (for SSL/caching/balancing);
 - Implement a caching strategy. Different endpoints may need different cache times (or no cache at all);
 - Run several process instances behind a load balancer per machine. `docker-compose --scale` can help with that, or it can be done manually. A good rule of thumb is to use as many instances as there are CPU cores available (see the sketch below);
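To illustrate the last recommendation, one possible way to scale with Docker Compose. This assumes a hypothetical `docker-compose.yml` that defines the service under the name `data-service`; the service name and core count are placeholders:

```bash
# Start one instance per CPU core behind Compose's built-in round-robin DNS.
# "data-service" is a hypothetical service name from your docker-compose.yml.
docker-compose up -d --scale data-service=$(nproc)
```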
diff --git a/docs/openapi.json b/docs/openapi.json
index fc618076..22dfb92f 100644
--- a/docs/openapi.json
+++ b/docs/openapi.json
@@ -4703,7 +4703,7 @@
         "tags": [
           "transactions"
         ],
-        "summary": "Get a Invoke-script transaction info by id",
+        "summary": "Get an Invoke-script transaction info by id",
         "operationId": "getTxsInvokeScript",
         "parameters": [
           {
@@ -4964,6 +4964,518 @@
           }
         }
       },
+    "/transactions/update-asset-info/{id}": {
+      "get": {
+        "tags": [
+          "transactions"
+        ],
+        "summary": "Get an Update asset info transaction by id",
+        "operationId": "getTxsUpdateAssetInfo",
+        "parameters": [
+          {
+            "name": "id",
+            "in": "path",
+            "description": "transaction ID",
+            "required": true,
+            "schema": {
+              "type": "string"
+            }
+          }
+        ],
+        "responses": {
+          "200": {
+            "description": "Update asset info transaction",
+            "content": {
+              "application/json": {
+                "schema": {
+                  "$ref": "#/components/schemas/TxUpdateAssetInfo"
+                }
+              }
+            }
+          },
+          "404": {
+            "description": "Transaction not found"
+          }
+        }
+      }
+    },
+    "/transactions/update-asset-info": {
+      "get": {
+        "tags": [
+          "transactions"
+        ],
+        "summary": "Get a list of Update asset info transactions by applying filters",
+        "operationId": "searchTxsUpdateAssetInfo",
+        "parameters": [
+          {
+            "in": "query",
+            "name": "ids",
+            "description": "Transaction IDs array",
+            "required": false,
+            "schema": {
+              "type": "array",
+              "items": {
+                "type": "string"
+              }
+            }
+          },
+          {
+            "in": "query",
+            "name": "sender",
+            "description": "Sender address of the transaction; mutually exclusive with senders",
+            "schema": {
+              "type": "string"
+            }
+          },
+          {
+            "in": "query",
+            "name": "senders",
+            "description": "Array of sender addresses of the transaction; mutually exclusive with sender",
+            "schema": {
+              "type": "array",
+              "items": {
+                "type": "string"
+              }
+            }
+          },
+          {
+            "in": "query",
+            "name": "assetId",
+            "description": "Filter transactions by assetId",
+            "schema": {
+              "type": "string"
+            }
+          },
+          {
+            "in": "query",
+            "name": "timeStart",
+            "description": "Time range filter, start. Defaults to the first transaction's time_stamp in the db. (ISO-8601 or timestamp in milliseconds, UTC)",
+            "schema": {
+              "oneOf": [
+                {
+                  "type": "string",
+                  "format": "date-time"
+                },
+                {
+                  "type": "number"
+                }
+              ]
+            }
+          },
+          {
+            "in": "query",
+            "name": "timeEnd",
+            "description": "Time range filter, end. Defaults to now. (ISO-8601 or timestamp in milliseconds, UTC)",
+            "schema": {
+              "oneOf": [
+                {
+                  "type": "string",
+                  "format": "date-time"
+                },
+                {
+                  "type": "number"
+                }
+              ]
+            }
+          },
+          {
+            "in": "query",
+            "name": "after",
+            "description": "Cursor in base64 encoding. Holds information about timestamp, id, sort.",
+            "schema": {
+              "type": "string"
+            }
+          },
+          {
+            "in": "query",
+            "name": "sort",
+            "description": "Sort order. 
Overridden by the cursor's sort, if a cursor is present.",
+            "schema": {
+              "type": "string",
+              "enum": [
+                "asc",
+                "desc"
+              ],
+              "default": "desc"
+            }
+          },
+          {
+            "in": "query",
+            "name": "limit",
+            "description": "How many transactions to return in the response.",
+            "schema": {
+              "type": "integer",
+              "minimum": 1,
+              "maximum": 100,
+              "default": 100
+            }
+          }
+        ],
+        "responses": {
+          "200": {
+            "description": "List of UpdateAssetInfo transactions satisfying provided filters",
+            "content": {
+              "application/json": {
+                "schema": {
+                  "$ref": "#/components/schemas/ListOfTxUpdateAssetInfo"
+                }
+              }
+            }
+          }
+        }
+      },
+      "post": {
+        "tags": [
+          "transactions"
+        ],
+        "summary": "Get a list of Update asset info transactions by applying filters",
+        "operationId": "postSearchTxsUpdateAssetInfo",
+        "requestBody": {
+          "required": false,
+          "content": {
+            "application/json": {
+              "schema": {
+                "type": "object",
+                "properties": {
+                  "ids": {
+                    "type": "array",
+                    "description": "Transaction IDs array",
+                    "items": {
+                      "type": "string"
+                    }
+                  },
+                  "sender": {
+                    "type": "string",
+                    "description": "Sender address of the transaction"
+                  },
+                  "timeStart": {
+                    "oneOf": [
+                      {
+                        "type": "string",
+                        "format": "date-time"
+                      },
+                      {
+                        "type": "number"
+                      }
+                    ],
+                    "description": "Time range filter, start. Defaults to the first transaction's time_stamp in the db. (ISO-8601 or timestamp in milliseconds, UTC)",
+                    "example": "2019-01-01T00:00:00.000"
+                  },
+                  "timeEnd": {
+                    "oneOf": [
+                      {
+                        "type": "string",
+                        "format": "date-time"
+                      },
+                      {
+                        "type": "number"
+                      }
+                    ],
+                    "description": "Time range filter, end. Defaults to now. (ISO-8601 or timestamp in milliseconds, UTC)",
+                    "example": "2020-01-01T00:00:00.000"
+                  },
+                  "after": {
+                    "type": "string",
+                    "description": "Cursor in base64 encoding. Holds information about timestamp, id, sort."
+                  },
+                  "sort": {
+                    "type": "string",
+                    "description": "Sort order. 
Overridden by the cursor's sort, if a cursor is present.",
+                    "enum": [
+                      "asc",
+                      "desc"
+                    ],
+                    "default": "desc"
+                  },
+                  "limit": {
+                    "type": "integer",
+                    "description": "How many transactions to return in the response.",
+                    "minimum": 1,
+                    "maximum": 100,
+                    "default": 100,
+                    "example": 100
+                  }
+                }
+              }
+            }
+          }
+        },
+        "responses": {
+          "200": {
+            "description": "List of UpdateAssetInfo transactions satisfying provided filters",
+            "content": {
+              "application/json": {
+                "schema": {
+                  "$ref": "#/components/schemas/ListOfTxUpdateAssetInfo"
+                }
+              }
+            }
+          }
+        }
+      }
+    },
+    "/transactions/ethereum-like/{id}": {
+      "get": {
+        "tags": [
+          "transactions"
+        ],
+        "summary": "Get an Ethereum-like transaction info by id",
+        "operationId": "getTxsEthereumLike",
+        "parameters": [
+          {
+            "name": "id",
+            "in": "path",
+            "description": "transaction ID",
+            "required": true,
+            "schema": {
+              "type": "string"
+            }
+          }
+        ],
+        "responses": {
+          "200": {
+            "description": "Ethereum-like transaction",
+            "content": {
+              "application/json": {
+                "schema": {
+                  "$ref": "#/components/schemas/TxEthereumLike"
+                }
+              }
+            }
+          },
+          "404": {
+            "description": "Transaction not found"
+          }
+        }
+      }
+    },
+    "/transactions/ethereum-like": {
+      "get": {
+        "tags": [
+          "transactions"
+        ],
+        "summary": "Get a list of Ethereum-like transactions by applying filters",
+        "operationId": "searchTxsEthereumLike",
+        "parameters": [
+          {
+            "in": "query",
+            "name": "ids",
+            "description": "Transaction IDs array",
+            "required": false,
+            "schema": {
+              "type": "array",
+              "items": {
+                "type": "string"
+              }
+            }
+          },
+          {
+            "in": "query",
+            "name": "sender",
+            "description": "Sender address of the transaction; mutually exclusive with senders",
+            "schema": {
+              "type": "string"
+            }
+          },
+          {
+            "in": "query",
+            "name": "senders",
+            "description": "Array of sender addresses of the transaction; mutually exclusive with sender",
+            "schema": {
+              "type": "array",
+              "items": {
+                "type": "string"
+              }
+            }
+          },
+          {
+            "in": "query",
+            "name": "timeStart",
+            "description": "Time range filter, start. Defaults to the first transaction's time_stamp in the db. (ISO-8601 or timestamp in milliseconds, UTC)",
+            "schema": {
+              "oneOf": [
+                {
+                  "type": "string",
+                  "format": "date-time"
+                },
+                {
+                  "type": "number"
+                }
+              ]
+            }
+          },
+          {
+            "in": "query",
+            "name": "timeEnd",
+            "description": "Time range filter, end. Defaults to now. (ISO-8601 or timestamp in milliseconds, UTC)",
+            "schema": {
+              "oneOf": [
+                {
+                  "type": "string",
+                  "format": "date-time"
+                },
+                {
+                  "type": "number"
+                }
+              ]
+            }
+          },
+          {
+            "in": "query",
+            "name": "type",
+            "description": "Transaction type.",
+            "schema": {
+              "type": "string",
+              "enum": [
+                "invocation",
+                "transfer"
+              ]
+            }
+          },
+          {
+            "in": "query",
+            "name": "function",
+            "description": "Search transactions by the called function name. Only for `invocation` type transactions.",
+            "schema": {
+              "type": "string"
+            }
+          },
+          {
+            "in": "query",
+            "name": "after",
+            "description": "Cursor in base64 encoding. Holds information about timestamp, id, sort.",
+            "schema": {
+              "type": "string"
+            }
+          },
+          {
+            "in": "query",
+            "name": "sort",
+            "description": "Sort order. 
Overridden by the cursor's sort, if a cursor is present.",
+            "schema": {
+              "type": "string",
+              "enum": [
+                "asc",
+                "desc"
+              ],
+              "default": "desc"
+            }
+          },
+          {
+            "in": "query",
+            "name": "limit",
+            "description": "How many transactions to return in the response.",
+            "schema": {
+              "type": "integer",
+              "minimum": 1,
+              "maximum": 100,
+              "default": 100
+            }
+          }
+        ],
+        "responses": {
+          "200": {
+            "description": "List of EthereumLike transactions satisfying provided filters",
+            "content": {
+              "application/json": {
+                "schema": {
+                  "$ref": "#/components/schemas/ListOfTxEthereumLike"
+                }
+              }
+            }
+          }
+        }
+      },
+      "post": {
+        "tags": [
+          "transactions"
+        ],
+        "summary": "Get a list of Ethereum-like transactions by applying filters",
+        "operationId": "postSearchTxsEthereumLike",
+        "requestBody": {
+          "required": false,
+          "content": {
+            "application/json": {
+              "schema": {
+                "type": "object",
+                "properties": {
+                  "ids": {
+                    "type": "array",
+                    "description": "Transaction IDs array",
+                    "items": {
+                      "type": "string"
+                    }
+                  },
+                  "sender": {
+                    "type": "string",
+                    "description": "Sender address of the transaction"
+                  },
+                  "timeStart": {
+                    "oneOf": [
+                      {
+                        "type": "string",
+                        "format": "date-time"
+                      },
+                      {
+                        "type": "number"
+                      }
+                    ],
+                    "description": "Time range filter, start. Defaults to the first transaction's time_stamp in the db. (ISO-8601 or timestamp in milliseconds, UTC)",
+                    "example": "2019-01-01T00:00:00.000"
+                  },
+                  "timeEnd": {
+                    "oneOf": [
+                      {
+                        "type": "string",
+                        "format": "date-time"
+                      },
+                      {
+                        "type": "number"
+                      }
+                    ],
+                    "description": "Time range filter, end. Defaults to now. (ISO-8601 or timestamp in milliseconds, UTC)",
+                    "example": "2020-01-01T00:00:00.000"
+                  },
+                  "after": {
+                    "type": "string",
+                    "description": "Cursor in base64 encoding. Holds information about timestamp, id, sort."
+                  },
+                  "sort": {
+                    "type": "string",
+                    "description": "Sort order. 
Overridden by the cursor's sort, if a cursor is present.",
+                    "enum": [
+                      "asc",
+                      "desc"
+                    ],
+                    "default": "desc"
+                  },
+                  "limit": {
+                    "type": "integer",
+                    "description": "How many transactions to return in the response.",
+                    "minimum": 1,
+                    "maximum": 100,
+                    "default": 100,
+                    "example": 100
+                  }
+                }
+              }
+            }
+          }
+        },
+        "responses": {
+          "200": {
+            "description": "List of EthereumLike transactions satisfying provided filters",
+            "content": {
+              "application/json": {
+                "schema": {
+                  "$ref": "#/components/schemas/ListOfTxEthereumLike"
+                }
+              }
+            }
+          }
+        }
+      }
+    },
     "/aliases/{alias}": {
       "get": {
         "tags": [
@@ -5829,6 +6341,9 @@
         "signature": {
           "type": "string"
         },
+        "applicationStatus": {
+          "type": "string"
+        },
         "order1": {
           "$ref": "#/components/schemas/Order"
         },
@@ -5924,6 +6439,9 @@
             "type": "string"
           }
         },
+        "applicationStatus": {
+          "type": "string"
+        },
         "data": {
           "type": "array",
           "items": {
@@ -6024,6 +6542,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "assetId": {
          "type": "string"
        },
@@ -6096,6 +6617,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "recipient": {
          "type": "string"
        },
@@ -6157,6 +6681,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "recipient": {
          "type": "string"
        },
@@ -6211,6 +6738,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "leaseId": {
          "type": "string"
        }
@@ -6263,6 +6793,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "script": {
          "type": "string"
        }
@@ -6315,6 +6848,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "recipient": {
          "type": "string"
        },
@@ -6370,6 +6906,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "assetId": {
          "type": "string"
        },
@@ -6440,6 +6979,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "assetId": {
          "type": "string"
        },
@@ -6498,6 +7040,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "assetId": {
          "type": "string"
        },
@@ -6553,6 +7098,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "assetId": {
          "type": "string"
        },
@@ -6608,6 +7156,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "alias": {
          "type": "string"
        }
@@ -6657,6 +7208,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "recipient": {
          "type": "string"
        },
@@ -6712,6 +7266,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "assetId": {
          "type": "string"
        },
@@ -6767,6 +7324,9 @@
            "type": "string"
          }
        },
+       "applicationStatus": {
+         "type": "string"
+       },
        "dApp": {
          "type": "string"
        },
@@ -6838,6 +7398,135 @@
         }
       }
     },
+    "TxUpdateAssetInfo": {
+      "type": "object",
+      "properties": {
+        "__type": {
+          "type": "string",
+          "example": "transaction"
+        },
+        "data": {
+          "type": "object",
+          "properties": {
+            "height": {
+              "type": "integer",
+              "format": "int32"
+            },
+            "type": {
+              "type": "integer"
+            },
+            "id": {
+              "type": "string"
+            },
+            "timestamp": {
+              "type": "string",
+              "format": "date-time"
+            },
+            "signature": {
+              "type": "string"
+            },
+            "proofs": {
+              "type": "array",
+              "items": {
+                "type": "string"
+              }
+            },
+            "version": {
+              "type": "integer"
+            },
+            "fee": {
+              "type": "number"
+            },
+            "applicationStatus": {
+              "type": "string"
+            },
+            "sender": {
+              "type": "string"
+            },
+            "senderPublicKey": {
+              "type": "string"
+            },
+            "assetId": {
+              "type": "string"
+            },
+            "name": {
+              "type": "string"
+            },
+            "description": {
+              "type": "string"
+            }
+          }
+        }
+      }
+    },
+    "TxEthereumLike": {
+      "type": "object",
+      "properties": {
+        "__type": {
+          "type": "string",
+          "example": "transaction"
+        },
+        "data": {
+          "type": "object",
+          "properties": {
+            "height": {
+              "type": "integer",
+              "format": "int32"
+            },
+            "type": {
+              "type": "integer"
+            },
+            "id": {
+              "type": "string"
+            },
+            "timestamp": {
+              "type": "string",
+              "format": "date-time"
+            },
+            "proofs": {
+              "type": "array",
+              "items": {
+                "type": "string"
+              }
+            },
+            "version": {
+              "type": "integer"
+            },
+            "fee": {
+              "type": "number"
+            },
+            "applicationStatus": {
+              "type": "string"
+            },
+            "sender": {
+              "type": "string"
+            },
+            "senderPublicKey": {
+              "type": "string"
+            },
+            "bytes": {
+              "type": "string"
+            },
+            "payload": {
+              "type": "object",
+              "properties": {
+                "type": {
+                  "type": "string"
+                },
+                "call": {
+                  "type": "object",
+                  "properties": {
+                    "function": {
+                      "type": "string"
+                    }
+                  }
+                }
+              }
+            }
+          }
+        }
+      }
+    },
     "Rate": {
       "type": "object",
       "required": [
@@ -7456,6 +8145,58 @@
           "type": "boolean"
         }
       }
+    },
+    "ListOfTxUpdateAssetInfo": {
+      "type": "object",
+      "required": [
+        "__type",
+        "data"
+      ],
+      "properties": {
+        "__type": {
+          "type": "string",
+          "example": "list"
+        },
+        "data": {
+          "type": "array",
+          "items": {
+            "$ref": "#/components/schemas/TxUpdateAssetInfo"
+          }
+        },
+        "lastCursor": {
+          "type": "string",
+          "example": "U2F0IEp1biAwMiAyMDE4IDE0OjAxOjQ1IEdNVCswMzAwIChNU0spOjozUHVrdnJzN2FwN1ZmQVlyTHlpZlNvZ0xYM0NuTVV4c0VFbmhuNk40WUVHWjo6YXNj"
+        },
+        "isLastPage": {
+          "type": "boolean"
+        }
+      }
+    },
+    "ListOfTxEthereumLike": {
+      "type": "object",
+      "required": [
+        "__type",
+        "data"
+      ],
+      "properties": {
+        "__type": {
+          "type": "string",
+          "example": "list"
+        },
+        "data": {
+          "type": "array",
+          "items": {
+            "$ref": "#/components/schemas/TxEthereumLike"
+          }
+        },
+        "lastCursor": {
+          "type": "string",
+          "example": "U2F0IEp1biAwMiAyMDE4IDE0OjAxOjQ1IEdNVCswMzAwIChNU0spOjozUHVrdnJzN2FwN1ZmQVlyTHlpZlNvZ0xYM0NuTVV4c0VFbmhuNk40WUVHWjo6YXNj"
+        },
+        "isLastPage": {
+          "type": "boolean"
+        }
+      }
+    }
   }
 }
diff --git a/package-lock.json b/package-lock.json
index 8c12cca2..9122a0e2 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -1,6 +1,6 @@
 {
   "name": "data-service",
-  "version": "0.36.0",
+  "version": "0.37.0",
   "lockfileVersion": 1,
   "requires": true,
   "dependencies": {
diff --git a/package.json b/package.json
index 1abbcef6..dd5d5e72 100644
--- a/package.json
+++ b/package.json
@@ -1,6 +1,6 @@
 {
   "name": "data-service",
-  "version": "0.36.0",
+  "version": "0.37.0",
   "description": "Waves data service",
   "main": "src/index.js",
   "repository": "git@github.com:wavesplatform/data-service.git",
@@ -10,7 +10,6 @@
     "build": "rm -rf dist/ && tsc",
     "start": "NODE_PG_FORCE_NATIVE=1 node dist/index.js",
     "dev": "NODE_PG_FORCE_NATIVE=1 NODE_ENV=development LOG_LEVEL=debug node dist/index.js",
-    "candles": "NODE_PG_FORCE_NATIVE=1 node dist/daemons/candles/index.js",
     "pairs": "NODE_PG_FORCE_NATIVE=1 node dist/daemons/pairs/index.js",
     "lint": "eslint src",
     "test": "jest --runInBand --detectOpenHandles --config=config/jest.config.unit.json",
diff --git a/src/daemons/candles/Dockerfile b/src/daemons/candles/Dockerfile
deleted file mode 100644
index fba0d583..00000000
--- a/src/daemons/candles/Dockerfile
+++ /dev/null
@@ -1,24 +0,0 @@
-FROM node:10-alpine
-
-# pg-native
-RUN apk --no-cache add make python gcc postgresql-dev g++
-
-# enable node_modules caching layer
-RUN apk add --no-cache tini git
-ADD package.json /tmp/package.json
-ADD package-lock.json /tmp/package-lock.json
-RUN cd /tmp && npm install
-RUN mkdir -p /opt/app && cp -a /tmp/node_modules /opt/app
-
-# set work dir
-WORKDIR /opt/app -ADD . /opt/app -RUN cd /opt/app -RUN npm run build - -# add tini for PID 1 handling -ENTRYPOINT ["/sbin/tini", "--"] - -# NodeJS launch -USER node -CMD ["npm", "run", "candles"] diff --git a/src/daemons/candles/README.md b/src/daemons/candles/README.md deleted file mode 100644 index fd13125c..00000000 --- a/src/daemons/candles/README.md +++ /dev/null @@ -1,37 +0,0 @@ -# Data Service Candles Daemon in Docker - -## About the image -This Docker image contains scripts and configs to run Data Service Candles Daemon. The image adds functionality to [Data Service](https://github.com/wavesplatform/data-service) to calculate and update DEX candles inside the Postgres database (see Requirements). - -Container downloads source code and configuration files from the [repository](https://github.com/wavesplatform/data-service) and runs it. - -## Requirements ⚠️ - -1. PostgreSQL 10 database with a table structure found in [wavesplatform/blockchain-postgres-sync](https://github.com/wavesplatform/blockchain-postgres-sync) -2. Downloaded and continuously updated blockchain data in the database -2. Docker for running the service in a container - -## Running the image - -The simplest way to run a container: -``` -docker run -e PGHOST=*** -e PGDATABASE=*** -e PGUSER=*** -e PGPASSWORD=*** -it wavesplatform/candles-daemon -``` - -**You can run container with environment variables:** - -|Env variable|Default|Required|Description| -|------------|-------|--------|-----------| -|`PGHOST`||YES|Postgres host address| -|`PGPORT`|`5432`|NO|Postgres port| -|`PGDATABASE`||YES|Postgres database name| -|`PGUSER`||YES|Postgres user name| -|`PGPASSWORD`||YES|Postgres password| -|`PGPOOLSIZE`|`20`|NO|Postgres pool size| -|`PGSTATEMENTTIMEOUT`|`false`|NO|Postgres statement timeout; number in ms (false = disabled)| -|`LOG_LEVEL`|`info`|NO|Log level `['info','warn','error']`| -|`CANDLES_UPDATE_INTERVAL_`|`2500`|NO|Minimum daemon update time in ms. If time is exceeded, the next iteration starts immediately| -|`CANDLES_UPDATE_TIMEOUT`|`20000`|NO|If the update time in ms is exceeded, the daemon terminates| -|`RECALCULATE_ALL_CANDLES_ON_START`|`true`|NO|Truncate all data from candles table and recalculate all candles| - -`PGPOOLSIZE` is used by the `pg-pool` library to determine Postgres connection pool size per NodeJS process instance. A good value depends on your server and db configuration and can be found empirically. You can leave it at the default value to start with. 
diff --git a/src/daemons/candles/create.js b/src/daemons/candles/create.js deleted file mode 100644 index 438d6bbd..00000000 --- a/src/daemons/candles/create.js +++ /dev/null @@ -1,171 +0,0 @@ -const { compose, map, nth } = require('ramda'); -const Task = require('folktale/concurrency/task'); -const { fromNullable } = require('folktale/maybe'); - -const getErrorMessage = require('../../errorHandling/getErrorMessage'); -const { CandleInterval } = require('../../types'); - -const logTaskProgress = require('../utils/logTaskProgress'); - -const { - withoutStatementTimeout, - truncateTable, - insertAllMinuteCandles, - insertAllCandles, - selectCandlesAfterTimestamp, - insertOrUpdateCandles, - selectLastCandleHeight, - selectLastExchangeTxHeight, - insertOrUpdateCandlesFromShortInterval, - selectMinTimestampFromHeight, -} = require('./sql/query'); - -/** for combining candles */ -const intervalPairs = [ - [CandleInterval.Minute1, CandleInterval.Minute5], // 1m -> 5m - [CandleInterval.Minute5, CandleInterval.Minute15], // 5m -> 15m - [CandleInterval.Minute15, CandleInterval.Minute30], // 15m -> 30m - [CandleInterval.Minute30, CandleInterval.Hour1], // 30m -> 1h - [CandleInterval.Hour1, CandleInterval.Hour2], // 1h -> 2h - [CandleInterval.Hour1, CandleInterval.Hour3], // 1h -> 3h - [CandleInterval.Hour2, CandleInterval.Hour4], // 2h -> 4h - [CandleInterval.Hour3, CandleInterval.Hour6], // 3h -> 6h - [CandleInterval.Hour6, CandleInterval.Hour12], // 6h -> 12h - [CandleInterval.Hour12, CandleInterval.Day1], // 12h -> 24h - [CandleInterval.Day1, CandleInterval.Week1], // 24h -> 1w - [CandleInterval.Day1, CandleInterval.Month1], // 24h -> 1M -]; - -/** getStartBlock :: (Object, Object) -> Number */ -const getStartBlock = (exchangeTx, candle) => { - if (candle && exchangeTx) { - if (candle.max_height > exchangeTx.height) { - return exchangeTx.height - 2000; // handle rollback - } else { - return candle.max_height - 1; - } - } - - return 1; -}; - -/** updateCandlesLoop :: (LogTask, Pg, String) -> Task */ -const updateCandlesLoop = (logTask, pg, candlesTableName) => { - const logMessages = { - start: (timeStart) => ({ - message: '[CANDLES] start updating candles', - time: timeStart, - }), - error: (e, timeTaken) => { - return { - message: '[CANDLES] update error', - time: timeTaken, - error: getErrorMessage(e), - }; - }, - success: (_, timeTaken) => ({ - message: '[CANDLES] update successful', - time: timeTaken, - }), - }; - - const pgPromiseUpdateCandles = (t, fromTimetamp) => - t - .any(selectCandlesAfterTimestamp(fromTimetamp)) - .then((candles) => - t.any(insertOrUpdateCandles(candlesTableName, candles)) - ); - - return logTask( - logMessages, - pg.tx((t) => - t - .batch([ - t.oneOrNone(selectLastExchangeTxHeight()), - t.oneOrNone(selectLastCandleHeight(candlesTableName)), - ]) - .then(([lastTx, candle]) => { - if (!lastTx) { - return new Date(); - } - const startHeight = getStartBlock(lastTx, candle); - return t - .one(selectMinTimestampFromHeight(startHeight)) - .then((row) => row.time_stamp); - }) - .then((timestamp) => { - const nextInterval = compose( - (m) => m.getOrElse(undefined), - map((interval) => - t.any( - insertOrUpdateCandlesFromShortInterval( - candlesTableName, - timestamp, - interval[0], - interval[1] - ) - ) - ), - fromNullable, - (index) => nth(index, intervalPairs) - ); - - return pgPromiseUpdateCandles(t, timestamp).then(() => - t.sequence(nextInterval) - ); - }) - ) - ); -}; - -/** fillCandlesDBAll :: (LogTask, Pg, String) -> Task */ -const fillCandlesDBAll = (logTask, pg, 
candlesTableName) => - logTask( - { - start: (timeStart) => ({ - message: '[DB] start filling', - time: timeStart, - }), - error: (e, timeTaken) => ({ - message: '[DB] fill error', - time: timeTaken, - error: e, - }), - success: (_, timeTaken) => ({ - message: '[DB] fill successful', - time: timeTaken, - }), - }, - pg.tx((t) => - t.batch([ - t.none(withoutStatementTimeout()), - t.any(truncateTable(candlesTableName)), - t.any(insertAllMinuteCandles(candlesTableName)), - ...intervalPairs.map(([shorter, longer]) => - t.any(insertAllCandles(candlesTableName, shorter, longer)) - ), - ]) - ) - ); - -module.exports = ({ logger, pg }, configuration) => { - const unsafeLogTaskProgress = logTaskProgress(logger); - - return { - init: () => { - if (configuration.candlesTruncateTable) - return fillCandlesDBAll( - unsafeLogTaskProgress, - pg, - configuration.candlesTableName - ); - else return Task.of(); - }, - loop: () => - updateCandlesLoop( - unsafeLogTaskProgress, - pg, - configuration.candlesTableName - ), - }; -}; diff --git a/src/daemons/candles/index.js b/src/daemons/candles/index.js deleted file mode 100644 index 79281e49..00000000 --- a/src/daemons/candles/index.js +++ /dev/null @@ -1,29 +0,0 @@ -const { loadConfig } = require('./loadConfig'); -const options = loadConfig(); - -// logger -const createLogger = require('../../logger/winston'); -const logger = createLogger({ - logLevel: 'info', -}); - -// pg -const { createPgDriver } = require('../../db'); -const pgDriver = createPgDriver(options); - -const { daemon: runDaemon } = require('../presets/daemon'); -const createDaemon = require('./create'); - -runDaemon( - createDaemon( - { - logger: logger, - pg: pgDriver, - }, - options - ), - options, - options.candlesUpdateInterval, - options.candlesUpdateTimeout, - logger -); diff --git a/src/daemons/candles/loadConfig.ts b/src/daemons/candles/loadConfig.ts deleted file mode 100644 index 35aae963..00000000 --- a/src/daemons/candles/loadConfig.ts +++ /dev/null @@ -1,30 +0,0 @@ -import { memoizeWith, always } from 'ramda'; -import { - PostgresConfig, - LoggerConfig, - loadDefaultConfig, -} from '../../loadConfig'; - -export type CandlesConfig = PostgresConfig & - LoggerConfig & { - candlesUpdateInterval: number; - candlesUpdateTimeout: number; - candlesTruncateTable: boolean; - candlesTableName: string; - }; - -const load = (): CandlesConfig => ({ - ...loadDefaultConfig(), - candlesUpdateInterval: process.env.CANDLES_UPDATE_INTERVAL - ? parseInt(process.env.CANDLES_UPDATE_INTERVAL) - : 2500, - candlesUpdateTimeout: process.env.CANDLES_UPDATE_TIMEOUT - ? parseInt(process.env.CANDLES_UPDATE_TIMEOUT) - : 20000, - candlesTruncateTable: process.env.RECALCULATE_ALL_CANDLES_ON_START - ? 
process.env.RECALCULATE_ALL_CANDLES_ON_START == 'true' - : false, - candlesTableName: process.env.CANDLES_TABLE_NAME || 'candles', -}); - -export const loadConfig = memoizeWith(always('config'), load); diff --git a/src/daemons/candles/sql/__test__/__snapshots__/sql.test.js.snap b/src/daemons/candles/sql/__test__/__snapshots__/sql.test.js.snap deleted file mode 100644 index 0d470069..00000000 --- a/src/daemons/candles/sql/__test__/__snapshots__/sql.test.js.snap +++ /dev/null @@ -1,17 +0,0 @@ -// Jest Snapshot v1, https://goo.gl/fbAQLP - -exports[`candles daemon sql test calculate and insert all candles from other small candles 1`] = `"insert into \\"candles\\" select to_timestamp(floor((extract('epoch' from time_start) / 300 )) * 300) as \\"candle_time\\", \\"amount_asset_id\\" as \\"amount_asset_id\\", \\"price_asset_id\\" as \\"price_asset_id\\", (select min(\\"low\\")) as \\"low\\", (select max(\\"high\\")) as \\"high\\", (select sum(\\"volume\\")) as \\"volume\\", (select sum(\\"quote_volume\\")) as \\"quote_volume\\", (select max(\\"max_height\\")) as \\"max_height\\", (select sum(\\"txs_count\\")) as \\"txs_count\\", floor(sum((weighted_average_price * volume)::numeric)::numeric / sum(volume)::numeric)::numeric as \\"weighted_average_price\\", (array_agg(open ORDER BY time_start)::numeric[])[1] as \\"open\\", (array_agg(close ORDER BY time_start DESC)::numeric[])[1] as \\"close\\", '5m' as \\"interval\\", \\"matcher_address\\" as \\"matcher_address\\" from \\"candles\\" as \\"t\\" where \\"t\\".\\"interval\\" = '1m' group by \\"candle_time\\", \\"amount_asset_id\\", \\"price_asset_id\\", \\"matcher_address\\""`; - -exports[`candles daemon sql test get last candle height 1`] = `"select \\"max_height\\" from \\"candles\\" as \\"t\\" order by \\"max_height\\" desc limit 1"`; - -exports[`candles daemon sql test get last exchange tx height 1`] = `"select \\"height\\" from \\"txs_7\\" as \\"t\\" order by \\"uid\\" desc limit 1"`; - -exports[`candles daemon sql test insert all candles group by 1 minute 1`] = `"insert into \\"candles\\" with \\"e_cte\\" as (select \\"t\\".\\"uid\\" as \\"uid\\", \\"t\\".\\"amount_asset_id\\" as \\"amount_asset_id\\", \\"t\\".\\"price_asset_id\\" as \\"price_asset_id\\", \\"t\\".\\"sender\\" as \\"sender\\", \\"t\\".\\"height\\" as \\"height\\", date_trunc('minute', t.time_stamp) as \\"candle_time\\", \\"t\\".\\"amount\\" as \\"amount\\", \\"t\\".\\"price\\" as \\"price\\" from \\"txs_7\\" as \\"t\\") select \\"e\\".\\"candle_time\\" as \\"time_start\\", \\"amount_asset_id\\" as \\"amount_asset_id\\", \\"price_asset_id\\" as \\"price_asset_id\\", (select min(\\"e\\".\\"price\\")) as \\"low\\", (select max(\\"e\\".\\"price\\")) as \\"high\\", (select sum(\\"e\\".\\"amount\\")) as \\"volume\\", sum((e.amount)::numeric * (e.price)::numeric) as \\"quote_volume\\", (select max(\\"height\\")) as \\"max_height\\", (select count(\\"e\\".\\"price\\")) as \\"txs_count\\", floor(sum((e.amount)::numeric * (e.price)::numeric)/sum((e.amount)::numeric))::numeric as \\"weighted_average_price\\", (array_agg(e.price ORDER BY e.uid)::numeric[])[1] as \\"open\\", (array_agg(e.price ORDER BY e.uid DESC)::numeric[])[1] as \\"close\\", '1m' as \\"interval\\", \\"e\\".\\"sender\\" as \\"matcher_address\\" from \\"e_cte\\" as \\"e\\" group by e.candle_time, e.amount_asset_id, e.price_asset_id, e.sender"`; - -exports[`candles daemon sql test insert or update array of candles 1`] = `"insert into \\"candles\\" as \\"t\\" (\\"amount_asset_id\\", \\"close\\", \\"high\\", \\"interval\\", 
\\"low\\", \\"matcher_address\\", \\"max_height\\", \\"open\\", \\"price_asset_id\\", \\"quote_volume\\", \\"time_start\\", \\"txs_count\\", \\"volume\\", \\"weighted_average_price\\") values ('aai', '80', '100', '1m', '1', DEFAULT, DEFAULT, '20', 'pai', '100.2', '1970-01-01T00:00:00.000Z', '22', '200.2', '2.1') on conflict (time_start, amount_asset_id, price_asset_id, matcher_address, interval) do update set open=EXCLUDED.open, close=EXCLUDED.close, low=EXCLUDED.low, high=EXCLUDED.high, max_height=EXCLUDED.max_height, quote_volume=EXCLUDED.quote_volume, txs_count=EXCLUDED.txs_count, volume=EXCLUDED.volume, weighted_average_price=EXCLUDED.weighted_average_price"`; - -exports[`candles daemon sql test insert or update candles empty 1`] = `";"`; - -exports[`candles daemon sql test insert or update candles from height 1`] = `"insert into \\"candles\\" select to_timestamp(floor((extract('epoch' from time_start) / 300 )) * 300) as \\"candle_time\\", \\"amount_asset_id\\" as \\"amount_asset_id\\", \\"price_asset_id\\" as \\"price_asset_id\\", (select min(\\"low\\")) as \\"low\\", (select max(\\"high\\")) as \\"high\\", (select sum(\\"volume\\")) as \\"volume\\", (select sum(\\"quote_volume\\")) as \\"quote_volume\\", (select max(\\"max_height\\")) as \\"max_height\\", (select sum(\\"txs_count\\")) as \\"txs_count\\", floor(sum((weighted_average_price * volume)::numeric)::numeric / sum(volume)::numeric)::numeric as \\"weighted_average_price\\", (array_agg(open ORDER BY time_start)::numeric[])[1] as \\"open\\", (array_agg(close ORDER BY time_start DESC)::numeric[])[1] as \\"close\\", '5m' as \\"interval\\", \\"matcher_address\\" as \\"matcher_address\\" from \\"candles\\" where \\"interval\\" = '1m' and time_start >= to_timestamp(floor((extract('epoch' from '2019-01-01T00:00:00.000Z'::timestamptz) / 300 )) * 300) group by \\"candle_time\\", \\"amount_asset_id\\", \\"price_asset_id\\", \\"matcher_address\\" on conflict (time_start, amount_asset_id, price_asset_id, matcher_address, interval) do update set open=EXCLUDED.open, close=EXCLUDED.close, low=EXCLUDED.low, high=EXCLUDED.high, max_height=EXCLUDED.max_height, quote_volume=EXCLUDED.quote_volume, txs_count=EXCLUDED.txs_count, volume=EXCLUDED.volume, weighted_average_price=EXCLUDED.weighted_average_price"`; - -exports[`candles daemon sql test truncate table 1`] = `"truncate \\"candles\\" restart identity"`; diff --git a/src/daemons/candles/sql/__test__/sql.test.js b/src/daemons/candles/sql/__test__/sql.test.js deleted file mode 100644 index bf112dc1..00000000 --- a/src/daemons/candles/sql/__test__/sql.test.js +++ /dev/null @@ -1,64 +0,0 @@ -const { BigNumber } = require('@waves/data-entities'); -const { CandleInterval } = require('../../../../types'); -const sql = require('../query'); - -describe('candles daemon sql test', () => { - it('truncate table', () => { - expect(sql.truncateTable('candles')).toMatchSnapshot(); - }); - - it('insert all candles group by 1 minute', () => { - expect(sql.insertAllMinuteCandles('candles')).toMatchSnapshot(); - }); - - it('calculate and insert all candles from other small candles', () => { - expect(sql.insertAllCandles('candles', CandleInterval.Minute1, CandleInterval.Minute5)).toMatchSnapshot(); - }); - - it('insert or update array of candles', () => { - expect( - sql.insertOrUpdateCandles('candles', [ - { - time_start: new Date(0), - low: new BigNumber(1), - high: new BigNumber(100), - open: new BigNumber(20), - close: new BigNumber(80), - amount_asset_id: 'aai', - price_asset_id: 'pai', - price: new 
BigNumber(1.2), - volume: new BigNumber(200.2), - quote_volume: new BigNumber(100.2), - txs_count: new BigNumber(22), - weighted_average_price: new BigNumber(2.1), - matcher_address_uid: new BigNumber(3), - }, - ]) - ).toMatchSnapshot(); - }); - - it('insert or update candles empty', () => { - expect(sql.insertOrUpdateCandles('candles', []).toString()).toMatchSnapshot(); - }); - - it('get last candle height', () => { - expect(sql.selectLastCandleHeight('candles').toString()).toMatchSnapshot(); - }); - - it('get last exchange tx height', () => { - expect(sql.selectLastExchangeTxHeight().toString()).toMatchSnapshot(); - }); - - it('insert or update candles from height', () => { - expect( - sql - .insertOrUpdateCandlesFromShortInterval( - 'candles', - new Date('2019-01-01T00:00:00.000Z'), - CandleInterval.Minute1, - CandleInterval.Minute5 - ) - .toString() - ).toMatchSnapshot(); - }); -}); diff --git a/src/daemons/candles/sql/query.js b/src/daemons/candles/sql/query.js deleted file mode 100644 index 4ff40b7f..00000000 --- a/src/daemons/candles/sql/query.js +++ /dev/null @@ -1,203 +0,0 @@ -const knex = require('knex'); -const pg = knex({ client: 'pg' }); - -const { CandleInterval } = require('../../../types'); - -const { - pgRawDateTrunc, - makeRawTimestamp, - serializeCandle, - candlePresets, -} = require('./utils'); - -/** makeCandleCalculateColumns :: String -> Array */ -const makeCandleCalculateColumns = (interval) => { - return { - candle_time: candlePresets.aggregate.candle_time(interval), - amount_asset_id: 'amount_asset_id', - price_asset_id: 'price_asset_id', - low: candlePresets.aggregate.low, - high: candlePresets.aggregate.high, - volume: candlePresets.aggregate.volume, - quote_volume: candlePresets.aggregate.quote_volume, - max_height: candlePresets.aggregate.max_height, - txs_count: candlePresets.aggregate.txs_count, - weighted_average_price: candlePresets.aggregate.weighted_average_price, - open: candlePresets.aggregate.open, - close: candlePresets.aggregate.close, - interval: pg.raw(`'${interval}'`), - matcher_address: 'matcher_address', - }; -}; - -const candleSelectColumns = { - time_start: 'e.candle_time', - amount_asset_id: 'amount_asset_id', - price_asset_id: 'price_asset_id', - low: pg.min('e.price'), - high: pg.max('e.price'), - volume: pg.sum('e.amount'), - quote_volume: pg.raw('sum((e.amount)::numeric * (e.price)::numeric)'), - max_height: pg.max('height'), - txs_count: pg.count('e.price'), - weighted_average_price: pg.raw( - 'floor(sum((e.amount)::numeric * (e.price)::numeric)/sum((e.amount)::numeric))::numeric' - ), - open: pg.raw('(array_agg(e.price ORDER BY e.uid)::numeric[])[1]'), - close: pg.raw('(array_agg(e.price ORDER BY e.uid DESC)::numeric[])[1]'), - interval: pg.raw(`'${CandleInterval.Minute1}'`), - matcher_address: 'e.sender', -}; - -/** insertIntoCandlesFromSelect :: (String, Function) -> QueryBuilder */ -const insertIntoCandlesFromSelect = (candlesTableName, selectFunction) => - pg.into(candlesTableName).insert(selectFunction); - -/** selectExchanges :: QueryBuilder */ -const selectExchanges = pg({ t: 'txs_7' }).select({ - uid: 't.uid', - amount_asset_id: 't.amount_asset_id', - price_asset_id: 't.price_asset_id', - sender: 't.sender', - height: 't.height', - candle_time: pgRawDateTrunc('t.time_stamp')('minute'), - amount: 't.amount', - price: pg.raw(` - CASE WHEN t.tx_version > 2 - THEN t.price::numeric - * 10^(select decimals from assets where asset_id = t.price_asset_id) - * 10^(select -decimals from assets where asset_id = t.amount_asset_id) - ELSE 
t.price::numeric - END - `), -}); - -/** selectExchangesAfterTimestamp :: Date -> QueryBuilder */ -const selectExchangesAfterTimestamp = (fromTimestamp) => { - let ts = pgRawDateTrunc(`'${fromTimestamp.toISOString()}'::timestamptz`)('minute'); - return selectExchanges - .clone() - .whereRaw(`time_stamp >= ${ts}`) - .orderBy('uid') - .orderByRaw(`time_stamp <-> ${ts}`); -}; - -/** selectLastCandle :: String -> String query */ -const selectLastCandleHeight = (candlesTableName) => - pg({ t: candlesTableName }) - .select('max_height') - .limit(1) - .orderBy('max_height', 'desc') - .toString(); - -/** selectLastExchangeTx :: String query */ -const selectLastExchangeTxHeight = () => - pg({ t: 'txs_7' }).select('height').limit(1).orderBy('uid', 'desc').toString(); - -/** selectLastExchangeTx :: String query */ -const selectMinTimestampFromHeight = (height) => - pg('txs_7') - .column('time_stamp') - .where('height', '>=', height) - .orderBy('uid') - .limit(1) - .toString(); - -/** for make complex query with "on conflict (...) update ... without set concrete values" See insertOrUpdateCandles or insertOrUpdateCandlesFromShortInterval */ -const updatedFieldsExcluded = [ - 'open', - 'close', - 'low', - 'high', - 'max_height', - 'quote_volume', - 'txs_count', - 'volume', - 'weighted_average_price', -] - .map((field) => field + '=EXCLUDED.' + field) - .join(', '); - -/** insertOrUpdateCandles :: (String, Array[Object]) -> String query */ -const insertOrUpdateCandles = (candlesTableName, candles) => { - if (candles.length) { - return pg - .raw( - `${pg({ t: candlesTableName }).insert( - candles.map(serializeCandle) - )} on conflict (time_start, amount_asset_id, price_asset_id, matcher_address, interval) do update set ${updatedFieldsExcluded}` - ) - .toString(); - } - - return ';'; -}; - -/** insertOrUpdateCandlesFromShortInterval :: (String, Date, Number, Number) -> String query */ -const insertOrUpdateCandlesFromShortInterval = ( - candlesTableName, - fromTimestamp, - shortInterval, - longerInterval -) => - pg - .raw( - `${insertIntoCandlesFromSelect(candlesTableName, function () { - this.from(candlesTableName) - .select(makeCandleCalculateColumns(longerInterval)) - .where('interval', shortInterval) - .whereRaw( - pg.raw(`time_start >= ${makeRawTimestamp(fromTimestamp, longerInterval)}`) - ) - .groupBy('candle_time', 'amount_asset_id', 'price_asset_id', 'matcher_address'); - })} on conflict (time_start, amount_asset_id, price_asset_id, matcher_address, interval) do update set ${updatedFieldsExcluded}` - ) - .toString(); - -/** - * SET statement_timeout = 0 - * @returns string - */ -const withoutStatementTimeout = () => pg.raw('SET statement_timeout = 0').toString(); - -/** truncateTable :: String -> String query */ -const truncateTable = (candlesTableName) => pg(candlesTableName).truncate().toString(); - -/** insertAllMinuteCandles :: String -> String query */ -const insertAllMinuteCandles = (candlesTableName) => - insertIntoCandlesFromSelect(candlesTableName, function () { - this.with('e_cte', selectExchanges) - .select(candleSelectColumns) - .from({ e: 'e_cte' }) - .groupByRaw('e.candle_time, e.amount_asset_id, e.price_asset_id, e.sender'); - }).toString(); - -/** insertAllCandles :: (String, Number, Number, Number) -> String query */ -const insertAllCandles = (candlesTableName, shortInterval, longerInterval) => - insertIntoCandlesFromSelect(candlesTableName, function () { - this.from({ t: candlesTableName }) - .column(makeCandleCalculateColumns(longerInterval)) - .where('t.interval', shortInterval) 
- .groupBy(['candle_time', 'amount_asset_id', 'price_asset_id', 'matcher_address']); - }).toString(); - -/** selectCandlesAfterTimestamp :: Date -> String query */ -const selectCandlesAfterTimestamp = (timetamp) => - pg - .columns(candleSelectColumns) - .from(selectExchangesAfterTimestamp(timetamp).clone().as('e')) - .groupBy(['e.candle_time', 'e.amount_asset_id', 'e.price_asset_id', 'e.sender']) - .toString(); - -module.exports = { - withoutStatementTimeout, - truncateTable, - insertAllMinuteCandles, - insertAllCandles, - selectCandlesAfterTimestamp, - insertOrUpdateCandles, - selectLastCandleHeight, - selectLastExchangeTxHeight, - insertOrUpdateCandlesFromShortInterval, - selectMinTimestampFromHeight, -}; diff --git a/src/daemons/candles/sql/utils.js b/src/daemons/candles/sql/utils.js deleted file mode 100644 index 0e4ddfa3..00000000 --- a/src/daemons/candles/sql/utils.js +++ /dev/null @@ -1,111 +0,0 @@ -const knex = require('knex'); -const pg = knex({ client: 'pg' }); - -const { CandleInterval } = require('../../../types'); - -/** - * - * @param {string} from - */ -const pgRawExtractFromToTimestamp = (from) => /** @param {string} interval */ ( - interval -) => - pg.raw( - `to_timestamp(floor((extract('epoch' from ${from}) / ${interval} )) * ${interval})` - ); - -/** - * - * @param {string} from - */ -const pgRawDateTrunc = (from) => /** @param {string} interval */ (interval) => - pg.raw(`date_trunc('${interval}', ${from})`); - -/** - * - * @param {string} from - * @param {keyof CandleInterval} interval - */ -const toRawTimestamp = (from, interval) => { - const nf = pgRawExtractFromToTimestamp(from); - const sf = pgRawDateTrunc(from); - - switch (interval) { - case CandleInterval.Minute1: - return nf(60); - case CandleInterval.Minute5: - return nf(300); - case CandleInterval.Minute15: - return nf(900); - case CandleInterval.Minute30: - return nf(1800); - case CandleInterval.Hour1: - return nf(3600); - case CandleInterval.Hour2: - return nf(7200); - case CandleInterval.Hour3: - return nf(10800); - case CandleInterval.Hour4: - return nf(14400); - case CandleInterval.Hour6: - return nf(21600); - case CandleInterval.Hour12: - return nf(43200); - case CandleInterval.Day1: - return nf(86400); - case CandleInterval.Week1: - return sf('week'); - case CandleInterval.Month1: - return sf('month'); - } -}; - -/** - * @param {Date} timestamp - * @param {string} interval - */ -const makeRawTimestamp = (timestamp, interval) => - toRawTimestamp(`'${timestamp.toISOString()}'::timestamptz`, interval); - -// serializeCandle:: Object => Object -// @todo refactor after pg updating for work with BigInt instead of BigNumber -const serializeCandle = (candle) => ({ - time_start: candle.time_start.toISOString(), - amount_asset_id: candle.amount_asset_id, - price_asset_id: candle.price_asset_id, - matcher_address: candle.matcher_address, - low: candle.low.toString(), - high: candle.high.toString(), - volume: candle.volume.toString(), - quote_volume: candle.quote_volume.toString(), - max_height: candle.max_height, - txs_count: candle.txs_count.toString(), - weighted_average_price: candle.weighted_average_price.toString(), - open: candle.open.toString(), - close: candle.close.toString(), - interval: CandleInterval.Minute1, -}); - -const candlePresets = { - aggregate: { - candle_time: (interval) => toRawTimestamp('time_start', interval), - low: pg.min('low'), - high: pg.max('high'), - volume: pg.sum('volume'), - quote_volume: pg.sum('quote_volume'), - max_height: pg.max('max_height'), - txs_count: 
pg.sum('txs_count'),
-    weighted_average_price: pg.raw(
-      'floor(sum((weighted_average_price * volume)::numeric)::numeric / sum(volume)::numeric)::numeric'
-    ),
-    open: pg.raw('(array_agg(open ORDER BY time_start)::numeric[])[1]'),
-    close: pg.raw('(array_agg(close ORDER BY time_start DESC)::numeric[])[1]'),
-  },
-};
-
-module.exports = {
-  pgRawDateTrunc,
-  makeRawTimestamp,
-  serializeCandle,
-  candlePresets,
-};
diff --git a/src/daemons/pairs/sql/fillTable.ts b/src/daemons/pairs/sql/fillTable.ts
index e35b79a3..ffc541ea 100644
--- a/src/daemons/pairs/sql/fillTable.ts
+++ b/src/daemons/pairs/sql/fillTable.ts
@@ -10,8 +10,8 @@ const selectExchanges = pg({ t: 'txs_7' })
     price: pg.raw(`
       CASE
         WHEN t.tx_version > 2 THEN t.price::numeric
-          * 10^(select decimals from assets where asset_id = t.price_asset_id)
-          * 10^(select -decimals from assets where asset_id = t.amount_asset_id)
+          * 10^(select decimals from decimals where asset_id = t.price_asset_id)
+          * 10^(select -decimals from decimals where asset_id = t.amount_asset_id)
         ELSE t.price::numeric
       END
     `),
@@ -27,12 +27,8 @@ const selectPairsCTE = pg
     qb.select({
       amount_asset_id: 'amount_asset_id',
       price_asset_id: 'price_asset_id',
-      last_price: pg.raw(
-        '(array_agg(e.price ORDER BY e.uid DESC)::numeric[])[1]'
-      ),
-      first_price: pg.raw(
-        '(array_agg(e.price ORDER BY e.uid)::numeric[])[1]'
-      ),
+      last_price: pg.raw('(array_agg(e.price ORDER BY e.uid DESC)::numeric[])[1]'),
+      first_price: pg.raw('(array_agg(e.price ORDER BY e.uid)::numeric[])[1]'),
       volume: pg.raw('sum(e.amount)'),
       quote_volume: pg.raw('sum(e.amount::numeric * e.price::numeric)'),
       weighted_average_price: pg.raw(
@@ -57,7 +53,9 @@ const selectPairsCTE = pg
     'p.last_price',
     'p.volume',
     {
-      volume_waves: pg.raw('COALESCE(p.volume_waves, floor(p.quote_volume / p1.weighted_average_price), p.quote_volume * p2.weighted_average_price)'),
+      volume_waves: pg.raw(
+        'COALESCE(p.volume_waves, floor(p.quote_volume / p1.weighted_average_price), p.quote_volume * p2.weighted_average_price)'
+      ),
     },
     'p.quote_volume',
     'p.high',
diff --git a/src/http/transactions/index.ts b/src/http/transactions/index.ts
index fb5c7331..58da626d 100644
--- a/src/http/transactions/index.ts
+++ b/src/http/transactions/index.ts
@@ -2,11 +2,7 @@ import * as Router from 'koa-router';
 import { ServiceMesh } from '../../services';
 import commonFilters from '../_common/filters/filters';
 import { Parser } from '../_common/filters/types';
-import {
-  createTransactionHttpHandlers,
-  parseGet,
-  parseMgetOrSearch,
-} from './_common';
+import { createTransactionHttpHandlers, parseGet, parseMgetOrSearch } from './_common';
 import { parseDataMgetOrSearch } from './parseDataMgetOrSearch';
 
 const createParseRequest = (
@@ -22,81 +18,82 @@ export default (txsServices: ServiceMesh['transactions']) => {
   const all = createTransactionHttpHandlers(
     new Router(),
     '/transactions/all',
-    txsServices['all'],
+    txsServices.all,
     createParseRequest()
   );
 
-  const alias = createTransactionHttpHandlers(
+  const genesis = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/alias',
-    txsServices['alias'],
-    createParseRequest()
+    '/transactions/genesis',
+    txsServices.genesis,
+    createParseRequest({
+      recipient: commonFilters.query,
+    })
   );
 
-  const burn = createTransactionHttpHandlers(
+  const payment = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/burn',
-    txsServices['burn'],
+    '/transactions/payment',
+    txsServices.payment,
     createParseRequest({
-      assetId: commonFilters.query,
+      recipient: commonFilters.query,
     })
   );
 
-  const data = createTransactionHttpHandlers(
+  const issue = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/data',
-    txsServices['data'],
-    {
-      get: parseGet,
-      mgetOrSearch: parseDataMgetOrSearch,
-    }
+    '/transactions/issue',
+    txsServices.issue,
+    createParseRequest({
+      assetId: commonFilters.query,
+      script: commonFilters.query,
+    })
   );
 
-  const exchange = createTransactionHttpHandlers(
+  const transfer = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/exchange',
-    txsServices['exchange'],
+    '/transactions/transfer',
+    txsServices.transfer,
     createParseRequest({
-      amountAsset: commonFilters.query,
-      matcher: commonFilters.query,
-      orderId: commonFilters.query,
-      priceAsset: commonFilters.query,
+      assetId: commonFilters.query,
+      recipient: commonFilters.query,
     })
   );
 
-  const genesis = createTransactionHttpHandlers(
+  const reissue = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/genesis',
-    txsServices['genesis'],
+    '/transactions/reissue',
+    txsServices.reissue,
     createParseRequest({
-      recipient: commonFilters.query,
+      assetId: commonFilters.query,
     })
   );
 
-  const invokeScript = createTransactionHttpHandlers(
+  const burn = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/invoke-script',
-    txsServices['invokeScript'],
+    '/transactions/burn',
+    txsServices.burn,
     createParseRequest({
-      dapp: commonFilters.query,
-      function: commonFilters.query,
+      assetId: commonFilters.query,
     })
   );
 
-  const issue = createTransactionHttpHandlers(
+  const exchange = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/issue',
-    txsServices['issue'],
+    '/transactions/exchange',
+    txsServices.exchange,
     createParseRequest({
-      assetId: commonFilters.query,
-      script: commonFilters.query,
+      amountAsset: commonFilters.query,
+      matcher: commonFilters.query,
+      orderId: commonFilters.query,
+      priceAsset: commonFilters.query,
     })
   );
 
   const lease = createTransactionHttpHandlers(
     new Router(),
     '/transactions/lease',
-    txsServices['lease'],
+    txsServices.lease,
     createParseRequest({
       recipient: commonFilters.query,
     })
@@ -105,35 +102,52 @@ export default (txsServices: ServiceMesh['transactions']) => {
   const leaseCancel = createTransactionHttpHandlers(
     new Router(),
     '/transactions/lease-cancel',
-    txsServices['leaseCancel'],
+    txsServices.leaseCancel,
     createParseRequest({
       recipient: commonFilters.query,
     })
   );
 
+  const alias = createTransactionHttpHandlers(
+    new Router(),
+    '/transactions/alias',
+    txsServices.alias,
+    createParseRequest()
+  );
+
   const massTransfer = createTransactionHttpHandlers(
     new Router(),
     '/transactions/mass-transfer',
-    txsServices['massTransfer'],
+    txsServices.massTransfer,
     createParseRequest({
       assetId: commonFilters.query,
       recipient: commonFilters.query,
     })
   );
 
-  const payment = createTransactionHttpHandlers(
+  const data = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/payment',
-    txsServices['payment'],
+    '/transactions/data',
+    txsServices.data,
+    {
+      get: parseGet,
+      mgetOrSearch: parseDataMgetOrSearch,
+    }
+  );
+
+  const setScript = createTransactionHttpHandlers(
+    new Router(),
+    '/transactions/set-script',
+    txsServices.setScript,
     createParseRequest({
-      recipient: commonFilters.query,
+      script: commonFilters.query,
     })
   );
 
-  const reissue = createTransactionHttpHandlers(
+  const sponsorship = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/reissue',
-    txsServices['reissue'],
+    '/transactions/sponsorship',
+    txsServices.sponsorship,
     createParseRequest({
       assetId: commonFilters.query,
     })
@@ -142,49 +156,42 @@ export default (txsServices: ServiceMesh['transactions']) => {
   const setAssetScript = createTransactionHttpHandlers(
     new Router(),
     '/transactions/set-asset-script',
-    txsServices['setAssetScript'],
+    txsServices.setAssetScript,
     createParseRequest({
       assetId: commonFilters.query,
       script: commonFilters.query,
     })
   );
 
-  const setScript = createTransactionHttpHandlers(
+  const invokeScript = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/set-script',
-    txsServices['setScript'],
+    '/transactions/invoke-script',
+    txsServices.invokeScript,
     createParseRequest({
-      script: commonFilters.query,
+      dapp: commonFilters.query,
+      function: commonFilters.query,
     })
   );
 
-  const sponsorship = createTransactionHttpHandlers(
+  const updateAssetInfo = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/sponsorship',
-    txsServices['sponsorship'],
+    '/transactions/update-asset-info',
+    txsServices.updateAssetInfo,
     createParseRequest({
       assetId: commonFilters.query,
     })
   );
 
-  const transfer = createTransactionHttpHandlers(
+  const ethereumLike = createTransactionHttpHandlers(
     new Router(),
-    '/transactions/transfer',
-    txsServices['transfer'],
+    '/transactions/ethereum-like',
+    txsServices['ethereumLike'],
     createParseRequest({
-      assetId: commonFilters.query,
-      recipient: commonFilters.query,
+      type: commonFilters.query,
+      function: commonFilters.query,
     })
   );
 
-  const updateAssetInfo = createTransactionHttpHandlers(
-    new Router(),
-    '/transactions/update-asset-info',
-    txsServices['updateAssetInfo'],
-    createParseRequest({
-      assetId: commonFilters.query,
-    })
-  );
   return subrouter.use(
     alias.routes(),
     all.routes(),
@@ -204,5 +211,6 @@ export default (txsServices: ServiceMesh['transactions']) => {
     sponsorship.routes(),
     transfer.routes(),
     updateAssetInfo.routes(),
+    ethereumLike.routes()
   );
 };
diff --git a/src/services/_common/createResolver/types.d.ts b/src/services/_common/createResolver/types.d.ts
index cd1b238f..b63f6ae6 100644
--- a/src/services/_common/createResolver/types.d.ts
+++ b/src/services/_common/createResolver/types.d.ts
@@ -1,12 +1,7 @@
 import { Task } from 'folktale/concurrency/task';
 import { Result } from 'folktale/result';
 import { Maybe } from 'folktale/maybe';
-import {
-  ValidationError,
-  ResolverError,
-  DbError,
-  Timeout,
-} from '../../../errorHandling';
+import { ValidationError, ResolverError, DbError, Timeout } from '../../../errorHandling';
 
 import { SearchedItems } from '../../../types';
 
@@ -15,46 +10,20 @@ export type EmitEvent = <A>(name: string) => (object: A) => void;
 export type ValidateSync = <Value>(value: Value) => Result<ValidationError, Value>;
 export type ValidateAsync = <Value>(value: Value) => Task<ValidationError, Value>;
 
-type CommonResolverDependencies<
-  ReqRaw,
-  ReqTransformed,
-  ResRaw,
-  ResTransformed
-  > = {
-    transformInput: (r: ReqRaw) => Result<ValidationError, ReqTransformed>;
-    validateResult: ValidateSync;
-    emitEvent: EmitEvent;
-  };
+type CommonResolverDependencies<ReqRaw, ReqTransformed, ResRaw, ResTransformed> = {
+  transformInput: (r: ReqRaw) => Result<ValidationError, ReqTransformed>;
+  validateResult: ValidateSync;
+  emitEvent: EmitEvent;
+};
 
-export type GetResolverDependencies<
-  ReqRaw,
-  ReqTransformed,
-  ResRaw,
-  ResTransformed
-  > = CommonResolverDependencies<
-  ReqRaw,
-  ReqTransformed,
-  ResRaw,
-  ResTransformed
-  > & {
+export type GetResolverDependencies<ReqRaw, ReqTransformed, ResRaw, ResTransformed> =
+  CommonResolverDependencies<ReqRaw, ReqTransformed, ResRaw, ResTransformed> & {
     getData: (r: ReqTransformed) => Task<DbError, Maybe<ResRaw>>;
-    transformResult: (
-      result: Maybe<ResRaw>,
-      request: ReqRaw
-    ) => Maybe<ResTransformed>;
+    transformResult: (result: Maybe<ResRaw>, request: ReqRaw) => Maybe<ResTransformed>;
   };
 
-export type MgetResolverDependencies<
-  ReqRaw,
-  ReqTransformed,
-  ResRaw,
-  ResTransformed
-  > = CommonResolverDependencies<
-  ReqRaw,
-  ReqTransformed,
-  ResRaw,
-  ResTransformed
-  > & {
+export type MgetResolverDependencies<ReqRaw, ReqTransformed, ResRaw, ResTransformed> =
+  CommonResolverDependencies<ReqRaw, ReqTransformed, ResRaw, ResTransformed> & {
     getData: (r: ReqTransformed) => Task<DbError, Maybe<ResRaw>[]>;
     transformResult: (
      result: Maybe<ResRaw>[],
@@ -62,17 +31,8 @@ export type MgetResolverDependencies<
     ) => Maybe<ResTransformed>[];
   };
 
-export type SearchResolverDependencies<
-  ReqRaw,
-  ReqTransformed,
-  ResRaw,
-  ResTransformed
-  > = CommonResolverDependencies<
-  ReqRaw,
-  ReqTransformed,
-  ResRaw,
-  ResTransformed
-  > & {
+export type SearchResolverDependencies<ReqRaw, ReqTransformed, ResRaw, ResTransformed> =
+  CommonResolverDependencies<ReqRaw, ReqTransformed, ResRaw, ResTransformed> & {
     getData: (r: ReqTransformed) => Task<DbError, ResRaw[]>;
     transformResult: (
       results: ResRaw[],
diff --git a/src/services/index.ts b/src/services/index.ts
index 97a9c6b4..a92bfda5 100644
--- a/src/services/index.ts
+++ b/src/services/index.ts
@@ -83,6 +83,10 @@ import createTransferTxsRepo from './transactions/transfer/repo';
 import createUpdateAssetInfoTxsService from './transactions/updateAssetInfo';
 import { UpdateAssetInfoTxsService } from './transactions/updateAssetInfo/types';
 import createUpdateAssetInfoTxsRepo from './transactions/updateAssetInfo/repo';
+// ethereum-like txs
+import createEthereumLikeTxsService from './transactions/ethereumLike';
+import { EthereumLikeTxsService } from './transactions/ethereumLike/types';
+import createEthereumLikeTxsRepo from './transactions/ethereumLike/repo';
 
 import createRateService, { RateCacheImpl, RatesMgetService } from './rates';
 import { RateCache } from './rates/repo';
@@ -149,6 +153,7 @@ export type ServiceMesh = {
     sponsorship: SponsorshipTxsService;
     transfer: TransferTxsService;
     updateAssetInfo: UpdateAssetInfoTxsService;
+    ethereumLike: EthereumLikeTxsService;
   };
 };
 
@@ -251,6 +256,8 @@ export default ({
     updateAssetInfoRepo,
     assets
   );
+  const ethereumLikeRepo = createEthereumLikeTxsRepo(commonDeps);
+  const ethereumLikeTxs = createEthereumLikeTxsService(ethereumLikeRepo, assets);
 
   const rateRepo = new RemoteRateRepo(commonDeps.drivers.pg);
 
@@ -299,6 +306,7 @@ export default ({
     15: setAssetScriptTxs,
     16: invokeScriptTxs,
     17: updateAssetInfoTxs,
+    18: ethereumLikeTxs,
   });
 
   return {
@@ -325,6 +333,7 @@ export default ({
     setAssetScript: setAssetScriptTxs,
     invokeScript: invokeScriptTxs,
     updateAssetInfo: updateAssetInfoTxs,
+    ethereumLike: ethereumLikeTxs,
   },
   matchers: {
     rates,
diff --git a/src/services/transactions/_common/commonFieldsSchemas.ts b/src/services/transactions/_common/commonFieldsSchemas.ts
index 0a6ee81b..0a9c6006 100644
--- a/src/services/transactions/_common/commonFieldsSchemas.ts
+++ b/src/services/transactions/_common/commonFieldsSchemas.ts
@@ -1,35 +1,16 @@
 import { Joi } from '../../../utils/validation';
 
 export default {
-  uid: Joi.object()
-    .bignumber()
-    .required(),
-  id: Joi.string()
-    .base58()
-    .required(),
+  uid: Joi.object().bignumber().required(),
+  id: Joi.string().base58().required(),
   height: Joi.number().required(),
-  tx_type: Joi.number()
-    .min(1)
-    .max(17)
-    .required(),
-  tx_version: Joi.number()
-    .required()
-    .allow(null),
-  fee: Joi.object()
-    .bignumber()
-    .required(),
+  tx_type: Joi.number().min(1).max(18).required(),
+  tx_version: Joi.number().required().allow(null),
+  fee: Joi.object().bignumber().required(),
   time_stamp: Joi.date().required(),
-  signature: Joi.string()
-    .base58()
-    .required()
-    .allow(null),
+  signature: Joi.string().base58().required().allow(null),
   proofs: Joi.array().required(),
   status: Joi.string().required(),
-
-  sender: Joi.string()
-    .base58()
-    .required(),
-  sender_public_key: Joi.string()
-    .base58()
-    .required(),
+  sender: Joi.string().base58().required(),
+  sender_public_key: Joi.string().base58().required(),
 };
diff --git a/src/services/transactions/all/index.ts b/src/services/transactions/all/index.ts
index ff4a5102..390d7ff7 100644
--- a/src/services/transactions/all/index.ts
+++ b/src/services/transactions/all/index.ts
@@ -29,6 +29,7 @@ import { SponsorshipTxsService } from '../sponsorship/types';
 import { SetAssetScriptTxsService } from '../setAssetScript/types';
 import { InvokeScriptTxsService } from '../invokeScript/types';
 import { UpdateAssetInfoTxsService } from '../updateAssetInfo/types';
+import { EthereumLikeTxsService } from '../ethereumLike/types';
 import {
   AllTxsRepo,
   AllTxsGetRequest,
@@ -56,6 +57,7 @@ type AllTxsServiceDep = {
   15: SetAssetScriptTxsService;
   16: InvokeScriptTxsService;
   17: UpdateAssetInfoTxsService;
+  18: EthereumLikeTxsService;
 };
 
 export type AllTxsServiceGetRequest = ServiceGetRequest<AllTxsGetRequest>;
@@ -63,14 +65,8 @@ export type AllTxsServiceMgetRequest = ServiceMgetRequest<AllTxsMgetRequest>;
 export type AllTxsServiceSearchRequest = AllTxsSearchRequest;
 export type AllTxsService = {
-  get: Service<
-    AllTxsServiceGetRequest & WithMoneyFormat,
-    Maybe<TransactionInfo>
-  >;
-  mget: Service<
-    AllTxsServiceMgetRequest & WithMoneyFormat,
-    Maybe<TransactionInfo>[]
-  >;
+  get: Service<AllTxsServiceGetRequest & WithMoneyFormat, Maybe<TransactionInfo>>;
+  mget: Service<AllTxsServiceMgetRequest & WithMoneyFormat, Maybe<TransactionInfo>[]>;
   search: Service<
     AllTxsServiceSearchRequest & WithMoneyFormat,
     SearchedItems<TransactionInfo>
   >;
@@ -81,83 +77,80 @@
 // request by (id, timestamp) instead of just id
 // to ensure correct tx response even if
 // id is duplicated (happens in payment, alias txs)
-export default (repo: AllTxsRepo) => (
-  txsServices: AllTxsServiceDep
-): AllTxsService => ({
-  get: (req) =>
-    repo
-      .get(req.id) //Task tx
-      .chain((m) =>
-        m.matchWith({
-          Just: ({ value }) => {
-            return txsServices[value.type as keyof AllTxsServiceDep].get({
-              id: value.id,
-              moneyFormat: req.moneyFormat,
-            });
-          },
-          Nothing: () => taskOf(emptyOf()),
-        })
-      ),
+export default (repo: AllTxsRepo) =>
+  (txsServices: AllTxsServiceDep): AllTxsService => ({
+    get: (req) =>
+      repo
+        .get(req.id) //Task tx
+        .chain((m) =>
+          m.matchWith({
+            Just: ({ value }) => {
+              return txsServices[value.type as keyof AllTxsServiceDep].get({
+                id: value.id,
+                moneyFormat: req.moneyFormat,
+              });
+            },
+            Nothing: () => taskOf(emptyOf()),
+          })
+        ),
 
-  mget: (req) =>
-    repo
-      .mget(req.ids) // Task tx[]. tx can have data: null
-      .chain((txsList: Maybe<CommonTransactionInfo>[]) =>
-        waitAll(
-          txsList.map((m) =>
-            m.matchWith({
-              Just: ({ value }) => {
-                return txsServices[value.type as keyof AllTxsServiceDep].get({
-                  id: value.id,
-                  moneyFormat: req.moneyFormat,
-                });
-              },
-              Nothing: () => taskOf(emptyOf()),
-            })
-          )
-        )
-      ),
+    mget: (req) =>
+      repo
+        .mget(req.ids) // Task tx[]. tx can have data: null
+        .chain((txsList: Maybe<CommonTransactionInfo>[]) =>
+          waitAll(
+            txsList.map((m) =>
+              m.matchWith({
+                Just: ({ value }) => {
+                  return txsServices[value.type as keyof AllTxsServiceDep].get({
+                    id: value.id,
+                    moneyFormat: req.moneyFormat,
+                  });
+                },
+                Nothing: () => taskOf(emptyOf()),
+              })
+            )
+          )
+        ),
 
-  search: (req) =>
-    repo.search(req).chain((txsList: SearchedItems<CommonTransactionInfo>) =>
-      waitAll<Maybe<TransactionInfo>[]>(
-        pipe<
-          CommonTransactionInfo[],
-          Record<string, CommonTransactionInfo[]>,
-          [string, CommonTransactionInfo[]][],
-          Task<DbError, Maybe<TransactionInfo>[]>[]
-        >(
-          groupBy((t) => String(t.type)),
-          toPairs,
-          (tuples) =>
-            tuples.map(([type, txs]) => {
-              return txsServices[
-                (type as unknown) as keyof AllTxsServiceDep
-              ].mget({
-                ids: txs.map((t) => t.id),
-                moneyFormat: req.moneyFormat,
-              });
-            })
-        )(txsList.items)
-      )
-        .map((mss) => flatten<Maybe<TransactionInfo>>(mss))
-        .map(collect((m) => m.getOrElse(undefined)))
-        .map((txs) => {
-          const s = indexBy(
-            (tx) => `${tx.id}:${tx.timestamp.valueOf()}`,
-            txsList.items
-          );
-          return sort((a, b) => {
-            const aTxUid = s[`${a.id}:${a.timestamp.valueOf()}`]['txUid'];
-            const bTxUid = s[`${b.id}:${b.timestamp.valueOf()}`]['txUid'];
-            return req.sort === SortOrder.Ascending
-              ? aTxUid.minus(bTxUid).toNumber()
-              : bTxUid.minus(aTxUid).toNumber();
-          }, txs);
-        })
-        .map((txs) => ({
-          ...txsList,
-          items: txs,
-        }))
-    ),
-});
+    search: (req) =>
+      repo.search(req).chain((txsList: SearchedItems<CommonTransactionInfo>) =>
+        waitAll<Maybe<TransactionInfo>[]>(
+          pipe<
+            CommonTransactionInfo[],
+            Record<string, CommonTransactionInfo[]>,
+            [string, CommonTransactionInfo[]][],
+            Task<DbError, Maybe<TransactionInfo>[]>[]
+          >(
+            groupBy((t) => String(t.type)),
+            toPairs,
+            (tuples) =>
+              tuples.map(([type, txs]) => {
+                return txsServices[type as unknown as keyof AllTxsServiceDep].mget({
+                  ids: txs.map((t) => t.id),
+                  moneyFormat: req.moneyFormat,
+                });
+              })
+          )(txsList.items)
+        )
+          .map((mss) => flatten<Maybe<TransactionInfo>>(mss))
+          .map(collect((m) => m.getOrElse(undefined)))
+          .map((txs) => {
+            const s = indexBy(
+              (tx) => `${tx.id}:${tx.timestamp.valueOf()}`,
+              txsList.items
+            );
+            return sort((a, b) => {
+              const aTxUid = s[`${a.id}:${a.timestamp.valueOf()}`]['txUid'];
+              const bTxUid = s[`${b.id}:${b.timestamp.valueOf()}`]['txUid'];
+              return req.sort === SortOrder.Ascending
+                ? aTxUid.minus(bTxUid).toNumber()
+                : bTxUid.minus(aTxUid).toNumber();
+            }, txs);
+          })
+          .map((txs) => ({
+            ...txsList,
+            items: txs,
+          }))
+      ),
+  });
diff --git a/src/services/transactions/all/repo/schema.ts b/src/services/transactions/all/repo/schema.ts
index de9fa1b6..3e6c4fdb 100644
--- a/src/services/transactions/all/repo/schema.ts
+++ b/src/services/transactions/all/repo/schema.ts
@@ -2,7 +2,7 @@ import { Joi } from '../../../../utils/validation';
 
 export const result = Joi.object().keys({
   uid: Joi.object().bignumber().required(),
-  tx_type: Joi.number().min(1).max(17).required(),
+  tx_type: Joi.number().min(1).max(18).required(),
   time_stamp: Joi.date().required(),
   id: Joi.string().base58().required(),
 });
diff --git a/src/services/transactions/ethereumLike/index.ts b/src/services/transactions/ethereumLike/index.ts
new file mode 100644
index 00000000..f2508d35
--- /dev/null
+++ b/src/services/transactions/ethereumLike/index.ts
@@ -0,0 +1,12 @@
+import { withDecimalsProcessing } from '../../_common/transformation/withDecimalsProcessing';
+import { modifyFeeDecimals } from '../_common/modifyFeeDecimals';
+import { EthereumLikeTxsRepo } from './repo/types';
+import { AssetsService } from '../../assets';
+import { createService } from '../_common/createService';
+import { EthereumLikeTxsService } from './types';
+
+export default (
+  repo: EthereumLikeTxsRepo,
+  assetsService: AssetsService
+): EthereumLikeTxsService =>
+  withDecimalsProcessing(modifyFeeDecimals(assetsService), createService(repo));
diff --git a/src/services/transactions/ethereumLike/repo/index.ts b/src/services/transactions/ethereumLike/repo/index.ts
new file mode 100644
index 00000000..b8839399
--- /dev/null
+++ b/src/services/transactions/ethereumLike/repo/index.ts
@@ -0,0 +1,65 @@
+import { propEq } from 'ramda';
+
+import { CommonRepoDependencies } from '../../..';
+import { getByIdPreset } from '../../../_common/presets/pg/getById';
+import { mgetByIdsPreset } from '../../../_common/presets/pg/mgetByIds';
+import { searchPreset } from '../../../_common/presets/pg/search';
+
+import { Cursor, serialize, deserialize } from '../../_common/cursor';
+import transformTxInfo from './transformTxInfo';
+
+import { result as resultSchema } from './schema';
+import * as sql from './sql';
+import {
+  EthereumLikeTxsRepo,
+  EthereumLikeTxsSearchRequest,
+  EthereumLikeTxDbResponse,
+  EthereumLikeTx,
+} from './types';
+
+export default ({
+  drivers: { pg },
+  emitEvent,
+}: CommonRepoDependencies): EthereumLikeTxsRepo => {
+  return {
+    get: getByIdPreset({
+      name: 'transactions.ethereumLike.get',
+      sql: sql.get,
+      resultSchema,
+      transformResult: transformTxInfo,
+    })({
+      pg,
+      emitEvent,
+    }),
+
+    mget: mgetByIdsPreset({
+      name: 'transactions.ethereumLike.mget',
+      matchRequestResult: propEq('id'),
+      sql: sql.mget,
+      resultSchema,
+      transformResult: transformTxInfo,
+    })({
+      pg,
+      emitEvent,
+    }),
+
+    search: searchPreset<
+      Cursor,
+      EthereumLikeTxsSearchRequest,
+      EthereumLikeTxDbResponse,
+      EthereumLikeTx
+    >({
+      name: 'transactions.ethereumLike.search',
+      sql: sql.search,
+      resultSchema,
+      transformResult: transformTxInfo,
+      cursorSerialization: {
+        serialize,
+        deserialize,
+      },
+    })({
+      pg,
+      emitEvent,
+    }),
+  };
+};
diff --git a/src/services/transactions/ethereumLike/repo/schema.ts b/src/services/transactions/ethereumLike/repo/schema.ts
new file mode 100644
index 00000000..95aa2e95
--- /dev/null
+++ b/src/services/transactions/ethereumLike/repo/schema.ts
@@ -0,0 +1,8 @@
+import { Joi } from '../../../../utils/validation';
+import commonFields from '../../_common/commonFieldsSchemas';
+
+export const result = Joi.object().keys({
+  ...commonFields,
+  bytes: Joi.binary().required(),
+  function_name: Joi.string().required().allow(null),
+});
diff --git a/src/services/transactions/ethereumLike/repo/sql/filters.js b/src/services/transactions/ethereumLike/repo/sql/filters.js
new file mode 100644
index 00000000..4b01f9a9
--- /dev/null
+++ b/src/services/transactions/ethereumLike/repo/sql/filters.js
@@ -0,0 +1,42 @@
+const { where, whereNull, whereNotNull } = require('../../../../../utils/db/knex');
+
+const { createByTimeStamp, createByBlockTimeStamp } = require('../../../_common/sql');
+const {
+  id,
+  ids,
+  sender,
+  senders,
+  sort,
+  after,
+  limit,
+} = require('../../../_common/sql/filters');
+const commonFiltersOrder = require('../../../_common/sql/filtersOrder');
+
+const byTimeStamp = createByTimeStamp('txs_18');
+
+const byBlockTimeStamp = createByBlockTimeStamp('txs_18');
+
+module.exports = {
+  filters: {
+    id,
+    ids,
+    sender,
+    senders,
+    sort,
+    after,
+    limit,
+
+    type: (transferOrInvocation) =>
+      transferOrInvocation === 'transfer'
+        ? whereNull('function_name')
+        : whereNotNull('function_name'),
+
+    function: (functionName) => where('function_name', functionName),
+
+    timeStart: byTimeStamp('>='),
+    timeEnd: byTimeStamp('<='),
+    blockTimeStart: byBlockTimeStamp('>='),
+    blockTimeEnd: byBlockTimeStamp('<='),
+  },
+  filtersOrder: [...commonFiltersOrder, 'timeStart', 'timeEnd', 'type', 'function'],
+};
diff --git a/src/services/transactions/ethereumLike/repo/sql/index.js b/src/services/transactions/ethereumLike/repo/sql/index.js
new file mode 100644
index 00000000..729210ce
--- /dev/null
+++ b/src/services/transactions/ethereumLike/repo/sql/index.js
@@ -0,0 +1,17 @@
+const { createSql } = require('../../../_common/sql');
+
+const { select, selectFromFiltered } = require('./query');
+const { filters, filtersOrder } = require('./filters');
+
+const queryAfterFilters = {
+  get: selectFromFiltered,
+  mget: selectFromFiltered,
+  search: selectFromFiltered,
+};
+
+module.exports = createSql({
+  query: select,
+  filters,
+  filtersOrder,
+  queryAfterFilters,
+});
diff --git a/src/services/transactions/ethereumLike/repo/sql/query.js b/src/services/transactions/ethereumLike/repo/sql/query.js
new file mode 100644
index 00000000..19dd3e70
--- /dev/null
+++ b/src/services/transactions/ethereumLike/repo/sql/query.js
@@ -0,0 +1,24 @@
+const pg = require('knex')({ client: 'pg' });
+
+const select = pg({ t: 'txs_18' });
+
+const selectFromFiltered = (filtered) =>
+  filtered.column({
+    uid: 't.uid',
+    height: 't.height',
+    tx_type: 't.tx_type',
+    id: 't.id',
+    time_stamp: 't.time_stamp',
+    signature: 't.signature',
+    proofs: 't.proofs',
+    tx_version: 't.tx_version',
+    fee: 't.fee',
+    status: 't.status',
+    sender: 't.sender',
+    sender_public_key: 't.sender_public_key',
+
+    bytes: 't.bytes',
+    function_name: 't.function_name',
+  });
+
+module.exports = { select, selectFromFiltered };
diff --git a/src/services/transactions/ethereumLike/repo/transformTxInfo.ts b/src/services/transactions/ethereumLike/repo/transformTxInfo.ts
new file mode 100644
index 00000000..16c6ae0f
--- /dev/null
+++ b/src/services/transactions/ethereumLike/repo/transformTxInfo.ts
@@ -0,0 +1,25 @@
+import { compose, evolve } from 'ramda';
+import { renameKeys } from 'ramda-adjunct';
+import { transformTxInfo } from '../../_common/transformTxInfo';
+import { EthereumLikeTxPayload } from './types';
+
+const functionNameToPayload = (functionName: string | null): EthereumLikeTxPayload =>
+  functionName === null
+    ? { type: 'transfer' }
+    : {
+        type: 'invocation',
+        call: {
+          function: functionName,
+        },
+      };
+
+const bufferToETHHex = (b: Buffer) => '0x' + b.toString('hex');
+
+export default compose(
+  transformTxInfo,
+  evolve({
+    payload: functionNameToPayload,
+    bytes: bufferToETHHex,
+  }) as any,
+  renameKeys({ function_name: 'payload' })
+);
diff --git a/src/services/transactions/ethereumLike/repo/types.ts b/src/services/transactions/ethereumLike/repo/types.ts
new file mode 100644
index 00000000..0eb08686
--- /dev/null
+++ b/src/services/transactions/ethereumLike/repo/types.ts
@@ -0,0 +1,41 @@
+import { Repo } from '../../../../types';
+import { WithSortOrder, WithLimit } from '../../../_common';
+import { RequestWithCursor } from '../../../_common/pagination';
+import { CommonFilters, RawTx, Tx } from '../../_common/types';
+
+export type EthereumLikeTransfer = 'transfer';
+export type EthereumLikeInvocation = 'invocation';
+
+export type EthereumLikeTxDbResponse = RawTx & {
+  payload: string;
+  function_name: string | null;
+};
+
+export type EthereumLikeTxPayload =
+  | { type: EthereumLikeTransfer }
+  | { type: EthereumLikeInvocation; call: { function: string } };
+
+export type EthereumLikeTx = Tx & {
+  bytes: string;
+  payload: EthereumLikeTxPayload;
+};
+
+export type EthereumLikeTxsGetRequest = string;
+
+export type EthereumLikeTxsMgetRequest = string[];
+
+export type EthereumLikeTxsSearchRequest = RequestWithCursor<
+  CommonFilters & WithSortOrder & WithLimit,
+  string
+> &
+  Partial<{
+    type: EthereumLikeTransfer | EthereumLikeInvocation;
+    function: string;
+  }>;
+
+export type EthereumLikeTxsRepo = Repo<
+  EthereumLikeTxsGetRequest,
+  EthereumLikeTxsMgetRequest,
+  EthereumLikeTxsSearchRequest,
+  EthereumLikeTx
+>;
diff --git a/src/services/transactions/ethereumLike/types.ts b/src/services/transactions/ethereumLike/types.ts
new file mode 100644
index 00000000..b96cf7bb
--- /dev/null
+++ b/src/services/transactions/ethereumLike/types.ts
@@ -0,0 +1,30 @@
+import { Maybe } from 'folktale/maybe';
+import {
+  Service,
+  SearchedItems,
+  ServiceGetRequest,
+  ServiceMgetRequest,
+} from '../../../types';
+import { WithMoneyFormat } from '../../types';
+import {
+  EthereumLikeTxsGetRequest,
+  EthereumLikeTxsMgetRequest,
+  EthereumLikeTxsSearchRequest,
+  EthereumLikeTx,
+} from './repo/types';
+
+type EthereumLikeTxsServiceGetRequest = ServiceGetRequest<EthereumLikeTxsGetRequest>;
+type EthereumLikeTxsServiceMgetRequest = ServiceMgetRequest<EthereumLikeTxsMgetRequest>;
+type EthereumLikeTxsServiceSearchRequest = EthereumLikeTxsSearchRequest;
+
+export type EthereumLikeTxsService = {
+  get: Service<EthereumLikeTxsServiceGetRequest & WithMoneyFormat, Maybe<EthereumLikeTx>>;
+  mget: Service<
+    EthereumLikeTxsServiceMgetRequest & WithMoneyFormat,
+    Maybe<EthereumLikeTx>[]
+  >;
+  search: Service<
+    EthereumLikeTxsServiceSearchRequest & WithMoneyFormat,
+    SearchedItems<EthereumLikeTx>
+  >;
+};
diff --git a/src/services/transactions/updateAssetInfo/repo/index.ts b/src/services/transactions/updateAssetInfo/repo/index.ts
index 5ee02f5d..0b22527d 100644
--- a/src/services/transactions/updateAssetInfo/repo/index.ts
+++ b/src/services/transactions/updateAssetInfo/repo/index.ts
@@ -6,7 +6,7 @@ import { mgetByIdsPreset } from '../../../_common/presets/pg/mgetByIds';
 import { searchPreset } from '../../../_common/presets/pg/search';
 
 import { Cursor, serialize, deserialize } from '../../_common/cursor';
-import { transformTxInfo } from '../../_common/transformTxInfo';
+import transformTxInfo from './transformTxInfo';
 
 import { result as resultSchema } from './schema';
 import * as sql from './sql';
diff --git a/src/services/transactions/updateAssetInfo/repo/schema.ts b/src/services/transactions/updateAssetInfo/repo/schema.ts
index dad445ca..c5c29b07 100644
--- a/src/services/transactions/updateAssetInfo/repo/schema.ts
+++ b/src/services/transactions/updateAssetInfo/repo/schema.ts
@@ -1,16 +1,6 @@
 import { Joi } from '../../../../utils/validation';
-
-import commonFilters from '../../../_common/presets/pg/search/commonFilterSchemas';
 import commonFields from '../../_common/commonFieldsSchemas';
 
-export const inputSearch = Joi.object()
-  .keys({
-    ...commonFilters,
-
-    assetId: Joi.string().assetId(),
-  })
-  .nand('sender', 'senders');
-
 export const result = Joi.object().keys({
   ...commonFields,
 
diff --git a/src/services/transactions/updateAssetInfo/transformTxInfo.ts b/src/services/transactions/updateAssetInfo/transformTxInfo.ts
deleted file mode 100644
index 7fdccbe7..00000000
--- a/src/services/transactions/updateAssetInfo/transformTxInfo.ts
+++ /dev/null
@@ -1,12 +0,0 @@
-import { compose } from 'ramda';
-import { renameKeys } from 'ramda-adjunct';
-
-import { transformTxInfo } from '../_common/transformTxInfo';
-
-export default compose(
-  transformTxInfo,
-  renameKeys({
-    asset_name: 'name',
-    asset_id: 'assetId',
-  })
-);
diff --git a/src/utils/db/knex/lib.js b/src/utils/db/knex/lib.js
index 2274712e..714dd0cf 100644
--- a/src/utils/db/knex/lib.js
+++ b/src/utils/db/knex/lib.js
@@ -5,21 +5,25 @@ const hasMethod = curryN(
   (method, x) => (x && x[method] && type(x[method]) === 'Function') || false
 );
 
-const createPointfree = method => (...args) => {
-  const instanceIdx = findIndex(hasMethod(method), args);
+const createPointfree =
+  (method) =>
+  (...args) => {
+    const instanceIdx = findIndex(hasMethod(method), args);
 
-  if (instanceIdx !== -1) {
-    return args[instanceIdx].clone()[method](...slice(0, instanceIdx, args));
-  } else {
-    return (...args2) => createPointfree(method)(...concat(args, args2));
-  }
-};
+    if (instanceIdx !== -1) {
+      return args[instanceIdx].clone()[method](...slice(0, instanceIdx, args));
+    } else {
+      return (...args2) => createPointfree(method)(...concat(args, args2));
+    }
+  };
 
 module.exports = {
   hasMethod,
   where: createPointfree('where'),
   whereIn: createPointfree('whereIn'),
   whereRaw: createPointfree('whereRaw'),
+  whereNull: createPointfree('whereNull'),
+  whereNotNull: createPointfree('whereNotNull'),
   limit: createPointfree('limit'),
   orWhere: createPointfree('orWhere'),
   orderBy: createPointfree('orderBy'),