There are a few options available for designing automated checks of the account's data integration status:

- by the SYNC REQUEST response
- by an endpoint check
SYNC Request Response
If desired, your process can read the response from the SYNC REQUEST. The response is JSON:

- in case of error: the error message
- in case of success: a key-value pair: `{"taskId":"task value"}`; with the `<taskId>` you can make a request to `<endpoint>/task/status/<taskId>` (see the sketch below)
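As an illustration, here is a minimal Python sketch of this check, assuming the `requests` library and the response shapes described above. The HTTP method for the task-status call is an assumption; only the URL pattern is documented here.

```python
import requests

# Base endpoint used in the examples on this page
DI_ENDPOINT = "https://boxalino-di-process-krceabfwya-ew.a.run.app"

def read_sync_response(response: requests.Response) -> str:
    """Interpret the SYNC REQUEST response body.

    On success the body is {"taskId": "task value"}; on error it is
    the error message. Returns the taskId, raises on error.
    """
    try:
        payload = response.json()
    except ValueError:
        raise RuntimeError(f"SYNC REQUEST failed: {response.text}")
    if not isinstance(payload, dict) or "taskId" not in payload:
        raise RuntimeError(f"SYNC REQUEST failed: {response.text}")
    return payload["taskId"]

def fetch_task_status(task_id: str) -> dict:
    """Query <endpoint>/task/status/<taskId> for the given task.

    NOTE: using GET and expecting a JSON reply are assumptions;
    only the URL pattern is given in the documentation above.
    """
    reply = requests.get(f"{DI_ENDPOINT}/task/status/{task_id}", timeout=30)
    reply.raise_for_status()
    return reply.json()
```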
Account Review (WEB)
You can review the status of the triggered events on the Account page at `<endpoint>/account`, for example:
https://boxalino-di-process-krceabfwya-ew.a.run.app/account
Account Review (CLI)
Endpoint | https://boxalino-di-process-krceabfwya-ew.a.run.app/account/review | | |
---|---|---|---|
1 | Method | POST | |
2 | Headers | Content-Type | application/json |
3 | Body | key | DATASYNC API key |
4 | | client | account name |
5 | | limit | number of logs (ordered by most recent) |
6 | | index | dev / prod (default: none) |
7 | | mode | D for delta, I for instant update, F for full |
8 | | type | product, user, content, user_content, order (default: none) |
9 | | status | the sync status to filter by (for example: SYNC OK) |
For example, this request will return the last SYNC OK (successful sync request):

```bash
curl https://boxalino-di-process-krceabfwya-ew.a.run.app/account/review \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "client": "BOXALINO_ACCOUNT",
    "key": "BOXALINO_ACCOUNT_ADMIN_KEY",
    "index": "prod",
    "mode": "F",
    "type": "product",
    "status": "none",
    "limit": 6
  }'
```
The API response for a request with status: "SYNC OK" and limit: 1 would be a JSON list, like:

```json
[
  {
    "ID": "UUID-FOR-THE-SYNC-REQUEST",
    "RequestReceivedAt": "Y-m-d H:i:s",
    "Status": "SYNC OK",
    "Message": null,
    "Timestamp": "YmdHis",
    "VersionTs": "TIME-IN-UNIX-MS",
    "Project": null,
    "Dataset": null,
    "Document": null,
    "Default": "[]"
  }
]
```
Data Integration Statuses
Status Code | Meaning |
---|---|
**STATUS CODES DURING SYNC REQUESTS** | |
SYNC OK | the data content was exported to SOLR |
SYNC FAIL | the data update failed in SOLR or before |
SYNC REQUEST | a SYNC REQUEST was made (<endpoint>/sync); once the SYNC REQUEST is received, the compute process starts |
SYNCPRODUCT FAIL | the SYNC REQUEST failed during compute |
STOP SYNC | the SYNC REQUEST was stopped (e.g. the content quota was not reached: min X products to be synced, or the doc_X table is empty) |
BIG SOLR CONTENT | generating the SOLR export file from the doc_X_<mode>_<tm> file |
SOLRSYNC REQUEST | exporting the solr-compute file (above) to SOLR for sync |
DISPATCHED SYNC REQUEST | the BQ compute process log (for dispatched requests) |
RESYNCACCOUNT REQUEST | a re-sync request (triggered internally, on client request) |
SYNCCHECK OK | a sync check request was made (<endpoint>/sync/check); this is done e.g. for D/I to access the last SYNC OK status for the account and type |
FAIL AUTH | the authentication headers are invalid / do not match the account |
FAIL SOLR EXPORT | the export of the generated file failed (data index not updated) |
**STATUS CODES DURING LOAD REQUESTS** | |
LOAD OK | the doc_X data structure was loaded successfully in BQ |
LOAD REQUEST | a LOAD REQUEST was received (<endpoint>/load) |
FAIL BQ LOAD | the BQ load step failed |
LOADBYCHUNK REQUEST | a LOAD BY CHUNK request was received; the content is loaded into a GCS file (doc_<type>_<mode>_<tm>-<chunk>.json) |
LOADBYCHUNK OK | the GCS file (doc_<type>_<mode>_<tm>-<chunk>.json) was created |
LOADBYCHUNK FAIL | the GCS file was not properly loaded |
FAIL GCS | the GCS bucket / content failed to generate |
LOADBQ REQUEST | a LOADBQ REQUEST was received; it loads all the chunk files into BQ |
LOADBQ OK | the doc_<type> content was successfully loaded in BQ |
LOADBQ FAIL | the BQ table could not be generated from the available doc_<type>_<mode>_<tm>-*.json content |
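For automated monitoring, it can help to group these statuses into successes, failures, and in-progress steps. Below is a possible, non-normative grouping in Python; note that classifying STOP SYNC as a failure is a judgment call, since a sync may be stopped intentionally.

```python
# Non-normative grouping of the status codes listed in the table above.
SUCCESS = {"SYNC OK", "SYNCCHECK OK", "LOAD OK", "LOADBYCHUNK OK", "LOADBQ OK"}
FAILURE = {
    "SYNC FAIL", "SYNCPRODUCT FAIL", "FAIL AUTH", "FAIL SOLR EXPORT",
    "FAIL BQ LOAD", "LOADBYCHUNK FAIL", "FAIL GCS", "LOADBQ FAIL",
    "STOP SYNC",  # may be intentional (quota not reached / empty doc_X table)
}
IN_PROGRESS = {
    "SYNC REQUEST", "BIG SOLR CONTENT", "SOLRSYNC REQUEST",
    "DISPATCHED SYNC REQUEST", "RESYNCACCOUNT REQUEST",
    "LOAD REQUEST", "LOADBYCHUNK REQUEST", "LOADBQ REQUEST",
}

def classify(status: str) -> str:
    """Map a log status to 'success', 'failure', 'in_progress' or 'unknown'."""
    if status in SUCCESS:
        return "success"
    if status in FAILURE:
        return "failure"
    if status in IN_PROGRESS:
        return "in_progress"
    return "unknown"
```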