Data Upload

The first step of using the Inventory Optimization API is uploading historical sales data. As mentioned previously in the documentation, each stock item that the end-customer (tenant) has in their inventory is a separate dataset. See the snippet from the API reference below for the schema that the datasets need to follow. Data is uploaded for one tenant at a time, but you can upload multiple datasets from the same tenant in the same data upload job.

Historical Sales Orders Data Schema
webhook (object)
Details for the webhook endpoint to call when a job finishes.

datasets (Array of objects (Raw Data Upload Request Datasets), required, non-empty)
The datasets containing raw data for the sales trainer.

{
  "webhook": { },
  "datasets": [ ]
}
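
As an illustration, the Python sketch below builds a request body matching this shape and writes it to a file for upload. The dataset fields used here (datasetId, transactions, date, quantity) are hypothetical placeholders; see the Raw Data Upload Request Datasets schema in the API reference for the actual field names.

import json

# Sketch of a request body for a data upload job. The dataset fields
# (datasetId, transactions, date, quantity) are hypothetical placeholders;
# the authoritative field names are in the API reference.
body = {
    # Optional webhook to call when the job finishes.
    "webhook": {
        "url": "https://partner.example.com/inventory-webhook",
    },
    # One dataset per stock item; multiple datasets for the same tenant
    # can go in the same upload job.
    "datasets": [
        {
            "datasetId": "sku-1001",
            "transactions": [
                {"date": "2024-05-01", "quantity": 3},
                {"date": "2024-05-02", "quantity": 1},
            ],
        },
        {
            "datasetId": "sku-1002",
            "transactions": [
                {"date": "2024-05-02", "quantity": 7},
            ],
        },
    ],
}

with open("upload.json", "w") as f:
    json.dump(body, f, indent=2)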

Keeping Data Up to Date

Because the forecast will always be at least equal to the sales already recorded in the current time step, and because the purchase order suggestions rely on accurate data for the planning period, clients should be incentivized to upload data on a daily basis, for example via a scheduled job running nightly. Note that if there is a gap between the current date, i.e. the date this endpoint is called, and the last date in the dataset, we will assume that no sales have occurred in that period. If sales have actually happened, the purchase order suggestions will not be able to reflect this.

Partners must also make sure to upload only incremental changes to already uploaded datasets. Do not run a full upload of all transactions in a dataset in scheduled jobs; upload only the incremental changes since the last upload, as in the sketch below.
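
For example, a nightly job along the following lines would send only the transactions recorded since the last successful upload. This is a sketch under stated assumptions: load_transactions_since and the dataset field names are hypothetical stand-ins for the partner's own data layer and the actual schema.

import datetime

def load_transactions_since(tenant_id: str, since: datetime.date) -> dict:
    """Return {stock_item_id: [transactions...]} recorded after `since`.
    Hypothetical placeholder for the partner's own database query."""
    raise NotImplementedError

def build_incremental_body(tenant_id: str, last_upload: datetime.date) -> dict:
    """Build an upload body containing only the changes since last_upload."""
    changes = load_transactions_since(tenant_id, last_upload)
    return {
        # One dataset per stock item that has new transactions; never
        # re-upload the full transaction history in a scheduled job.
        "datasets": [
            {"datasetId": item_id, "transactions": txs}
            for item_id, txs in changes.items()
        ],
    }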

Data Policy

We store the data that clients upload, so there is no need to upload data every time predictions are needed. Read about our data policy here.

Step-by-Step Instructions

  1. Create a JSON file containing the datasets for the tenant.
    • Make sure the JSON adheres to the request body of PUT /[presigned_url].
  2. Call GET /presigned_url with the clientId and tenantId in the header.
    • The endpoint returns a presigned URL and a jobId.
  3. Call PUT /[presigned_url] by inserting the URL from the previous step.
  4. Do one of the following (both variants are sketched after these steps):
    • Call GET /status with the tenantId and jobId in the header until the status is “success”.
    • Provide a webhook in the body of the PUT request in step 3 to receive a request when the job is finished running.
  5. If any datasets are invalid, the job status will be “invalid” and the invalid datasets will have their statuses set to “invalid”. In this case, no data will be stored. The error message in the status should explain what went wrong; if it does not, get in touch with us and we'll help you.
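
To tie the steps together, here is a sketch of the polling variant in Python using the requests library. The base URL, header names, and response fields (presignedUrl, jobId, status) follow the steps above but are otherwise assumptions; check the API reference for the exact names.

import json
import time
import requests

BASE_URL = "https://api.example.com"  # placeholder; use the real API host

def run_upload_job(client_id: str, tenant_id: str, body: dict) -> str:
    # Step 2: get a presigned URL and a job id.
    resp = requests.get(
        f"{BASE_URL}/presigned_url",
        headers={"clientId": client_id, "tenantId": tenant_id},
    )
    resp.raise_for_status()
    presigned_url = resp.json()["presignedUrl"]  # assumed response field
    job_id = resp.json()["jobId"]                # assumed response field

    # Step 3: PUT the JSON body to the presigned URL.
    requests.put(presigned_url, data=json.dumps(body)).raise_for_status()

    # Step 4 (polling variant): check the job status until it finishes.
    while True:
        status_resp = requests.get(
            f"{BASE_URL}/status",
            headers={"tenantId": tenant_id, "jobId": job_id},
        )
        status_resp.raise_for_status()
        status = status_resp.json()["status"]    # assumed response field
        if status == "success":
            return job_id
        if status == "invalid":
            # Step 5: the per-dataset statuses carry the error messages.
            raise RuntimeError(f"Upload job {job_id} failed: {status_resp.json()}")
        time.sleep(10)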
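
If you prefer the webhook variant of step 4, a minimal receiver could look like the Flask sketch below. The payload fields (jobId, status) are assumptions; verify them against the webhook documentation.

from flask import Flask, request

app = Flask(__name__)

@app.post("/inventory-webhook")
def job_finished():
    payload = request.get_json()
    # Assumed payload fields; verify against the webhook documentation.
    print(f"Job {payload.get('jobId')} finished with status {payload.get('status')}")
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)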

Error Messages