Getting Started

Let's start with a simple notebook.


  • Go to TIR AI Platform

  • Create a New Project. Enter a suitable name for your project (e.g. sentinel)

  • Visit the Notebooks tab and click Create a Notebook

  • Enter a name for the notebook, if desired.

  • Choose a notebook image, e.g. PyTorch 2

  • Choose a machine plan, e.g. CPU3.8_Free. Leave the rest of the fields at their defaults

  • Click CREATE

  • Wait for the notebook to come to a running state. Click the refresh icon in the status column to monitor progress.

  • When the notebook is in a running state, either click on the notebook name or the three dots (…) to launch it. The project explorer (left sidebar) also displays quick-launch links for your notebooks.

A new window or tab will open in your browser with a Jupyter notebook ready for your use. When it does, you are all set to create magic.
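Once the notebook is open, you can run a quick sanity check in a cell. This sketch uses only the standard library; the PyTorch check assumes you picked the PyTorch 2 image and degrades gracefully elsewhere.

```python
import sys

# Confirm the Python runtime inside the notebook.
print("Python", sys.version.split()[0])

# The PyTorch 2 image should ship with torch pre-installed;
# on other images this simply reports that it is missing.
try:
    import torch
    print("PyTorch", torch.__version__)
except ImportError:
    print("PyTorch is not installed in this environment")
```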


Now that you are comfortable with notebooks, let's look at creating a dataset.

  1. Go to the TIR AI Platform

  2. Create a new project or select an existing one

  3. Go to Datasets tab

  4. Click Create Dataset

  5. Choose the bucket type New EOS Bucket. This creates a new EOS bucket tied to your account, along with access keys for it.

  6. Enter a name for your dataset (e.g. paws)

  7. Click on CREATE

  8. Note down the Bucket name, Access Key and Secret Key. You will need them to upload data later on. An easier approach is to copy the mc command from the Setup Minio CLI tab and paste it into your command line.

  9. Wait for the dataset to come to a ready state. Click the refresh icon in the status column to monitor progress.

  10. When the dataset is ready, locate and click on your dataset row.

  11. You should see two tabs, Details and Objects, at the bottom of the page

  12. Click on Objects tab

  13. Upload any files of your choice here. Though this is the easier option, we recommend using mc (MinIO CLI) for larger datasets
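For larger uploads, the mc commands typically look like the sketch below. The alias name, endpoint URL, bucket name, and credentials are placeholders; substitute the values you noted down in step 8.

```shell
# Register the EOS endpoint under a local alias (values from the dataset page).
mc alias set paws <EOS-ENDPOINT-URL> <ACCESS_KEY> <SECRET_KEY>

# Copy a local directory into the dataset bucket, recursively.
mc cp --recursive ./my-data/ paws/<bucket-name>/

# Verify the upload.
mc ls paws/<bucket-name>/
```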

Using Datasets with Notebooks

  1. Go back to Notebooks tab

  2. Create a New Notebook and enter all the inputs in the form. In the dataset field, select the dataset we just created (e.g. paws)

  3. Click CREATE

  4. When the notebook is ready, launch JupyterLab.

  5. Enter the following command in a Jupyter notebook cell and run it:

    ls /datasets/

If all went well, you should see your dataset name in the result. If you go deeper into the directory, you will see the files you uploaded from the Objects tab.
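You can do the same check from Python with the standard library. The /datasets mount path comes from the step above; the helper below returns an empty list when the path is absent, so it is safe to run outside the notebook too.

```python
import os

def list_datasets(root="/datasets"):
    """Return the entries under the datasets mount, or [] if it is absent."""
    if not os.path.isdir(root):
        return []
    return sorted(os.listdir(root))

print(list_datasets())
```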

Model and Model Endpoints

In this example, we will deploy a TorchServe-based model, but TIR also supports other serving frameworks such as Triton (ONNX, PyTorch, TensorFlow, etc.) and TF Serving. You can find more details in the Models section.

  1. Go to Models tab

  2. Click Create Model

  3. Enter a model name and click CREATE to generate EOS credentials and setup commands

  4. Use the mc command to configure the MinIO CLI on your notebook (hosted or local) or your local desktop.

  5. Run the following command to confirm the setup works:

    mc ls <model-name>/<model-eos-bucket>

  6. TorchServe requires a model archive to serve the API. For this step, you can use your own model archive or download this mnist archive

  7. If you downloaded the mnist archive from the link in the prior step, unzip it

  8. Upload the model archive (it must include a model-store directory) to your model EOS bucket (see step 5).

  9. Then list the objects in your model EOS bucket (use mc ls) and ensure the structure is similar to the one below.

    ├── config
    │   └── ...
    └── model-store
        └── mnist.mar

  10. Now that the model store is ready, go to the Model Endpoints section.

  11. Click Create Endpoint

  12. Select the model name (same as created in step 3)

  13. Select the model format as pytorch

  14. Create the model endpoint.

  15. Use the instructions (e.g. the curl command) shown on the model endpoint page to test the model. If you are using our mnist model, use this tensor input file for testing.

    curl -k -H "Authorization: Bearer $AUTH_TOKEN" -X POST <projectid>/endpoint/<endpoint-id>/v1/models/mnist:predict -d @./mnist_request.json

    Note: The model endpoints follow the KServe inference protocol. For more details, see the KServe website.
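To call the endpoint from Python instead of curl, you can assemble the same request with the standard library. This is only a sketch: the path segments mirror the curl example above, the KServe v1 protocol wraps inputs in an "instances" list, and the endpoint IDs and tensor shape here are placeholder assumptions — substitute the exact values shown on your endpoint page.

```python
import json

def predict_path(endpoint_id, model_name="mnist", project_id="<projectid>"):
    """Build the KServe v1 predict path used in the curl example above."""
    return f"{project_id}/endpoint/{endpoint_id}/v1/models/{model_name}:predict"

def build_request(instances):
    """KServe v1 inference requests wrap the inputs in an 'instances' list."""
    return json.dumps({"instances": instances})

# Example with a dummy 1x1 "tensor"; a real mnist request carries 28x28 pixels.
print(predict_path("<endpoint-id>"))
print(build_request([[0.0]]))
```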