Ad-hoc File Uploads

Ad-hoc (one-time) upload is a feature that lets you manually upload files to replace table data. This is helpful when you need to refresh reference data. The ByteHouse web interface provides a convenient wizard to perform such tasks. Under the hood, ByteHouse loads ad-hoc files using data loading jobs, but it simplifies the process by combining job creation and job execution.

The following file types are supported for ad-hoc upload:

  • CSV
  • JSON
  • Excel
  • Avro
  • Parquet

Currently, the size of a locally uploaded file is limited to 40 MB.
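Before starting an upload, it can save time to check locally that a file matches one of the supported types and fits within the 40 MB limit. The sketch below is illustrative only (the extension list maps the supported formats to common file extensions; it is not part of any ByteHouse API):

```python
import os

# Supported ad-hoc upload formats, mapped to common file extensions (assumption),
# and the current 40 MB size limit from the documentation.
SUPPORTED_EXTENSIONS = {".csv", ".json", ".xlsx", ".xls", ".avro", ".parquet"}
MAX_UPLOAD_BYTES = 40 * 1024 * 1024

def is_uploadable(path: str) -> bool:
    """Return True if the file's extension and size are acceptable for ad-hoc upload."""
    ext = os.path.splitext(path)[1].lower()
    if ext not in SUPPORTED_EXTENSIONS:
        return False
    return os.path.getsize(path) <= MAX_UPLOAD_BYTES
```

Running this check before opening the wizard avoids a failed upload attempt for oversized or unsupported files.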

Create an Ad-hoc job

  1. Access the ad-hoc upload functionality from Data Import -> New Import Job.
  2. Select the File Upload job type from the pop-up.
  3. Select a local file to upload; you can also drag and drop local files.
  4. A few options let you further specify the format of the file. You can also use the Analyse from File feature to generate a schema, which you can then adjust.
    For tabular formats such as CSV or Excel, if the file doesn't have a header, the columns are named in the _cX format. Sample values are shown for each column, which helps you understand the data.
  5. Select an existing table to load the file into, or create a new table if it doesn't exist yet. You can customize which columns are loaded and ignore columns that you wish to drop from the source.
  6. Give the ad-hoc upload a name, then click the Create button to start loading the file.
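The _cX column naming and sample values described in step 4 can be illustrated with a small sketch. The type-guessing logic here is a deliberate simplification and does not reflect ByteHouse's actual schema inference:

```python
import csv
import io

def infer_schema(csv_text: str, sample_rows: int = 3):
    """Name headerless CSV columns _c0, _c1, ... and collect sample values per column."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    if not rows:
        return []
    schema = []
    for i in range(len(rows[0])):
        samples = [row[i] for row in rows[:sample_rows]]
        # Naive type guess (assumption): integer if every sample parses as an
        # integer, otherwise string.
        col_type = "Int64" if all(s.lstrip("-").isdigit() for s in samples) else "String"
        schema.append({"name": f"_c{i}", "type": col_type, "samples": samples})
    return schema
```

For example, a headerless file with rows `1,foo` and `2,bar` yields columns `_c0` (integer-like) and `_c1` (string-like), each with its sample values attached.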

View and Edit an Ad-hoc job

You can find the file upload job on the Job History page. On the job detail page, you can also start a new execution to upload another file with the same configuration. Note that for every execution under the same job, the table schema and the format of the uploaded file must be the same.

If you want to edit a job's configuration, click the Edit button on the job detail page to update the column information as well as the target database and table.

For troubleshooting, open the execution detail page and check the loading status and error log for more information.
