Clarify handling of workspace paths #545

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Open
wants to merge 1 commit into
base: draft
Choose a base branch
from
Open
Show file tree
Hide file tree
Changes from all commits
Commits
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -36,6 +36,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- `filter_bbox`, `load_collection`, `load_stac`: Clarified that the bounding box is reprojected to the CRS of the spatial data cube dimensions if required.
- `filter_spatial`: Clarified that masking is applied using the given geometries. [#469](https://github.com/Open-EO/openeo-processes/issues/469)
- `load_collection` and `load_stac`: Clarified that scale and offset are not applied automatically when loading the data. [#503](https://github.com/Open-EO/openeo-processes/issues/503)
+ - `load_uploaded_files` and `run_udf`: Clarified handling of file paths and added `FileNotFound` exception. [#461](https://github.com/Open-EO/openeo-processes/issues/461)
- `mod`: Clarified behavior for y = 0.
- `sqrt`: Clarified that NaN is returned for negative numbers.
- Clarified allowed `FeatureCollection` geometries in `load_collection`, `mask_polygon`, `apply_polygon`, and `load_stac` [#527](https://github.com/Open-EO/openeo-processes/issues/527)
8 changes: 4 additions & 4 deletions proposals/export_workspace.json
@@ -1,7 +1,7 @@
{
"id": "export_workspace",
"summary": "Export data to a cloud user workspace",
"description": "Exports the given processing results made available through a STAC resource (e.g., a STAC Collection) to the given user workspace. The STAC resource itself is exported with all STAC resources and assets underneath.",
"summary": "Export data to a cloud workspace",
"description": "Exports the given processing results made available through a STAC resource (e.g., a STAC Collection) to the given cloud workspace. The STAC resource itself is exported with all STAC resources and assets underneath.",
"categories": [
"export",
"stac"
@@ -10,15 +10,15 @@
"parameters": [
{
"name": "data",
"description": "The data to export to the user workspace as a STAC resource.",
"description": "The data to export to the cloud workspace as a STAC resource.",
"schema": {
"type": "object",
"subtype": "stac"
}
},
{
"name": "workspace",
"description": "The identifier of the workspace to export to.",
"description": "The identifier of the cloud workspace to export to.",
"schema": {
"type": "string",
"pattern": "^[\\w\\-\\.~]+$",
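For orientation, here is a minimal sketch of what a process-graph node invoking `export_workspace` could look like. The node name `save1` and the identifier `my-workspace` are hypothetical placeholders; only the `pattern` constraint comes from the schema above:

```python
import re

# Workspace identifiers must match the pattern from the schema above.
WORKSPACE_ID = re.compile(r"^[\w\-\.~]+$")

# Hypothetical process-graph node; "save1" and "my-workspace" are
# placeholders, not part of the specification.
export_node = {
    "process_id": "export_workspace",
    "arguments": {
        "data": {"from_node": "save1"},  # STAC resource produced by an earlier node
        "workspace": "my-workspace",
    },
}

assert WORKSPACE_ID.match(export_node["arguments"]["workspace"])
```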
113 changes: 58 additions & 55 deletions proposals/load_uploaded_files.json
@@ -1,55 +1,58 @@
- {
- "id": "load_uploaded_files",
- "summary": "Load files from the user workspace",
- "description": "Loads one or more user-uploaded files from the server-side workspace of the authenticated user and returns them as a single data cube. The files must have been stored by the authenticated user on the back-end currently connected to.",
- "categories": [
- "cubes",
- "import"
- ],
- "experimental": true,
- "parameters": [
- {
- "name": "paths",
- "description": "The files to read. Folders can't be specified, specify all files instead. An exception is thrown if a file can't be read.",
- "schema": {
- "type": "array",
- "subtype": "file-paths",
- "items": {
- "type": "string",
- "subtype": "file-path",
- "pattern": "^[^\r\n\\:'\"]+$"
- }
- }
- },
- {
- "name": "format",
- "description": "The file format to read from. It must be one of the values that the server reports as supported input file formats, which usually correspond to the short GDAL/OGR codes. If the format is not suitable for loading the data, a `FormatUnsuitable` exception will be thrown. This parameter is *case insensitive*.",
- "schema": {
- "type": "string",
- "subtype": "input-format"
- }
- },
- {
- "name": "options",
- "description": "The file format parameters to be used to read the files. Must correspond to the parameters that the server reports as supported parameters for the chosen `format`. The parameter names and valid values usually correspond to the GDAL/OGR format options.",
- "schema": {
- "type": "object",
- "subtype": "input-format-options"
- },
- "default": {},
- "optional": true
- }
- ],
- "returns": {
- "description": "A data cube for further processing.",
- "schema": {
- "type": "object",
- "subtype": "datacube"
- }
- },
- "exceptions": {
- "FormatUnsuitable": {
- "message": "Data can't be loaded with the requested input format."
- }
- }
- }
+ {
+ "id": "load_uploaded_files",
+ "summary": "Load files from the user workspace",
+ "description": "Loads one or more user-uploaded files from the server-side workspace of the authenticated user and returns them as a single data cube. The files must have been stored by the authenticated user on the back-end currently connected to.",
+ "categories": [
+ "cubes",
+ "import"
+ ],
+ "experimental": true,
+ "parameters": [
+ {
+ "name": "paths",
+ "description": "The files to read. Folders can't be specified; specify all files instead. An exception is thrown if a file can't be read.\n\nFile paths are relative to the file workspace of the user. The workspace is the root folder, i.e., the paths `/folder/file.txt`, `folder/file.txt`, and `./folder/file.txt` are all equivalent. Specifying paths outside of the workspace is not allowed and throws a `FileNotFound` exception.",
+ "schema": {
+ "type": "array",
+ "subtype": "file-paths",
+ "items": {
+ "type": "string",
+ "subtype": "file-path",
+ "pattern": "^[^\r\n\\:'\"]+$"
+ }
+ }
+ },
+ {
+ "name": "format",
+ "description": "The file format to read from. It must be one of the values that the server reports as supported input file formats, which usually correspond to the short GDAL/OGR codes. If the format is not suitable for loading the data, a `FormatUnsuitable` exception will be thrown. This parameter is *case insensitive*.",
+ "schema": {
+ "type": "string",
+ "subtype": "input-format"
+ }
+ },
+ {
+ "name": "options",
+ "description": "The file format parameters to be used to read the files. Must correspond to the parameters that the server reports as supported parameters for the chosen `format`. The parameter names and valid values usually correspond to the GDAL/OGR format options.",
+ "schema": {
+ "type": "object",
+ "subtype": "input-format-options"
+ },
+ "default": {},
+ "optional": true
+ }
+ ],
+ "returns": {
+ "description": "A data cube for further processing.",
+ "schema": {
+ "type": "object",
+ "subtype": "datacube"
+ }
+ },
+ "exceptions": {
+ "FormatUnsuitable": {
+ "message": "Data can't be loaded with the requested input format."
+ },
+ "FileNotFound": {
+ "message": "The specified file does not exist."
+ }
+ }
+ }
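To make the path semantics above concrete, here is a minimal Python sketch of how a back-end might resolve user-supplied paths against the workspace root. `resolve_workspace_path` is a hypothetical helper, and Python's built-in `FileNotFoundError` stands in for the `FileNotFound` exception defined above:

```python
from pathlib import PurePosixPath

def resolve_workspace_path(workspace_root: str, user_path: str) -> PurePosixPath:
    """Resolve a user-supplied path relative to the workspace root.

    "/folder/file.txt", "folder/file.txt" and "./folder/file.txt" all
    resolve to the same file; paths escaping the workspace are rejected.
    """
    # The workspace is the root folder, so a leading "/" is equivalent
    # to a relative path.
    parts: list[str] = []
    for part in PurePosixPath(user_path.lstrip("/")).parts:
        if part == ".":
            continue            # "./" adds nothing
        elif part == "..":
            if not parts:       # would climb above the workspace root
                raise FileNotFoundError("The specified file does not exist.")
            parts.pop()
        else:
            parts.append(part)
    return PurePosixPath(workspace_root).joinpath(*parts)

# All three spellings from the description are equivalent:
root = "/data/workspaces/alice"
assert (
    resolve_workspace_path(root, "/folder/file.txt")
    == resolve_workspace_path(root, "folder/file.txt")
    == resolve_workspace_path(root, "./folder/file.txt")
)
```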
5 changes: 4 additions & 1 deletion run_udf.json
@@ -38,7 +38,7 @@
"pattern": "^https?://"
},
{
"description": "Path to a UDF uploaded to the server.",
"description": "Path to a UDF uploaded to the server.\n\nFile paths are relative to the file workspace of the user. The workspace is the root folder, i.e. the paths `/folder/file.txt` and `folder/file.txt` and `./folder/file.txt` are all equivalent. Specifying paths outside of the workspace is not allowed and throws a `FileNotFound` exception.",
"type": "string",
"subtype": "file-path",
"pattern": "^[^\r\n\\:'\"]+$"
@@ -91,6 +91,9 @@
},
"InvalidVersion": {
"message": "The specified UDF runtime version is not supported."
+ },
+ "FileNotFound": {
+ "message": "The specified file does not exist."
}
},
"returns": {
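Finally, a sketch of the client side: a hypothetical `run_udf` process-graph node referencing a workspace-relative UDF path. The path `udfs/my_udf.py` and the runtime `Python` are placeholders; the assertion only checks the `pattern` from the schema above:

```python
import re

# File paths must match the pattern from the schema above.
FILE_PATH = re.compile(r"""^[^\r\n\\:'"]+$""")

# Hypothetical run_udf node; "udfs/my_udf.py" is a placeholder path,
# relative to the user's file workspace ("/udfs/my_udf.py" and
# "./udfs/my_udf.py" would refer to the same file).
run_udf_node = {
    "process_id": "run_udf",
    "arguments": {
        "data": {"from_parameter": "data"},
        "udf": "udfs/my_udf.py",
        "runtime": "Python",
    },
}

assert FILE_PATH.match(run_udf_node["arguments"]["udf"])
```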