Last date modified: 2025-Jun-17
Import documents
You can use the Import Service API to add documents to a workspace programmatically.
On GitHub, you can find comprehensive samples illustrating how to import native documents, images, objects, and productions. For additional code samples, see the Import Service API samples repository.
As of September 1, 2024, we’ve streamlined our Staging Area boundaries to more effectively separate staging data from workspace and system data. With this change, you can no longer write to or access data outside of the four defined staging area folders: ARM, ProcessingSource, StructuredData, and TenantVM. Folders that were removed include FTA, Temp, Export, and dtSearch, as well as any folders that you created manually. For details, refer to the Staging Area documentation.
Considerations
- When setting up the data source using Import.SDK, ensure that the path to the load file adheres to Staging Governance guidelines. The load file should be placed within the StructuredData\Import directory.
- For fields that contain file paths (such as Extracted Text or Native File Path), Import.SDK only supports importing files located in the load file's folder or in subfolders of that folder. This applies to both relative and absolute paths. For example, if the load file is located at \\files\T000\StructuredData\Import\Asia\loadfile.dat:
| Supported file location | Absolute path in load file | Relative path in load file |
|---|---|---|
| \\files\T000\StructuredData\Import\Asia\sample5.pdf | \\files\T000\StructuredData\Import\Asia\sample5.pdf | .\sample5.pdf |
| \\files\T000\StructuredData\Import\Asia\001\documentation.pdf | \\files\T000\StructuredData\Import\Asia\001\documentation.pdf | .\001\documentation.pdf |
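The location rule above can be sketched as a quick check. This is an illustrative helper, not part of the API: it reports whether a path referenced in a load file resolves inside the load file's directory (the only location Import.SDK supports). The folder used below is the example location from the table.

```shell
# Example load file folder from the table above.
loadfile_dir='\\files\T000\StructuredData\Import\Asia'

# Hypothetical helper: "supported" if the referenced path is the load
# file's folder or a subfolder of it, "unsupported" otherwise.
is_supported() {
  case "$1" in
    "$loadfile_dir"'\'*) echo "supported" ;;   # absolute path under the load file folder
    '.\'*)               echo "supported" ;;   # relative path, resolved against that folder
    *)                   echo "unsupported" ;;
  esac
}

is_supported '\\files\T000\StructuredData\Import\Asia\001\documentation.pdf'  # prints "supported"
is_supported '.\sample5.pdf'                                                  # prints "supported"
is_supported '\\files\T000\StructuredData\Export\other.pdf'                   # prints "unsupported"
```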
Create an import job
The following code sample illustrates how to create an import job entity in a particular workspace. The job is identified by a unique ID that you generate and provide in the request.
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235' \
-H 'X-CSRF-Header: -' \
-H 'Content-Type: application/json' \
-d '{
"applicationName": "simpleImportDocuments",
"correlationID": "c0r31ati0n_ID"
}'
Create an import job configuration
The following code sample illustrates how to configure an existing import job by defining its key parameters: import type, mode, and field mappings.
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235/documents-configurations/' \
-H 'X-CSRF-Header: -' \
-H 'Content-Type: application/json' \
-d "$importSettingsPayloadJson"
{
  "importSettings": {
    "Overlay": null,
    "Native": {
      "FilePathColumnIndex": "22",
      "FileNameColumnIndex": "13"
    },
    "Image": null,
    "Fields": {
      "FieldMappings": [
        {
          "ColumnIndex": 0,
          "Field": "Control Number",
          "ContainsID": false,
          "ContainsFilePath": false
        }
      ]
    },
    "Folder": {
      "RootFolderID": 1003663,
      "FolderPathColumnIndex": 2
    }
  }
}
ImportDocumentSettings importSettings = ImportDocumentSettingsBuilder.Create()
    .WithAppendMode()
    .WithNatives(x => x
        .WithFilePathDefinedInColumn(filePathColumnIndex)
        .WithFileNameDefinedInColumn(fileNameColumnIndex))
    .WithoutImages()
    .WithFieldsMapped(x => x
        .WithField(controlNumberColumnIndex, "Control Number"))
    .WithFolders(f => f
        .WithRootFolderID(rootFolderId, r => r
            .WithFolderPathDefinedInColumn(folderPathColumnIndex)));
ImportDocumentSettings importSettings = new ImportDocumentSettings()
{
Overlay = null,
Native = new NativeSettings
{
FileNameColumnIndex = fileNameColumnIndex,
FilePathColumnIndex = filePathColumnIndex,
},
Fields = new FieldsSettings
{
FieldMappings = new[]
{
new FieldMapping
{
Field = "Control Number",
ContainsID = false,
ColumnIndex = 0,
ContainsFilePath = false,
},
},
},
Folder = new FolderSettings
{
FolderPathColumnIndex = folderPathColumnIndex,
RootFolderID = 1003663,
},
Other = null,
};
Add data source
The following code sample illustrates how to create a data source entity for a particular import job. A data source represents the configuration for one data set being imported and is identified by a unique ID that you generate and provide in the request. Its configuration includes the path to the load file and other parameters that determine how the system reads and interprets the data in the load file.
You can add many data sources to the same import job, either before or after the job is started, including while it is running.
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235/sources/0cb922a2-8df4-42fd-9429-c241410a0d1e' \
-H 'X-CSRF-Header: -' \
-H 'Content-Type: application/json' \
-d "$dataSourceSettingsJson"
{
  "dataSourceSettings": {
    "Path": "C:\\DefaultFileRepository\\samples\\load_file.dat",
    "FirstLineContainsColumnNames": true,
    "StartLine": 1,
    "ColumnDelimiter": "|",
    "QuoteDelimiter": "^",
    "NewLineDelimiter": "#",
    "MultiValueDelimiter": ";",
    "NestedValueDelimiter": "&",
    "EndOfLine": 0,
    "Encoding": null,
    "CultureInfo": "en-US",
    "Type": 2
  }
}
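The curl sample for this endpoint passes the settings in the `$dataSourceSettingsJson` shell variable. One way to populate that variable (a sketch; the variable name is taken from the sample above) is a quoted here-document, which keeps the backslashes in the path literal:

```shell
# Store the data source settings JSON in the variable used by the curl
# sample above. The quoted EOF prevents shell expansion inside the JSON.
dataSourceSettingsJson=$(cat <<'EOF'
{
  "dataSourceSettings": {
    "Path": "C:\\DefaultFileRepository\\samples\\load_file.dat",
    "FirstLineContainsColumnNames": true,
    "StartLine": 1,
    "ColumnDelimiter": "|",
    "QuoteDelimiter": "^",
    "NewLineDelimiter": "#",
    "MultiValueDelimiter": ";",
    "NestedValueDelimiter": "&",
    "EndOfLine": 0,
    "Encoding": null,
    "CultureInfo": "en-US",
    "Type": 2
  }
}
EOF
)
```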
DataSourceSettings dataSourceSettings = DataSourceSettingsBuilder.Create()
.ForLoadFile("C:\\DefaultFileRepository\\samples\\load_file.dat")
.WithDelimiters(d => d
.WithColumnDelimiters('|')
.WithQuoteDelimiter('^')
.WithNewLineDelimiter('#')
.WithNestedValueDelimiter('&')
.WithMultiValueDelimiter(';'))
.WithFirstLineContainingHeaders()
.WithEndOfLineForWindows()
.WithStartFromBeginning()
.WithDefaultEncoding()
.WithDefaultCultureInfo();
DataSourceSettings dataSourceSettings = new DataSourceSettings
{
Type = DataSourceType.LoadFile,
Path = "C:\\DefaultFileRepository\\samples\\load_file.dat",
NewLineDelimiter = '#',
ColumnDelimiter = '|',
QuoteDelimiter = '^',
MultiValueDelimiter = ';',
NestedValueDelimiter = '&',
Encoding = null,
CultureInfo = "en-us",
EndOfLine = DataSourceEndOfLine.Windows,
FirstLineContainsColumnNames = true,
StartLine = 0,
};
Begin import job
The following code sample illustrates how to start an import job by calling begin. Beginning a job schedules its data for import; the data is not imported instantly, but the job's data sources are added to the queue and processed in turn. The import job state and the data source state show the current stage of the import.
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235/begin/' \
-H 'X-CSRF-Header: -' \
-d ''
End import job
The following code sample illustrates how to end an import job that has already started. It is highly recommended to end the job once no more data sources will be added to it. All data sources added to the job before the end request is sent will be imported.
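A sketch of the end request, assuming the route mirrors the begin endpoint shown above (same placeholder host, workspace ID, and job ID); verify the exact route against the Import Service API reference:

```shell
# Assumption: /end/ mirrors the /begin/ route from the previous sample.
curl -X POST 'https://relativity-host/Relativity.REST/api/import-service/v1/workspaces/10000/import-jobs/e694ad62-198d-4ecb-936d-1862ddfa4235/end/' \
-H 'X-CSRF-Header: -' \
-d ''
```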