I would like to download the artifacts produced by multiple jobs into a single folder, without per-job subdirectories.
Here is a schema of what I'd like to accomplish:
jobs:
- job: job1
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - bash: |
      printf "Hello from job1\n" > $(Pipeline.Workspace)/file.1
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: $(Pipeline.Workspace)/file.1
- job: job2
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - bash: |
      printf "Hello from job2\n" > $(Pipeline.Workspace)/file.2
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: $(Pipeline.Workspace)/file.2
- job: check_prev_jobs
  dependsOn: [job1, job2]  # i.e. all other jobs
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - bash: |
      mkdir -p $(Pipeline.Workspace)/previous_artifacts
  - task: DownloadPipelineArtifact@2
    inputs:
      source: current
      path: $(Pipeline.Workspace)/previous_artifacts
After the download, the directory $(Pipeline.Workspace)/previous_artifacts should contain only file.1 and file.2 directly, rather than subdirectories job1 and job2 holding file.1 and file.2 respectively.
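For context, here is a sketch of the manual flattening I'd rather not need. It simulates the per-job layout I'm seeing after the download and then moves the files up one level; the paths are illustrative, not what the task actually uses:

```shell
# Illustrative only: recreate the per-job layout that appears after the
# download, then flatten it by hand.
dest=$(mktemp -d)
mkdir -p "$dest/job1" "$dest/job2"
printf 'Hello from job1\n' > "$dest/job1/file.1"
printf 'Hello from job2\n' > "$dest/job2/file.2"

# Move every file up into $dest, then delete the now-empty job directories.
find "$dest" -mindepth 2 -type f -exec mv -t "$dest" {} +
find "$dest" -mindepth 1 -type d -empty -delete

ls "$dest"   # file.1  file.2
```

I could run something like this as an extra step in check_prev_jobs, but I'm hoping there is a built-in option that avoids it.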
Thanks!