Pipeline
When creating a Job you'll notice that you can add more than one Action to the Job. The Job acts much like a pipeline: you can pipe the results from one Action into the next. This just requires some extra consideration when developing the Action.
Concept
Each time an Action completes, it produces results. The BrazenAgent stores these as a zip file and supplies them on request to a subsequent Action, a Service, or an interactive session. In all three cases, the results are retrieved behind the scenes with the BrazenCloud utility's download command.
When you run the download command in the context of a Job, the BrazenCloud utility contextually knows what to collect. All the Action needs to do is execute the download command and then extract the contents of the downloaded zip files, which end up in the .\download folder by default.
PowerShell Example
In PowerShell, you can do this by invoking the utility directly and then, in PowerShell 5+, extracting with Expand-Archive:
$settings = Get-Content .\settings.json | ConvertFrom-Json
$outFolder = '.\download'
.\runway.exe -N -S $settings.host download --directory $outFolder
$zipFiles = Get-ChildItem "$outFolder\*.zip"
foreach ($zip in $zipFiles) {
    Expand-Archive -Path $zip.FullName -DestinationPath "$outFolder\extracted"
}
Python Example
In Python, you can do this with subprocess.Popen(), glob.glob(), and ZipFile.extractall():
import json
import subprocess
import glob
import zipfile
# Load the host from settings.json, closing the file automatically.
with open(".\\settings.json", "r") as settings_file:
    settings = json.load(settings_file)
runway_proc = subprocess.Popen(["runway.exe", "-N", "-S", settings["host"], "download"])
runway_proc.wait()
# glob returns paths that already include the .\download prefix.
all_zip_files = glob.glob(".\\download\\*.zip")
for zip_file in all_zip_files:
    with zipfile.ZipFile(zip_file, 'r') as zip_ref:
        zip_ref.extractall(".\\download\\extracted")
Next
Now you can do whatever you please with the data you just downloaded. For example, if you downloaded JSON from a previous Action, you can format it for upload into your Elasticsearch database.
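As a sketch of that last step, the snippet below converts a list of JSON records into the newline-delimited format Elasticsearch's bulk API expects. The helper name to_bulk_ndjson, the index name, and the sample records are all assumptions for illustration, not part of BrazenCloud:

```python
import json

# Hypothetical helper: convert a list of JSON records into Elasticsearch
# bulk-API NDJSON, where each document is preceded by an "index" action line.
# The index name "brazencloud-results" is an assumed placeholder.
def to_bulk_ndjson(records, index="brazencloud-results"):
    lines = []
    for record in records:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(record))
    # The bulk API requires a trailing newline after the final line.
    return "\n".join(lines) + "\n"

# Example records as they might appear in a previous Action's output.
records = [{"hostname": "agent-01", "status": "ok"},
           {"hostname": "agent-02", "status": "error"}]
print(to_bulk_ndjson(records), end="")
```

The resulting string can be POSTed to the _bulk endpoint with any HTTP client.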