Cheat-sheet
This page provides high-level information that can be useful when running the self-managed identity analytics solution, a cheat sheet of sorts.
Default credentials
Upon first login, the default credentials are:
- user/login: setup
- password: Brainwave@2023
Hostname
The hostname MUST be lowercase.
The hostname configuration can be changed using the brainwave config --hostname demo.local command. Change demo.local to match your environment's configuration.
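For example, a minimal shell sketch; the tr step is only an illustration of forcing the value to lowercase before applying it, and Demo.Local is a placeholder:
XXXX@XXXX:~$ HOST=$(echo "Demo.Local" | tr '[:upper:]' '[:lower:]')
XXXX@XXXX:~$ brainwave config --hostname "$HOST"
The resulting value is demo.local, which satisfies the lowercase requirement.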
TLS
To toggle TLS:
- Enable: brainwave config --tls
- Disable: brainwave config --tls=false
The certificate names (in /etc/brainwave when in server mode) must match your hostname.
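For example, a hedged check that the certificate matches the configured hostname (the file name demo.local.crt under /etc/brainwave is an assumption; use the actual name of your certificate):
XXXX@XXXX:~$ brainwave config --tls
XXXX@XXXX:~$ openssl x509 -noout -subject -in /etc/brainwave/demo.local.crt
The subject (or subject alternative name) should contain the hostname you configured, demo.local in this example.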
Local access and hosts file
If you are running a demonstration or sandbox environment, add the hostname you configured to the /etc/hosts file, for example:
127.0.0.1 demo.local
You can then access the portal from that environment using http://demo.local or https://demo.local, depending on your TLS settings.
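As a quick sanity check from that environment (the curl -k option is only needed if the certificate is self-signed, which is an assumption about your setup):
XXXX@XXXX:~$ echo "127.0.0.1 demo.local" | sudo tee -a /etc/hosts
XXXX@XXXX:~$ curl -kI https://demo.local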
Data Location
Please use the following link for more information on the folders (paths) used by the self-managed solution:
Update / upgrade
To check for updates, run brainwave status.
If an update is available, it will show up in the console with a ‼ next to the Latest available version:
XXXX@XXXX:~$ brainwave status
Installation mode: Server
Project name: sandbox
Client version: 1.2.21
Installed version: 1.2.153
Latest available version: 1.2.183 ‼
Registry: igrcanalytics.azurecr.io
Git configuration: Valid √
Images: All present √
Services Stopped ‼
To install the update, run brainwave admin upgrade, for example:
XXXX@XXXX:~$ brainwave admin upgrade
Finding latest 1.2 version
● Will upgrade to version 1.2.183
Proceed with upgrade [y/n]: y
√ Remove containers [Complete]
√ Run container bwresources [Complete]
√ Upgraded application to version 1.2.183
Then run brainwave pull.
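As a recap, a typical update run chains the three commands shown above (the trailing comments are illustrative; refer to the steps described in this section):
XXXX@XXXX:~$ brainwave status          # check for a newer Latest available version
XXXX@XXXX:~$ brainwave admin upgrade   # apply the upgrade
XXXX@XXXX:~$ brainwave pull            # then pull, as described above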
Importfiles management
Source files are located here:
- /var/lib/brainwave/sourcefiles: files uploaded via the portal, or generated by connectors
- /var/lib/brainwave/importfiles: files from the sourcefiles folder are copied here and modified if needed for the data ingestion (timeslot)
The /var/lib/brainwave/importfiles folder is mapped as a Docker volume, projectName_bwimportfiles. This volume is mounted in the bwbatch container as /data/importfiles.
This is why, in the docker.configuration, the importfiles variable points to /data/importfiles. Its content should be the same as /var/lib/brainwave/importfiles.
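A hedged way to verify this mapping; the volume name sandbox_bwimportfiles assumes a project named sandbox, and the container name bwbatch may carry a project prefix in your environment:
XXXX@XXXX:~$ docker volume inspect sandbox_bwimportfiles
XXXX@XXXX:~$ docker exec bwbatch ls /data/importfiles
XXXX@XXXX:~$ ls /var/lib/brainwave/importfiles
The last two listings should show the same files.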
To use external data sources (not extracted via integrated connectors), you will need to use the bw_data_collector.ps1 script. You can configure it to upload each needed folder and file.
After doing this, you will need to create a Generic bridge - file datasource. This will simply copy the files that were uploaded to /var/lib/brainwave/sourcefiles into /var/lib/brainwave/importfiles, where they will be ingested by your pre-configured collects.
Keep in mind that you will not be able to use {config.projectPath}/importfiles; instead, you should use a variable containing /data/importfiles.
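Once the bridge has run, a simple hedged check is to compare the two folders (the file names listed will be whatever your collector uploaded):
XXXX@XXXX:~$ ls -l /var/lib/brainwave/sourcefiles
XXXX@XXXX:~$ ls -l /var/lib/brainwave/importfiles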
CLI documentation generation
It is possible to generate the documentation of the Brainwave CLI in Markdown format.
To do so, use the brainwave gendoc command:
PS ~> brainwave gendoc --help
Generate documentation
Usage:
brainwave gendoc [flags]
Flags:
-h, --help help for gendoc
-o, --output string Output directory (default "./doc")
Global Flags:
--console-logs Also show logs on the console
-d, --debug count Increase log level
--no-color Turn off colorization
-q, --quiet count Decrease log level
This provides you with the full list of available commands.
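For example, to generate the documentation into a specific folder (the directory name cli-doc is arbitrary; the default is ./doc):
PS ~> brainwave gendoc -o ./cli-doc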