
Netskope

Overview

Netskope provides visibility and real-time data and threat protection. This integration helps users manage Netskope events and alerts from D3 SOAR.

D3 SOAR provides REST operations to function with Netskope.

Netskope is available for use in:

  • D3 SOAR: V12.7.83.0+

  • Category: Network Security

  • Deployment Options: Option II, Option IV

Connection

To connect to Netskope from D3 SOAR, collect the following information:

  • Server URL: The server URL of your Netskope tenant. Example: https://alliances.goskope.com

  • Token: The API token used to establish the connection to Netskope. Example: c4*****a8

  • API Version: The version of the API to use for the connection. Example: v1
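
Before configuring the connection in D3 SOAR, you can optionally verify the token outside of D3 SOAR. The snippet below is a minimal sketch, assuming the standard Netskope REST API v1 request pattern in which the Server URL, API Version, and Token are combined into a call such as <Server URL>/api/<API Version>/events with a token query parameter; the values shown are placeholders.

CODE
import requests  # third-party HTTP client

SERVER_URL = "https://alliances.goskope.com"  # the Server URL connection parameter
API_VERSION = "v1"                            # the API Version connection parameter
TOKEN = "c4*****a8"                           # the Token generated in the Netskope portal

# Request a single recent event; a "success" status indicates the URL and token are usable.
response = requests.get(
    f"{SERVER_URL}/api/{API_VERSION}/events",
    params={"token": TOKEN, "type": "page", "timeperiod": 3600, "limit": 1},
    timeout=30,
)
response.raise_for_status()
print(response.json().get("status"))  # expected: "success"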

Configuring Netskope to Work with D3 SOAR

  1. Log in to Netskope with your credentials.

  2. Find your app, then navigate to Settings.

  3. Click Tools.

  4. Click REST API v1, then click GENERATE NEW TOKEN to create a new token, or click SHOW to view the existing token.

Configuring D3 SOAR to Work with Netskope

  1. Log in to D3 SOAR.

  2. Find the Netskope integration.

    1. Navigate to Configuration on the top header menu.

    2. Click on the Integration icon on the left sidebar.

    3. Type Netskope in the search box to find the integration, then click it to select it.

    4. Click + New Connection, on the right side of the Connections section. A new connection window will appear.

  3. Configure the following fields to create a connection to Netskope.

    1. Connection Name: The desired name for the connection.

    2. Site: Specifies the site to use the integration connection. Use the drop-down menu to select the site. The Share to Internal Sites option enables all sites defined as internal sites to use the connection. Selecting a specific site will only enable that site to use the connection.

    3. Recipient site for events from connections Shared to Internal Sites: This field appears only if Share to Internal Sites is selected for Site. It lets you select the internal site to which the integration connection is deployed.

    4. Agent Name (Optional): Specifies the proxy agent required to build the connection. Use the dropdown menu to select the proxy agent from a list of previously configured proxy agents.

    5. Description (Optional): Add your desired description for the connection.

    6. Tenant (Optional): When configuring the connection from a master tenant site, you have the option to choose the specific tenant sites you want to share the connection with. Once you enable this setting, you can filter and select the desired tenant sites from the dropdowns to share the connection.

    7. Configure User Permissions: Defines which users have access to the connection.

    8. Active: Check the tick box to ensure the connection is available for use.

    9. System: This section contains the parameters defined specifically for the integration. These parameters must be configured to create the integration connection.
      1. Input your domain-level Server URL.
      2. Copy the Token from the Netskope platform. Refer to step 4 of Configuring Netskope to Work with D3 SOAR.
      3. Input the API Version. The default value is v1.

    10. Connection Health Check: Periodically checks and updates the status of the connection you have created. A connection health check is performed by scheduling the Test Connection command of this integration, and is available only while the connection is active.
      To set up a connection health check, check the Connection Health Check tick box. You can customize the interval (in minutes) for scheduling the health check. An email notification can also be set up to trigger after a specified number of failed connection attempts.

    11. Enable Password Vault: An optional feature that allows users to take the stored credentials from their own password vault. Please refer to the password vault connection guide if needed.

  4. Test the connection.

    1. Click Test Connection to verify the account credentials and network connection. If the Test Connection Passed alert window appears, the test connection is successful, and Passed with a green checkmark will appear beside the Test Connection button. If the test connection fails, please check your connection parameters and try again.

    2. Click OK to close the alert window.

    3. Click + Add to create and add the configured connection.

Commands

Netskope includes the following executable commands for users to set up schedules or create playbook workflows. With the Test Command, you can execute these commands independently for playbook troubleshooting.

Integration API Note

For more information about the Netskope API, please refer to the Netskope API reference.

Note for Time-related parameters

The input format of time-related parameters may vary based on your account settings. As a result, the sample data provided in our commands may differ from what you see. To set your preferred time format, follow these steps:

  1. Navigate to Configuration > Application Settings. Select Date/Time Format.

  2. Choose your desired date and time format.

After that, you will be able to view your preferred time format when configuring the DateTime input parameters for commands.
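
As a side note, the underlying Netskope v1 endpoints take start and end times as Unix epoch seconds (this is an assumption about the API, not something you enter in D3 SOAR), so DateTime inputs such as 2021-07-07 00:00 (UTC) are converted before the API is called. A minimal sketch of that conversion:

CODE
from datetime import datetime, timezone

def to_epoch_seconds(utc_text: str) -> int:
    """Convert a 'YYYY-MM-DD HH:MM' UTC string to Unix epoch seconds."""
    dt = datetime.strptime(utc_text, "%Y-%m-%d %H:%M").replace(tzinfo=timezone.utc)
    return int(dt.timestamp())

print(to_epoch_seconds("2021-07-07 00:00"))  # 1625616000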

Fetch Event

Returns events or alerts generated by Netskope with the specified search condition.

Reader Note

Please ensure that the value of the Type parameter aligns with the Is Alert parameter. In other words, if the selected type pertains to an alert, the Is Alert parameter must be set to True. Otherwise, errors will be returned.

Input

  • Start Time (Required): The start of the time range; fetches events or alerts generated after this timestamp, in UTC. Example: 2021-07-07 00:00

  • End Time (Required): The end of the time range; fetches events or alerts generated before this timestamp, in UTC. Example: 2021-07-08 00:00

  • Number of Event(s) Fetched (Optional): The maximum number of events or alerts to return. The default value is 50. Example: 50

  • Search Condition (Optional): The query string defining the search condition, in the format Field Operator 'Value'. The value must be enclosed in single quotation marks (e.g., app eq 'Google Gmail'). For detailed syntax information, refer to the Netskope API documentation at https://alliances.goskope.com/docs/Netskope_Help/en/get-alerts-data.html. Example: app eq 'Google Gmail'

  • Type (Required): The alert or event type to filter by. The available event types are Page, Application, Audit, Infrastructure, and Network. The available alert types are Anomaly, Compromised Credential, Policy, Legal Hold, Malsite, Malware, DLP (Data Loss Prevention), Security Assessment, Watchlist, Quarantine, and Remediation. Example: Page

  • Is Alert (Optional): Indicates whether the command fetches alerts rather than events. The available options are True and False; the default is False. Example: False
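
To illustrate how these inputs relate to the underlying API request, the sketch below assembles a fetch in the same shape. It assumes the v1 /events and /alerts endpoints with token, type, starttime, endtime, query, and limit query parameters; those endpoint and parameter names are assumptions about the Netskope API, not details taken from this guide.

CODE
import requests

def fetch_netskope(server_url: str, token: str, event_type: str, start_epoch: int,
                   end_epoch: int, query: str = "", limit: int = 50, is_alert: bool = False):
    """Fetch events or alerts, mirroring the Fetch Event inputs listed above."""
    # The Is Alert flag selects the endpoint; the Type value must match it (see the Reader Note).
    endpoint = "alerts" if is_alert else "events"
    params = {
        "token": token,
        "type": event_type,        # Type, e.g. "page" for events
        "starttime": start_epoch,  # Start Time as epoch seconds
        "endtime": end_epoch,      # End Time as epoch seconds
        "limit": limit,            # Number of Event(s) Fetched
    }
    if query:
        params["query"] = query    # Search Condition, e.g. "app eq 'Google Gmail'"
    response = requests.get(f"{server_url}/api/v1/{endpoint}", params=params, timeout=30)
    response.raise_for_status()
    return response.json().get("data", [])

# Example: fetch up to 50 Page events for Google Gmail between 2021-07-07 and 2021-07-08 (UTC).
events = fetch_netskope("https://alliances.goskope.com", "c4*****a8", "page",
                        1625616000, 1625702400, "app eq 'Google Gmail'")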

Output

Raw Data

The primary response data from the API request.

D3 customizes the returned raw data by adding "isAlert" and "utcTime" fields.
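
The derivation of the two added fields can be approximated from the original record: the sketch below is an assumption about how they are computed (isAlert from the record's alert flag, utcTime from the epoch timestamp rendered in UTC), shown only to clarify how they relate to the sample data that follows.

CODE
from datetime import datetime, timezone

def add_d3_fields(record: dict) -> dict:
    """Add isAlert and utcTime fields to a raw Netskope record (assumed derivation)."""
    record["isAlert"] = record.get("alert") == "yes"
    record["utcTime"] = datetime.fromtimestamp(
        record["timestamp"], tz=timezone.utc
    ).strftime("%Y-%m-%dT%H:%M:%SZ")
    return record

print(add_d3_fields({"alert": "yes", "timestamp": 1628551373})["utcTime"])  # 2021-08-09T23:22:53Z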

SAMPLE DATA

JSON
{
  "status": "success",
  "msg": "",
  "data": [
      {
          "access_method": "API Connector",
          "type": "nspolicy",
          "category": "IaaS/PaaS",
          "alert": "yes",
          "alert_type": "Security Assessment",
          "activity": "Introspection Scan",
          "action": "alert",
          "app": "Google Cloud Platform",
          "instance_id": "***@example.com",
          "region_id": "us-central1",
          "region_name": "Council Bluffs, Iowa, USA",
          "sa_rule_id": -*****,
          "sa_rule_name": "Remote access: Ensure \"Block Project-wide SSH keys\" enabled for VM instances",
          "compliance_standards": [
              {
                  "id": -*****,
                  "standard": "CSA-CCM-3.0.1",
                  "section": "IAM",
                  "control": "04",
                  "reference_url": "https://cloudsecurityalliance.org/research/cloud-controls-matrix/",
                  "description": "Identity & Access Management: Policies and Procedures | Policies and procedures shall be established to store and manage identity information about every person who accesses IT infrastructure and to determine their level of access. Policies shall also be developed to control access to network resources based on user identity."
              }
          ],
          "asset_id": "projects/mateo-burillo-ns/zones/us-central1-a/instances/gke-promhub-default-pool-*****-*****",
          "resource_group": null,
          "sa_rule_severity": "High",
          "resource_category": "Compute",
          "object": "gke-promhub-default-pool-*****-*****",
          "object_type": "Compute Instance",
          "account_id": "mateo-burillo-ns",
          "account_name": "mateo-burillo-ns",
          "iaas_asset_tags": [
              {
                  "name": "goog-gke-node",
                  "value": ""
              }
          ],
          "asset_object_id": "*****",
          "iaas_remediated": "false",
          "user": "***@example.com",
          "timestamp": 1628551373,
          "policy_id": 1,
          "policy": "Sysdig GCP Asesment",
          "sa_profile_id": -*****,
          "sa_profile_name": "NIST CSF v1.1 (GCP)",
          "alert_name": "Remote access: Ensure \"Block Project-wide SSH keys\" enabled for VM instances",
          "os": "unknown",
          "device": "other",
          "browser": "unknown",
          "count": 1,
          "organization_unit": "",
          "userkey": "***@example.com",
          "ur_normalized": "***@example.com",
          "site": "Google Cloud Platform",
          "traffic_type": "CloudApp",
          "ccl": "excellent",
          "acked": "false",
          "_insertion_epoch_timestamp": 1628551674,
          "_id": "d7*****94",
          "cci": 93,
          "sa_rule_remediation": "\n\nUsing Console:\n1. Go to the VM instances page using https://console.cloud.google.com/compute/instances?. It will list all the instances from project\n2. Click on the name of the Impacted instance\n3. Click Edit in the toolbar\n4. Under SSH Keys, go to the Block project-wide SSH keys checkbox\n5. To block users with project-wide SSH keys from connecting to this instance, select\n      Block project-wide SSH keys\n6. click Save at the bottom of the page\n7. Repeat steps for every impacted Instance\n\nvia CLI gcloud:\nBlock project-wide public SSH keys, set the metadata value to TRUE:\ngcloud compute instances add-metadata [INSTANCE_NAME] --metadata block- project-ssh-keys=TRUE\n\nwhere [INSTANCE_NAME] is the name of the instance that you want to block project-wide public SSH keys.\n\nImpact:\nUsers already having Project-wide ssh key pairs and using third party SSH clients will lose access to the impacted Instances. For Project users using gcloud or GCP Console based SSH option, no manual key creation and distribution is required and will be handled by GCE (Google compute Engine) itself. To access Instance using third party SSH clients Instance specific SSH key pairs needs to be created and distributed to the required users.\n\nDefault Value:\nBy Default Block Project-wide SSH keys is not enabled.\n\n\n",
          "appcategory": "IaaS/PaaS",
          "isAlert": true,
          "utcTime": "2021-08-09T23:22:53Z"
      }
  ]
}
Context Data

The data extracted from Raw Data and converted into JSON format. Context Data may be identical to Raw Data in some cases.

D3 customizes the Context Data by extracting the data from path $.data in API returned JSON.

It is recommended to refer to the Raw Data instead of Context Data, since it contains the complete API response data. D3 will deprecate Context Data in the future, and playbook tasks using Context Data will be replaced with Raw Data.

SAMPLE DATA

CODE
[
      {
          "access_method": "API Connector",
          "type": "nspolicy",
          "category": "IaaS/PaaS",
          "alert": "yes",
          "alert_type": "Security Assessment",
          "activity": "Introspection Scan",
          "action": "alert",
          "app": "Google Cloud Platform",
          "instance_id": "***@example.com",
          "region_id": "us-central1",
          "region_name": "Council Bluffs, Iowa, USA",
          "sa_rule_id": -*****,
          "sa_rule_name": "Remote access: Ensure \"Block Project-wide SSH keys\" enabled for VM instances",
          "compliance_standards": [
              {
                  "id": -*****,
                  "standard": "CSA-CCM-3.0.1",
                  "section": "IAM",
                  "control": "04",
                  "reference_url": "https://cloudsecurityalliance.org/research/cloud-controls-matrix/",
                  "description": "Identity & Access Management: Policies and Procedures | Policies and procedures shall be established to store and manage identity information about every person who accesses IT infrastructure and to determine their level of access. Policies shall also be developed to control access to network resources based on user identity."
              }
          ],
          "asset_id": "projects/mateo-burillo-ns/zones/us-central1-a/instances/gke-promhub-default-pool-*****-*****",
          "resource_group": null,
          "sa_rule_severity": "High",
          "resource_category": "Compute",
          "object": "gke-promhub-default-pool-*****-*****",
          "object_type": "Compute Instance",
          "account_id": "mateo-burillo-ns",
          "account_name": "mateo-burillo-ns",
          "iaas_asset_tags": [
              {
                  "name": "goog-gke-node",
                  "value": ""
              }
          ],
          "asset_object_id": "*****",
          "iaas_remediated": "false",
          "user": "***@example.com",
          "timestamp": 1628551373,
          "policy_id": 1,
          "policy": "Sysdig GCP Asesment",
          "sa_profile_id": -*****,
          "sa_profile_name": "NIST CSF v1.1 (GCP)",
          "alert_name": "Remote access: Ensure \"Block Project-wide SSH keys\" enabled for VM instances",
          "os": "unknown",
          "device": "other",
          "browser": "unknown",
          "count": 1,
          "organization_unit": "",
          "userkey": "***@example.com",
          "ur_normalized": "***@example.com",
          "site": "Google Cloud Platform",
          "traffic_type": "CloudApp",
          "ccl": "excellent",
          "acked": "false",
          "_insertion_epoch_timestamp": 1628551674,
          "_id": "d7*****94",
          "cci": 93,
          "sa_rule_remediation": "\n\nUsing Console:\n1. Go to the VM instances page using https://console.cloud.google.com/compute/instances?. It will list all the instances from project\n2. Click on the name of the Impacted instance\n3. Click Edit in the toolbar\n4. Under SSH Keys, go to the Block project-wide SSH keys checkbox\n5. To block users with project-wide SSH keys from connecting to this instance, select\n      Block project-wide SSH keys\n6. click Save at the bottom of the page\n7. Repeat steps for every impacted Instance\n\nvia CLI gcloud:\nBlock project-wide public SSH keys, set the metadata value to TRUE:\ngcloud compute instances add-metadata [INSTANCE_NAME] --metadata block- project-ssh-keys=TRUE\n\nwhere [INSTANCE_NAME] is the name of the instance that you want to block project-wide public SSH keys.\n\nImpact:\nUsers already having Project-wide ssh key pairs and using third party SSH clients will lose access to the impacted Instances. For Project users using gcloud or GCP Console based SSH option, no manual key creation and distribution is required and will be handled by GCE (Google compute Engine) itself. To access Instance using third party SSH clients Instance specific SSH key pairs needs to be created and distributed to the required users.\n\nDefault Value:\nBy Default Block Project-wide SSH keys is not enabled.\n\n\n",
          "appcategory": "IaaS/PaaS",
          "isAlert": true,
          "utcTime": "2021-08-09T23:22:53Z"
      }
]
Key Fields

Common cyber security indicators such as unique IDs, file hash values, CVE numbers, IP addresses, etc., will be extracted from Raw Data as Key Fields.
The system stores these key fields in the path $.[playbookTask].outputData. You can use these key-value pairs as data points for playbook task inputs.

SAMPLE DATA

CODE
{
    "IDs": "\"[\\\"*****\\\"]\""
}
Return Data

Indicates one of the possible command execution states: Successful or Failed.

The Failed state can be triggered by any of the following errors:

  • A connection issue with the integration

  • The API returned an error message

  • No response from the API

You can view more details about an error in the Error tab.

Return Data can be passed down directly to a subsequent command or used to create conditional tasks in playbooks.

SAMPLE DATA

CODE
Successful
Result

Provides a brief summary of outputs in an HTML formatted table.

SAMPLE DATA

ACCESS_METHOD: API Connector
TYPE: nspolicy
CATEGORY: IaaS/PaaS
ALERT: yes
ALERT_TYPE: Security Assessment
ACTIVITY: Introspection Scan
ACTION: alert
APP: Google Cloud Platform
INSTANCE_ID: ***@example.com
REGION_ID: us-***
REGION_NAME: Council Bluffs, Iowa, USA
SA_RULE_ID: -***
SA_RULE_NAME: Remote access: Ensure "Block Project-wide SSH keys" enabled for VM instances
COMPLIANCE_STANDARDS:
ASSET_ID: projects/mateo-burillo-ns/zones/us-central1-a/instances/gke-promhub-default-pool-***-chcr
RESOURCE_GROUP:
SA_RULE_SEVERITY: High
RESOURCE_CATEGORY: Compute
OBJECT: gke-promhub-default-pool-***-chcr
OBJECT_TYPE: Compute Instance
ACCOUNT_ID: mateo-burillo-ns
ACCOUNT_NAME: mateo-burillo-ns
IAAS_ASSET_TAGS: [ { "name": "goog-gke-node", "value": "" } ]
ASSET_OBJECT_ID: ***
IAAS_REMEDIATED: false
USER: test@example.com
TIMESTAMP: 1628551373
POLICY_ID: 1
POLICY: Sysdig GCP Asesment
SA_PROFILE_ID: -***
SA_PROFILE_NAME: NIST CSF v1.1 (GCP)
ALERT_NAME: Remote access: Ensure "Block Project-wide SSH keys" enabled for VM instances
OS: unknown
DEVICE: other
BROWSER: unknown
COUNT: 1
ORGANIZATION_UNIT:
USERKEY: test@example.com
UR_NORMALIZED: test@example.com
SITE: Google Cloud Platform
TRAFFIC_TYPE: CloudApp
CCL: excellent
ACKED: false
_INSERTION_EPOCH_TIMESTAMP: 1628551674
_ID: d*****45a94
CCI: 93
SA_RULE_REMEDIATION:
APPCATEGORY: IaaS/PaaS
ISALERT: True
UTCTIME: 8/9/2021 11:22:53 PM

Fetch Event Field Mapping

Please note that Fetch Event commands require event field mapping. Field mapping plays a key role in the data normalization step of the event pipeline: it converts the original data fields from different providers into the D3 fields standardized by the D3 Model. Please refer to Event and Incident Intake Field Mapping for details.

If you require a custom field mapping, click + Add Field to add a custom field mapping. You may also remove built-in field mappings by clicking x. Please note that two underscore characters will automatically prefix the defined Field Name as the System Name for a custom field mapping. Additionally, if an input Field Name contains any spaces, they will automatically be replaced with underscores for the corresponding System Name.
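
As a small illustration of the naming rule just described (not D3's actual implementation), the System Name for a custom field mapping is the Field Name prefixed with two underscores, with any spaces replaced by underscores:

CODE
def to_system_name(field_name: str) -> str:
    """Derive the System Name for a custom field mapping, per the rule described above."""
    return "__" + field_name.replace(" ", "_")

print(to_system_name("Threat Score"))  # __Threat_Score (hypothetical custom field name)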

As a system integration, the Netskope integration ships with pre-configured field mappings that serve as the default field mapping.

  • Default Event Source
    The Default Event Source is the default set of field mappings that are applied when this fetch event command is executed. For out-of-the-box integrations, you will find a set of field mappings provided by the system. The default event source provides field mappings for common fields from fetched events. The default event source has a “Main Event JSON Path” (i.e., $.data) that is used to extract a batch of events from the response raw data. Click Edit Event Source to view the “Main Event JSON Path”.

    • Main Event JSON Path: $.data
      The Main Event JSON Path determines the root path where the system starts parsing raw response data into D3 event data. The JSON path begins with $, representing the root element. The path is formed by appending a sequence of child elements to $, each separated by a dot (.). Square brackets with nested quotation marks ([‘...’]) should be used to separate child elements in JSON arrays.
      For example, the root node of a JSON Path is data. The child node denoting the Unique Event Key field would be _id. Putting it together, the JSON Path expression to extract the Unique Event Key is $.data._id.
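
As a quick illustration of those paths against the Raw Data sample shown earlier, the sketch below uses plain dictionary access to stand in for the JSON path evaluation; the trimmed response string is only a placeholder.

CODE
import json

# A trimmed version of the Raw Data sample shown earlier.
raw_text = '{"status": "success", "msg": "", "data": [{"_id": "d7*****94", "type": "nspolicy"}]}'
raw = json.loads(raw_text)

events = raw["data"]        # Main Event JSON Path: $.data selects the batch of events
for event in events:
    print(event["_id"])     # Unique Event Key, i.e. $.data._id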

The pre-configured field mappings (Field Name: Source Field) are detailed below:

  • Unique Event Key: ._id
  • Event Type: .type
  • Alert Timestamp: .timestamp
  • Username: .user
  • Source IP address: .srcip
  • Destination IP address: .dstip
  • Device IP address: .userip
  • App: .app
  • Event category: .category
  • Aggregated / Correlated Event count: .count
  • URL: .url
  • Host FQDN: .domain
  • Hostname: .hostname
  • SourceTime: .src_time
  • Severity: .severity
  • Operating system: .os
  • File Hash MD5: .md5
  • Filesize: .file_size
  • object: .object
  • object category: .object_type
  • Destination port: .dstport
  • CloudConfidenceLevel: .ccl
  • CloudConfidenceIndex: .cci
  • SourceCountry: .src_country
  • DestinationCountry: .dst_country
  • SourceRegion: .src_region
  • DestinationRegion: .dst_region
  • EpochTimestamp: ._insertion_epoch_timestamp
  • Browser: .browser
  • SourceLocation: .src_location
  • DestinationLocation: .dst_location
  • Activity: .activity
  • Start Time: .utcTime

Error Handling

If the Return Data is Failed, an Error tab will appear in the Test Result window.

The Error tab contains the details returned by D3 SOAR or the third-party API call, including Failure Indicator, Status Code, and Message. These details can help you locate the root cause of a command failure.

  • Failure Indicator: Indicates the command failure that happened at a specific input and/or API call. Example: Fetch Event failed.

  • Status Code: The response code issued by the third-party API server or the D3 SOAR system, which can be used to locate the corresponding error category. For example, if the returned status code is 401, the selected connection is not authorized to run the command; the user or system support would need to check the permission settings in the Netskope portal. Refer to the HTTP Status Code Registry for details. Example: Status Code: 400.

  • Message: The raw data or captured key error message from the integration API server about the API request failure. Example: Message: The type \"Page\" of the alerts to filter by is not valid.

Error Sample Data

Fetch Event failed.

Status Code: 400.

Message: The type \"Page\" of the alerts to filter by is not valid.

Test Connection

Allows you to perform a health check on an integration connection. You can schedule a periodic health check by selecting Connection Health Check when editing an integration connection.

Input

N/A

Output

Return Data

Indicates one of the possible command execution states: Successful or Failed.

The Failed state can be triggered by any of the following errors:

  • A connection issue with the integration

  • The API returned an error message

  • No response from the API

You can view more details about an error in the Error tab.

SAMPLE DATA

CODE
Successful

Error Handling

If the Return Data is Failed, an Error tab will appear in the Test Result window.

The Error tab contains the details returned by D3 SOAR or the third-party API call, including Failure Indicator, Status Code, and Message. These details can help you locate the root cause of a command failure.

  • Failure Indicator: Indicates the command failure that happened at a specific input and/or API call. Example: Test Connection failed. Failed to check the connector.

  • Status Code: The response code issued by the third-party API server or the D3 SOAR system, which can be used to locate the corresponding error category. For example, if the returned status code is 401, the selected connection is not authorized to run the command; the user or system support would need to check the permission settings in the Netskope portal. Refer to the HTTP Status Code Registry for details. Example: Status Code: 403.

  • Message: The raw data or captured key error message from the integration API server about the API request failure. Example: Message: no Route matched with those values.

Error Sample Data

Test Connection failed. Failed to check the connector.

Status Code: 403.

Message: no Route matched with those values.
