DB 360, Datasource Management and Automating GrdAPI Calls

In a mature database security deployment, multiple security data sets pertaining to databases are collected and need to be merged. The DB360 application brings together data from agents, sessions, vulnerability assessments, and sensitive data scans. It lets you view all this data per server in a consolidated manner, view trending information and progress over time, and compare current status against a target goal.

To enable DB 360, go to Administration/Engines and enable the DB 360 engine. The DB 360 dashboard will be built automatically, as will the db360 data collection. The build takes a few hours; the schedule is controlled through the “Run Engine Every X hours” configuration parameter in the Machine Learning Engine configuration.

As part of the DB 360 engine, DNS lookups and reverse lookups are performed to normalize the data indexed across disparate data sources. For example, session data always includes IP addresses but only sometimes includes hostnames, while datasources can be defined using IP addresses, fully qualified domain names, or other hostnames. You should define VA and classifier scans consistently, using IP addresses or fully qualified names. DCAP Central will also try to normalize this data as part of the DB 360 application by doing lookups and reverse lookups - this requires the system on which DCAP Central is running to be properly configured for DNS resolution.

You can configure this dual lookup on any collection you have. To do so, insert a document into the address_resolution_requests collection in lmrm__sonarg. For example:

{
  "db" : "lmrm__ae",
  "col" : "db360",
  "ipField" : "Server_IP",
  "hostField" : "Server_Host_Name",
  "timestampField" : "Day",
  "hoursBack" : 24,
  "hoursNext" : 1,
  "nextRun" : ISODate("2016-07-26T13:15:09.550Z")
}

Specify the collection name and the database it is in, the fields holding the IPs and the hostnames, and an ISODate field (usually a timestamp). The timestamp field is used to limit the number of records processed (from hoursBack until now), and hoursNext determines how often the job runs.
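
For example, the request above can be registered directly from the mongo shell; this is a minimal sketch, assuming a shell connected to the DCAP Central repository:

// Minimal sketch (mongo shell): register the lookup job shown above.
use lmrm__sonarg
db.address_resolution_requests.insert({
    "db" : "lmrm__ae",                  // database holding the collection to normalize
    "col" : "db360",                    // collection to normalize
    "ipField" : "Server_IP",            // field containing IP addresses
    "hostField" : "Server_Host_Name",   // field containing hostnames
    "timestampField" : "Day",           // ISODate field used to window records
    "hoursBack" : 24,                   // process records from the last 24 hours
    "hoursNext" : 1,                    // run the job every hour
    "nextRun" : new Date()              // time of the first run
})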

DCAP Central also allows you to manage datasources centrally. Feeds from other systems (Guardium or others) may be managed through the ETL layer. To have Guardium datasource feeds update the managed datasources, add the following to the ETL config in /etc/sonar/sonargd.conf:

upsert:
  datasources:
    - Datasource Id

Editing of datasources may be done by exporting and importing the data as Excel. In the administration section, choose to edit datasources and select the search criteria (a regular expression that is matched against all fields of the datasources; leave it empty to get all datasources); you will get the datasource data as an Excel spreadsheet. You can modify values, add columns, or add new datasources. To add a new datasource, add a new row in the spreadsheet and leave the _id column empty. To delete a datasource, clear the contents of all cells in the row apart from the _id value. Do not change the _id values; any other value can be changed. Once finished, import the data back into DCAP Central. Note that each line in the spreadsheet causes an update, so it is better to upload a spreadsheet containing only the lines that have changed.

GrdAPI Push (Deprecated)

DCAP Central can be used to automate pushes to Guardium appliances using grdapi scripts. This includes updating groups, installing policies, etc. As an example, datasources managed within DCAP Central may be pushed to Guardium appliances using grdapi calls.

You can get a grdapi script for updating datasources from the DCAP Central home page by selecting Datasources from the Federated Data pull-down and clicking the Push button.

You can also automate this process through the dispatcher using an SSH push - the dispatcher runs on a scheduled basis, creates a script, and then uses SSH to run the grdapi calls on the CM. Each grdapi-based interface requires a metadata definition in lmrm__sonarg.lmrm__grdapi specifying the database and collection name from which the data is to be extracted, the grdapi to be invoked, and the mapping between fields. You can also add a custom pipeline for additional processing when needed. For example, the update datasource mapping looks like:

{
  "_id" : 1,
  "db" : "sonargd",
  "collection" : "datasources",
  "api" : "update_datasource_by_id",
  "pipeline" : "",
  "fields" : [
    {
      "from" : "id",
      "to" : "Datasource Id"
    },
    {
      "from" : "conProperty",
      "to" : "Connection Properties"
    },
    {
      "from" : "customURL",
      "to" : "Custom URL"
    },
    {
      "from" : "dbInstanceAccount",
      "to" : "DB Instance Account"
    },
    {
      "from" : "dbInstanceDirectory",
      "to" : "DB Instance Directory"
    },
    {
      "from" : "dbName",
      "to" : "Database Name"
    },
    ...
  ]
}
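
Each document in sonargd.datasources then yields one grdapi line, in which the "from" names become grdapi parameters and the values are taken from the mapped "to" fields. A generated line might look like the following (the values are illustrative, and the trailing parameters follow the elided mappings):

update_datasource_by_id id=20001 dbName="ORCL" ...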

If you need multiple grdapi calls to run from the same collection, use the optional seq field. For example, you could have these two records in lmrm__grdapi, one generating group members wrapped in % wildcards and one without:

{
  "_id" : ObjectId("58cbdc8e64fed4fba2e0ffb1"),
  "db" : "sonargd",
  "collection" : "LDAPData",
  "api" : "create_member_to_group_by_desc desc=\"Group1\" ",
  "pipeline" : "[{'$project':{_id:{$toString:'$_id'},cn: {$concat:['%','$cn','%']}}}]",
  "fields" : [
    {
      "from" : "member",
      "to" : "cn"
    }
  ],
  "seq" : "1"
}
{
  "_id" : ObjectId("58cbdcba64fed4fba2e0ffb3"),
  "db" : "sonargd",
  "collection" : "LDAPData",
  "api" : "create_member_to_group_by_desc desc=\"Group2\" ",
  "pipeline" : "",
  "fields" : [
    {
      "from" : "member",
      "to" : "cn"
    }
  ],
  "seq" : "2"
}
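
For a record whose cn value is jdoe (a hypothetical value), these two definitions generate lines such as:

create_member_to_group_by_desc desc="Group1" member="%jdoe%"
create_member_to_group_by_desc desc="Group2" member="jdoe"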

To add the item to the Federated Data pull-down on the main DCAP Central screen, add it to the managed_collections collection in the lmrm__sonarg database. When multiple grdapi records exist for the same collection, use matching seq values to tie each entry to its lmrm__grdapi record. For example:

{
  "_id" : ObjectId("58caa12fd697bec74d1e1c62"),
  "name" : "LDAP Users",
  "db" : "sonargd",
  "col" : "LDAPData"
}
{
  "_id" : ObjectId("58cbdcde64fed4fba2e0ffb4"),
  "name" : "LDAP Users1",
  "seq" : "1",
  "db" : "sonargd",
  "col" : "LDAPData"
}
{
  "_id" : ObjectId("58cbdce964fed4fba2e0ffb5"),
  "name" : "LDAP Users2",
  "seq" : "2",
  "db" : "sonargd",
  "col" : "LDAPData"
}

GrdAPI Through Dispatcher

Starting with version 3.0, DCAP Central has an easier-to-use mechanism for making grdapi calls that does not require setting up metadata.

GrdAPI push data can also be retrieved through the Gateway as a Web service using the grdapi output type. Any pipeline that projects a single field called grdapi may be used. Once you define the pipeline, you can invoke it from a scheduled job, making sure that output=grdapi is specified. For example:

https://<your host>:8443/Gateway?type=agg&output=grdapi&
  col=datasources&db=sonargd&overrideDB=sonargd&name=update_datasource_by_id
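
The pipeline itself simply projects the full grdapi line into a single field named grdapi. A minimal sketch of such a pipeline (the source field member and the group name are illustrative):

[
  { "$project" : {
      "_id" : 0,
      "grdapi" : { "$concat" : [
          "create_member_to_group_by_desc desc=\"Group1\" member=\"",
          "$member",
          "\""
      ] }
  } }
]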

Then, schedule your job and specify one or more targets in “Destinations to copy to”. These are named sections in dispatcher.conf; each section represents a target Guardium appliance (usually a CM) and has the following form:

[cm_name]
copy_host=<cm hostname>
copy_port=22
copy_username=<grdapi username>
copy_password=<grdapi pwd>
copy_keyfile=
copy_dest=
copy_type=ssh
copy_pre=<filename>
copy_post=<filename>
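
A filled-in section might look like the following (the section name, host, and credentials are illustrative; here copy_post points to a post script such as the one shown in the example below):

[cm_prod]
copy_host=cm1.example.com
copy_port=22
copy_username=grdapi_user
copy_password=grdapi_pwd
copy_keyfile=
copy_dest=
copy_type=ssh
copy_pre=
copy_post=/home/grdapi/post.api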

Example - Updating a Guardium Group for Trusted Connections

In this example, assume a Guardium group of tuples used in a policy to exclude records from service accounts and white-listed connections. These lists are generated by an automated DCAP Central process that usually includes customer-specific requirements. All new trusted connections are added to a collection called trusted_connections. An automated daily run extracts all new records and generates a script with grdapi calls that update the Guardium group (called -- Trusted Connections) and reinstall the policy.

To configure this scenario you need:

  • Create the pipeline that adds records to trusted_connections (this is usually implemented based on specific requirements or using the SAGE profiling engine)

  • Schedule a job that pulls any incremental records in trusted_connections and makes a script. The job is based on a standard pipeline that filters only records that have been added to trusted_connections since the last run. The scheduled job uses a URL with output=grdapi, telling the system to generate a grdapi script. In the scheduler, put a name in “Destinations to copy to” - this will be a section in dispatcher.conf that you configure to allow SSH-ing to the Guardium appliance.

  • Configure SSH credentials in dispatcher.conf

  • Add a pre or post script. For example, if the group is used from within a policy and DCAP Central will update the group, you might want a post section that reinstalls the policy, so in dispatcher.conf you might have:

    copy_post=/home/grdapi/post.api
    

    where post.api is:

    grdapi policy_install policy=ProductionPolicy
    
  • Finally, tell the system how to construct each row of the grdapi call by inserting a document into lmrm__sonarg.lmrm__grdapi:

    {
      "db" : "sonargd",
      "collection" : "trusted_connections",
      "api" : "create_member_to_group_by_desc desc=\"-- Trusted Connections\" ",
      "pipeline" :
      "[{'$project': {  _id: {$toString: '$_id'},cn:{$concat: ['$Client IP','+',
      '$Source Program','+','$DB User Name','+','$Server IP','+','$Service Name']}}}]",
      "fields" : [
        {
          "from" : "member",
          "to" : "cn"
        }
      ]
    }
    

That is, the pipeline reads all the new records from trusted_connections and makes a line of the form:

create_member_to_group_by_desc desc="-- Trusted Connections" member="1.1.1.1+SQLNET+SYSTEM+2.2.2.2+ORCL"
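
The scheduled job for this scenario uses a Gateway URL of the same form shown earlier, for example (the name parameter is illustrative):

https://<your host>:8443/Gateway?type=agg&output=grdapi&
  col=trusted_connections&db=sonargd&overrideDB=sonargd&name=trusted_connections_push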

NOTE: As of DCAP Central 3.0 this process is greatly simplified. In v3.0 you can create any pipeline with a projection (it should project a single field) and then schedule the pipeline using output=grdapi; there is no longer a need to create a document in lmrm__grdapi. This is the preferred approach since it requires less work and can also use built-in features such as $$LAST_DID.
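
Under the v3.0 approach, the trusted-connections scenario above reduces to a pipeline that projects the complete grdapi line into a single grdapi field. A minimal sketch, assuming the same trusted_connections field names as above:

[
  { "$project" : {
      "_id" : 0,
      "grdapi" : { "$concat" : [
          "create_member_to_group_by_desc desc=\"-- Trusted Connections\" member=\"",
          "$Client IP", "+", "$Source Program", "+", "$DB User Name", "+",
          "$Server IP", "+", "$Service Name",
          "\""
      ] }
  } }
]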

Datasource Password Encryption

When datasources are uploaded into DCAP Central their passwords can be encrypted. Encryption happens in-place - i.e. the uploaded cleartext password is removed and the ciphertext is kept under a different field name. Extracts of datasources to grdapi calls through the DCAP Central home page (Push button) or through the dispatcher’s use of the Gateway will replace the password with the cleartext version. Downloads and direct access, however, will not include the cleartext password.

To enable this mode, populate a metadata document in lmrm__sonarg:

db.lmrm__encrypted_fields.insert({_id: NumberInt(1), collection: "datasources", db: "sonargd", fields: [{from: "pwd", to: "Password"}]})

This tells the system that uploads to datasources that include a cleartext value in the pwd field should have the cleartext removed and the ciphertext saved in Password. Then, use the Password field in the grdapi mapping document:

db.lmrm__grdapi.update({_id: NumberInt(1)}, {
  _id: NumberInt(1),
  db: "sonargd",
  collection: "datasources",
  api: "update_datasource_by_id",
  pipeline: "",
  fields: [
    {from: "id", to: "Datasource Id"},
    {from: "conProperty", to: "Connection Properties"},
    {from: "customURL", to: "Custom URL"},
    {from: "dbInstanceAccount", to: "DB Instance Account"},
    {from: "dbInstanceDirectory", to: "DB Instance Directory"},
    {from: "dbName", to: "Database Name"},
    {from: "newName", to: "Datasource Name"},
    {from: "type", to: "Datasource Type"},
    {from: "port", to: "Port"},
    {from: "serviceName", to: "Service Name"},
    {from: "severity", to: "Severity"},
    {from: "shared", to: "Shared"},
    {from: "user", to: "User Name"},
    {from: "description", to: "Description"},
    {from: "host", to: "Host"},
    {from: "Password", to: "Password"}
  ]
}, {upsert: true})