Identity Governance is a new product from Micro Focus that lets you validate that your users, their accounts, and the permissions they hold are approved and in compliance. You can think of it as a way of proving that your Identity Management service is working properly.
Like any complex application, you need to be able to debug it, and Identity Governance allows you to enable different trace levels. Unfortunately, the meaning and utility of each log level is not well explained in the documentation. To try to remedy that, I am working on a Wiki page about the various log levels here:
https://wiki.microfocus.com/index.php/Identity_Governance
My hope is that others will find an issue, figure out which trace level to enable to troubleshoot it, and update the Wiki page so we can all share in the knowledge.
As part of that, I wanted to dissect some of the logs to identify what should be of interest.
In this case, I was configuring a new Account Collector connecting to a JDBC database. It was Microsoft SQL Server, and I started out with the MS SQL template, but it was not connecting. Why? Initially I thought it was firewall related, but we could telnet from the Identity Governance server to the expected port, so we knew it was not a connectivity issue.
A useful hint came from the support forums on a different issue I had seen: enable the com.netiq.daas class, which relates to the Directory As A Service (DaaS) web application, and see if that shows anything of interest. Up to then I was just getting a 486 error, with no real explanation, when I tried to test the connection.
So I edited the file /opt/netiq/idm/apps/tomcat/conf/daas-logging.xml and changed the com.netiq.daas level from INFO to ALL, since I had no idea what I was looking for yet. Once I have some good ideas, I will be able to tune it back down.
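For reference, the edit is just the logging level on the com.netiq.daas entry. Illustratively, assuming a logger element of this general shape (the exact element and attribute names in daas-logging.xml vary by version, so match this against your own file rather than taking it verbatim):

<logger name="com.netiq.daas" level="ALL"/>  <!-- previously level="INFO" -->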
First up, we saw this:
[FINE] 2018-08-22 08:31:00 com.netiq.daas.daaservice.ServiceProviderMap clean - [DAAS] Collection cleaner running...
[FINE] 2018-08-22 08:31:00 com.netiq.daas.nativejdbcservice.ConnectorFactory createConnector - [DAAS] Creating Microsoft SQLServer Specific Connector
[FINE] 2018-08-22 08:31:00 com.netiq.daas.nativejdbcservice.connector.DBConnector setDriverClassName - [DAAS] Driver Class Name com.microsoft.sqlserver.jdbc.SQLServerDriver
[FINE] 2018-08-22 08:31:00 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] setting connectionProperties
[FINE] 2018-08-22 08:31:00 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] Key :databaseName value 'Test' added to connectionProperties
[FINE] 2018-08-22 08:31:00 com.netiq.daas.common.SrvInstance - [DAAS] New service instance. TTL: 60
[FINE] 2018-08-22 08:31:00 com.netiq.daas.common.SrvInstance resetTimeout - [DAAS] Reset timeout for service instance to TTL: 60
[FINE] 2018-08-22 08:31:00 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] setting connectionProperties
[FINE] 2018-08-22 08:31:00 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] Key :password value '' added to connectionProperties
[FINE] 2018-08-22 08:31:00 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] Key :user value 'domain\username' added to connectionProperties
[FINEST] 2018-08-22 08:31:00 com.netiq.daas.nativejdbcservice.connector.DBConnector getConnectionProperties - [DAAS] accessor call to connectionProperties
[FINE] 2018-08-22 08:31:00 com.netiq.daas.nativejdbcservice.connector.DBConnector establishDBConnection - [DAAS] Caught SQLException: Login failed for user 'domain\username'. ClientConnectionId:00d9acdf-139d-4a9f-b8b0-baf0a0ad7d27
This still returned a 486 error, but the log is a bit more helpful: it looks like a username/password problem. We double checked with a SQL tool and could connect. It would have been nice if the UI returned the full error message, since that would have given me more information to work with.
I see that the username and password are called out, but, alas, the password is hidden. That is both good and bad. Good in that I would not want a password logged in clear text, but bad in that when I am trying to see whether a complex password is breaking things, seeing how it handles the password would help.
We thought the issue was that this database was set to Active Directory integrated authentication, so we did not have a database account; rather, a domain account has access to the databases.
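To rule out IG itself in a case like this, it can help to try the same credentials from a few lines of standalone JDBC. A minimal sketch (the host, database, and credentials are the placeholders used in this article, and the driver JAR must be on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;

public class LoginCheck {
    public static void main(String[] args) throws Exception {
        // Same URL style the collector used with the Microsoft driver.
        String url = "jdbc:sqlserver://ms-sql.acme.com:1433;databaseName=Test";
        // Modern JDBC drivers self-register, so no Class.forName() is needed.
        try (Connection con = DriverManager.getConnection(url, "domain\\username", "secret")) {
            System.out.println("Connected to: " + con.getCatalog());
        }
    }
}

If this throws the same "Login failed for user" SQLException seen in the trace above, the problem is the credentials or the authentication mode, not Identity Governance.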
Now, I knew that the open source jTDS JDBC driver was as good as or better than the Microsoft-provided JDBC driver. So I went to the SourceForge project and grabbed the JAR file:
http://jtds.sourceforge.net/
Then I recreated my collector with the JDBC (JTDS) Account template. Interestingly, once you create a collector you cannot change the template used, which is a shame.
The various database templates preset the needed values that differ between database types. For example, the Microsoft SQL Server template lets you specify an instance name (an issue I ran into when getting the main database connections started; the short answer is to put either the port or the instanceName value in the connect string, not both) and does not ask you for (nor let you specify) the class of the JDBC driver. This is helpful, since it saves you tracking down the proper value, as the examples below show.
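To illustrate the port-versus-instance point, the Microsoft driver accepts either of these forms (MYINSTANCE is a hypothetical instance name; resolving a named instance relies on the SQL Browser service):

jdbc:sqlserver://ms-sql.acme.com:1433;databaseName=Test
jdbc:sqlserver://ms-sql.acme.com;instanceName=MYINSTANCE;databaseName=Test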
There is a generic JDBC Template where you can specify any or all of the values for the connection.
The jTDS template wants you to provide a connect string, which I did as follows:
jdbc:jtds:sqlserver://ms-sql.acme.com:1433/Test;domain=acme
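(The general jTDS URL form, per the jTDS documentation, is jdbc:jtds:sqlserver://<server>[:<port>][/<database>][;<property>=<value>], so the /Test and ;domain=acme parts above are the database name and a connection property, respectively.)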
For contrast, the similar value for the MS driver's connect string would have been:
jdbc:sqlserver://ms-sql.acme.com:1433;databaseName=Test
In this case, the added property of domain=acme means we are authenticating the supplied username through the specified domain, which we had figured was the root of our connection problem. This time it worked! Our log looked like this:
[FINE] 2018-08-22 13:49:09 com.netiq.daas.daaservice.ServiceProviderMap clean - [DAAS] Collection cleaner running...
[FINE] 2018-08-22 13:49:09 com.netiq.daas.nativejdbcservice.ConnectorFactory createConnector - [DAAS] Creating GENERIC Connector, this can used to collect data from any data-base
[FINE] 2018-08-22 13:49:09 com.netiq.daas.nativejdbcservice.connector.DBConnector setDriverClassName - [DAAS] Driver Class Name net.sourceforge.jtds.jdbc.Driver
[FINE] 2018-08-22 13:49:09 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] setting connectionProperties
[FINE] 2018-08-22 13:49:09 com.netiq.daas.common.SrvInstance - [DAAS] New service instance. TTL: 60
[FINE] 2018-08-22 13:49:09 com.netiq.daas.common.SrvInstance resetTimeout - [DAAS] Reset timeout for service instance to TTL: 60
[FINE] 2018-08-22 13:49:09 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] setting connectionProperties
[FINE] 2018-08-22 13:49:09 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] Key :password value '' added to connectionProperties
[FINE] 2018-08-22 13:49:09 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] Key :user value 'username' added to connectionProperties
[FINEST] 2018-08-22 13:49:09 com.netiq.daas.nativejdbcservice.connector.DBConnector getConnectionProperties - [DAAS] accessor call to connectionProperties
[FINE] 2018-08-22 13:49:09 com.netiq.iac.persistence.dcs.dce.daas.DaaSService testConnection - [IG-SERVER] DaaS Service Test response code: 204
It would have been nicer if the trace showed the full connect string used, but it does not seem to log that information.
Once we got the connection working, we needed to start collecting data. One of the first things you need to do is define a SQL statement that returns the data to be collected. We used something like this:
select u.DisplayName, u.IDCode, org.Name AS OrgUnit, u.OccupationCode, u.active, u.ExpiryDt
from dbo.MyUser u
left outer join MyOrgUnit org on org.guid = u.orgunitguid
where u.active = 1
  and (u.ExpiryDt is NULL or u.ExpiryDt > getdate())
order by OrgUnit, u.OccupationCode, u.DisplayName
I used the Test Collection option (at the bottom of the Collector definition page) and chose to collect the first 5 rows before transformations are applied. This is actually a useful test tool: you can collect the first few rows before transformation and look at (or copy) the raw data, then do it again with transformations applied (things like the ECMA statements that process the data before storing it) and compare the two to see how well your transform worked.
There is actually a tool provided with the product, /opt/netiq/idm/apps/idgov/bin/transform-tester.sh, which is an X Windows based test tool for transforms. You can provide a file with the ECMA being run, then specify an input value (which you just got from the test collection), and work on it until it all works.
I had never watched a collection through the lens of the daas class logging, and thought it would be interesting to look at and dissect.
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.ServiceView - [DAAS] InputTransformsMap:
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.ServiceView - [DAAS] {}
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] SchemaMap:
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] m_appKeyMap:
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] {IDCode=[accountId, userLink, userId], DisplayName=[name], active=[disabled], ExpiryDt=[ext_accountExpires], OrgUnit=[ext_orgUnit-MyApp], OccupationCode=[ext_occupationCode-MyApp]}
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] m_viewKeyMap:
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] {accountId=IDCode, name=DisplayName, disabled=active, userLink=IDCode, userId=IDCode, ext_accountExpires=ExpiryDt, ext_orgUnit-MyApp=OrgUnit, ext_occupationCode-MyApp=OccupationCode}
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] m_staticValueMap:
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] {privileged=false, ext_colAccountSource=MyAppDB}
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] m_compositeMap:
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] {}
So it looks like the first thing it does is take the configuration I defined for my collector and group it into these maps.
The m_appKeyMap shows how it maps the data. I used IDCode in the returned data to populate three different fields, which you see in the JSON array: accountId, userLink, and userId. The rest are all single mappings of one data field to an IDG attribute.
Then we have the m_viewKeyMap, which I guess is more about what will be shown about the user. I think this suggests that the data is not stored once as raw data and then referenced from the various fields; rather, it is copied into the various data fields during collection. That is probably a simpler way of doing it, but it uses more disk space. Then again, disk space is cheap these days.
Then we have the m_staticValueMap, which shows the attributes that have static values assigned. This is interesting, and I guess a more efficient way of doing it. It also offers insight into a trick I figured out. For my testing I wanted to set the manager/owner/sponsor of a collected account or permission to be only my account, so I could run Reviews that affect only me. However, the fields where you map a data element to reference an Account or Identity, sort of dereferencing a DN-style reference, do not let you specify a static string.
But you can pick any attribute in the data, hit the {..} button on the right to get to the ECMA Transform view, and simply use the ECMA of:
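// hypothetical example: every row resolves to the account gcarman identifies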
outputvalue="gcarman";
And now all values resolve to the object that gcarman identifies. I knew this would be less efficient than other options, and this trace shows I was even more correct than I thought: the collector knows to precompute all the genuinely static values, while an ECMA transform has to run per row. The static map is actually rather clever.
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.SchemaMap trace - [DAAS] SchemaMap:
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.ServiceView getViewParms - [DAAS] Query Object: {"sql-query": "select u.DisplayName, u.IDCode, org.Name AS OrgUnit, u.OccupationCode,u.active,u.ExpiryDt from dbo.MyUser u left outer join MyOrgUnit org on org.guid = u.orgunitguid where u.active=1 and (u.ExpiryDt is NULL or u.ExpiryDt > getdate()) order by OrgUnit, u.OccupationCode, u.DisplayName"}
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.ServiceView buildCollectionQuery - [DAAS] service: com.netiq.daas.daaservice.util.Service@3d64c373, view: account
Here we get the SQL query reported raw in the trace, which is useful to see.
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.ServiceView buildCollectionQuery - [DAAS] view collectionQuery: {
"type": "find",
"command": {
"sql-query": "select u.DisplayName, u.IDCode, org.Name AS OrgUnit, u.OccupationCode,u.active,u.ExpiryDt from dbo.MyUser u left outer join MyOrgUnit org on org.guid = u.orgunitguid where u.active=1 and (u.ExpiryDt is NULL or u.ExpiryDt > getdate()) order by OrgUnit, u.OccupationCode, u.DisplayName",
"read-attrs": [
"accountId",
"name",
"disabled",
"userLink",
"userId",
"ext_accountExpires",
"ext_orgUnit-MyApp",
"ext_occupationCode-MyApp"
]
}
}
Now it picks apart the query and builds some JSON to send to the DaaS service to process. The "find" is clearly the query type. It sends a command with a JSON element of sql-query containing the query itself, then read-attrs with a JSON array of attributes to read. What is interesting is that read-attrs uses the IDG-side attribute names, not the native application names, and does not define their mapping. Clearly that info, which we saw above in the m_appKeyMap, is available to the query.
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.ServiceProviderMap clean - [DAAS] Collection cleaner running...
[FINE] 2018-08-22 13:59:46 com.netiq.daas.nativejdbcservice.ConnectorFactory createConnector - [DAAS] Creating GENERIC Connector, this can used to collect data from any data-base
[FINE] 2018-08-22 13:59:46 com.netiq.daas.nativejdbcservice.connector.DBConnector setDriverClassName - [DAAS] Driver Class Name net.sourceforge.jtds.jdbc.Driver
[FINE] 2018-08-22 13:59:46 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] setting connectionProperties
[FINE] 2018-08-22 13:59:46 com.netiq.daas.common.SrvInstance - [DAAS] New service instance. TTL: 60
[FINE] 2018-08-22 13:59:46 com.netiq.daas.daaservice.util.QueryProcessor chunkTransfer - [DAAS] calling preProcess...
[FINE] 2018-08-22 13:59:47 com.netiq.daas.daaservice.util.Transformation requestMap - [DAAS] Got REQUEST schema map...
I guess this part is where it gets the schema map we saw defined earlier, so that it can use the IDG-side attributes in the results. That makes a bit more sense now.
[FINE] 2018-08-22 13:59:47 com.netiq.daas.daaservice.util.Transformation requestMap - [DAAS] requestMap: in viewCommand: {
"sql-query": "select u.DisplayName, u.IDCode, org.Name AS OrgUnit, u.OccupationCode,u.active,u.ExpiryDt from dbo.MyUser u left outer join MyOrgUnit org on org.guid = u.orgunitguid where u.active=1 and (u.ExpiryDt is NULL or u.ExpiryDt > getdate()) order by OrgUnit, u.OccupationCode, u.DisplayName",
"read-attrs": [
"accountId",
"name",
"disabled",
"userLink",
"userId",
"ext_accountExpires",
"ext_orgUnit-MyApp",
"ext_occupationCode-MyApp"
]
}
I am not sure why this class repeats the query definition from above; it does not look like it remapped anything. Probably some task was performed and this shows its result, just with nothing relevant changed in my example.
[FINE] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] setting connectionProperties
[FINE] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] Key :password value '' added to connectionProperties
[FINE] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.connector.DBConnector setConnectionProperties - [DAAS] Key :user value 'username' added to connectionProperties
We saw these lines before when we tested the database connection.
[FINE] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector setCollectionCriteria - [DAAS] select u.DisplayName, u.IDCode, org.Name AS OrgUnit, u.OccupationCode,u.active,u.ExpiryDt from dbo.MyUser u left outer join MyOrgUnit org on org.guid = u.orgunitguid where u.active=1 and (u.ExpiryDt is NULL or u.ExpiryDt > getdate()) order by OrgUnit, u.OccupationCode, u.DisplayName
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.connector.DBConnector getConnectionProperties - [DAAS] accessor call to connectionProperties
[FINE] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector executeQuery - [DAAS] Execute query: select u.DisplayName, u.IDCode, org.Name AS OrgUnit, u.OccupationCode,u.active,u.ExpiryDt from dbo.MyUser u left outer join MyOrgUnit org on org.guid = u.orgunitguid where u.active=1 and (u.ExpiryDt is NULL or u.ExpiryDt > getdate()) order by OrgUnit, u.OccupationCode, u.DisplayName
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.NativeJDBCService executeJSONChunkRequest - [DAAS] Try to get data...
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: DisplayName, type: 12
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: IDCode, type: 12
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: OrgUnit, type: 12
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: OccupationCode, type: 12
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: active, type: -7
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: ExpiryDt, type: 93
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: DisplayName, type: 12
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: IDCode, type: 12
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: OrgUnit, type: 12
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: OccupationCode, type: 12
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: active, type: -7
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.JDBCCollector getColumnData - [DAAS] Column name to map: ExpiryDt, type: 93
Now, I trimmed this log, since it found about 783 entries. Even though I said I only wanted the first 5 for my test, it looks like it retrieved all of them and then showed me only the first 5. I expect this is related to an API issue: they do not want to append a row limit to the existing SQL. Any time you programmatically append your own clauses to user-supplied SQL, you are opening a support can of worms, since people do the darndest things in SQL statements. I trimmed the output down to two entries so you can see the basics.
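If you only want a few rows while testing, you can of course build the limit into the query yourself rather than waiting on the product to do it. In Transact-SQL that is a TOP clause, for example:

select TOP 5 u.DisplayName, u.IDCode from dbo.MyUser u order by u.DisplayName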
I am still trying to debug this, since I know I need to convert the ExpiryDt data from whatever the native format is to the proper format for IDG. When the time comes, I will look at how the eDirectory collector template handles Login Expiration Time; that will tell me the target format I am aiming for, and once I know the source format I can figure out an ECMA conversion between the two. If I cannot, I am sure the great Google will find me an example.
I notice that each column gets a 'type'. These turn out to be the java.sql.Types constants: 12 is VARCHAR (a string), -7 is BIT (effectively a boolean; negative 7 looks odd, but that really is the constant's value), and 93 is TIMESTAMP (which is NULL in the DB for these users).
Still, it would be nice if the trace named the types instead of just the numeric codes. Imagine if the issue was that DaaS was mapping a field to the wrong schema attribute type; knowing the proper types could reveal the issue.
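Since the codes come straight from java.sql.Types, a quick Java one-off can decode them. A minimal sketch using the standard JDBCType enum:

import java.sql.JDBCType;

public class TypeCodes {
    public static void main(String[] args) {
        // Decode the numeric type codes seen in the DaaS trace.
        for (int code : new int[] {12, -7, 93}) {
            System.out.println(code + " = " + JDBCType.valueOf(code));
        }
        // Prints: 12 = VARCHAR, -7 = BIT, 93 = TIMESTAMP
    }
}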
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.NativeJDBCService executeJSONChunkRequest - [DAAS] Back from batch get...
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.NativeJDBCService executeJSONChunkRequest - [DAAS] All results obtained. Release Collector
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.daaservice.util.Transformation postProcess - [DAAS] postProcess result: {
"STATUS": "success",
"STATUS_TEXT": "unavailable",
"SIZE": 783,
"Results": [
{
"DisplayName": "Config, SCM",
"IDCode": "SCMConfig",
"active": true
},
{
"DisplayName": "JOHN SMITH",
"OccupationCode": "PC",
"active": true
}
]
}
[FINEST] 2018-08-22 13:59:47 com.netiq.daas.daaservice.util.Transformation postProcess - [DAAS] doTransform flag = false. No data transforms executed.
[INFO] 2018-08-22 13:59:47 com.netiq.daas.nativejdbcservice.NativeJDBCService shutdown - [DAAS] Received service shutdown from DaaS
Here we see the results rendered as JSON.
Of interest is the STATUS of success, but oddly a STATUS_TEXT of unavailable. I imagine some data sources return a status code, perhaps like LDAP (80 for a failed connection), possibly with some text, and it tries to return both the error code and any text explanation of the error. Clever.
Then we see the SIZE, which shows how many 'rows' or entries were returned. As noted above, it is annoying that getting the first five requires retrieving them all.
Then comes the Results array. I notice that empty fields are omitted.
Finally, we see that the doTransform flag is false (since my test was without transformations), so no data transforms were applied. I look forward to seeing what the data transforms look like in this view the next time I get a chance to run it!
Anyway, I hope that was of general interest in seeing how IDG logs information when doing collections. I expect I will write many more articles like this, as there is so much to trace and dissect in this product.