
NetIQ Access Manager Real Time Analytics Using AWS Kinesis Service


Public cloud deployment is supported by Access Manager version 4.4 SP1 or later, and this capability can be used to integrate Access Manager with the managed services offered by public cloud vendors.

Here is one such example of leveraging AWS managed services to build an analytics solution for Access Manager deployed on AWS.

This article covers the use cases, the architecture, a sample scenario, and how to analyze and visualize Access Manager audit events in a Kibana analytics dashboard with the help of various AWS managed services.

1. Use Cases



    • For IT decision makers – to know the total number of unique users and the usage of the web resources protected by Access Manager.

 

    • For IT/security administrators – to understand the overall access pattern of Access Manager services, so that the administrator is aware of the overall request load.

 

    • And many more.



2. Architecture



Figure 1: Architecture diagram



The architecture diagram in Figure 1 shows how the Access Manager event logs are analyzed and visualized in a Kibana analytics dashboard using AWS managed services.

3. List of AWS Managed Services Used



    1. Amazon Elastic Compute Cloud (Amazon EC2) to deploy Access Manager with the Kinesis Agent in AWS. The Kinesis Agent reliably gathers logs and events and streams them to the Kinesis Data Streams service.

 

    2. Amazon Kinesis to collect, process, and analyze the real-time data.

 

    3. Amazon Elasticsearch Service for data visualization using Kibana.



4. Sample Scenario



    1. Consider a scenario where you need to create a dashboard to analyze user-authentication data, such as how many unique users logged in and which authentication mechanism is used most. For this you can use Access Manager's login-related audit events. For every user authentication, Access Manager's Identity Server component collects various data, triggers an audit event, and sends this data to the Admin Console syslog server.

 

    2. The Kinesis Agent running on the Admin Console ingests this data and sends it to the Kinesis data stream.

 

    3. The Kinesis data stream continuously captures the data sent by the agent and stores it for processing.

 

    4. Kinesis Data Analytics processes the data stream in real time with standard SQL. This layer takes care of everything required to run the query continuously, including auto scaling. The processed data stream is sent to the Kinesis delivery stream, which is Kinesis Data Firehose.

 

    5. Kinesis Data Firehose captures, transforms, and loads the data into Amazon S3 and Elasticsearch for data visualization using Kibana.



How to configure Access Manager and the various AWS managed services for this sample scenario is described in the sections below.

 

5. Building Access Manager Real-Time Analytics using AWS managed services


There are three major steps involved in creating this data visualization in AWS.

5.1. Deploy and Configure Access Manager on AWS EC2 service



    1. Deploy Access Manager with the Kinesis Agent on AWS EC2. In this setup, two Red Hat instances were created on AWS EC2, one for the Admin Console and another for the Identity Server; the steps below are based on this setup.

        1. To deploy Access Manager on AWS EC2, refer to https://www.netiq.com/documentation/access-manager-44/install_upgrade/data/nam-on-cloud.html

        2. On the Admin Console EC2 instance, install the Kinesis Agent. For details, refer to https://docs.aws.amazon.com/firehose/latest/dev/writing-with-agents.html

        3. Configure the Kinesis Agent:

            1. SSH to the Admin Console EC2 instance and configure /etc/aws-kinesis/agent.json as shown below (a text sketch of the configuration follows Figure 2). This setup uses the Admin Console's syslog server, and the audit events are collected in the NAM_audits.log file. If you are using a different syslog server, configure the filePattern parameter accordingly.

              Figure 2: Configuring Kinesis Agent
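
              Since Figure 2 is a screenshot, here is a minimal text sketch of agent.json along the same lines. This is an illustrative example, not the exact file from the article: the stream name and log path are taken from the values used elsewhere in this setup.

                # Sketch: write a minimal Kinesis Agent configuration
                # (stream name "nam-events" and log path assumed from this setup)
                cat > /etc/aws-kinesis/agent.json <<'EOF'
                {
                    "flows": [
                        {
                            "filePattern": "/var/log/NAM_audits.log",
                            "kinesisStream": "nam-events",
                            "partitionKeyOption": "RANDOM"
                        }
                    ]
                }
                EOF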



            2. Create the Kinesis stream (refer to section 5.2.1).

            3. Edit /etc/Auditlogging.cfg on the Admin Console system and make the following changes:
              LOGDEST=console  to  LOGDEST=syslog
              FORMAT=json  to  FORMAT=CSV

            4. Disable SELinux: edit the SELinux config file (/etc/selinux/config) and set SELINUX=disabled. Note that this requires a restart of the Admin Console server.

            5. Change the template format as given below in /etc/rsyslog.d/nam.conf on the Admin Console machine:
              $template ForwardFormat,"%syslogtag:1:32%%msg:::sp-if-no-1st-sp%%msg%\n"
              local0.*   -/var/log/NAM_audits.log;ForwardFormat

            6. Restart the Kinesis Agent, rsyslog, the Admin Console Tomcat server, and the Identity Server Tomcat server (a verification tip follows this list):
              service aws-kinesis-agent restart
              systemctl restart rsyslog
              /etc/init.d/novell-ac restart
              /etc/init.d/novell-idp restart







    2. Log in to the AM Admin Console in a browser:

        1. Enable the audit events that you want to analyze and visualize. For the sample scenario described here, the "Login consumed" audit event was enabled for the Identity Server. You can also configure other IDP and AG events based on your requirements.

        2. Configure the syslog server.





NOTE: The AM Admin Console syslog server is used for collecting the event logs, so the Admin Console IP address and syslog server port were configured here. If a different syslog server is used, configure the details accordingly.



Figure 3: Configure syslog server



5.2. Amazon Kinesis Data Analytics


 

Kinesis Data Analytics applications continuously read and process data from streaming sources in real time. The application uses SQL to process the incoming data stream and produce output. A Kinesis Data Analytics application performs three major operations: it reads from a streaming source (the Kinesis data stream), processes the data with SQL code, and writes the results to a destination (the Kinesis delivery stream).





Section 5.2.1 below describes how to configure these three major operations of the Kinesis Data Analytics application using the AWS console.

5.2.1. Creating Kinesis Data Analytics using AWS Management Console


 

First, create a new Kinesis Data Analytics application in the AWS Management Console by choosing Kinesis Service -> Data Analytics -> Create application and specifying the application name "nam-analytics". Once the application "nam-analytics" is created successfully, configure the three major operations: 1. the Kinesis data stream, 2. the SQL code, and 3. the Kinesis delivery stream.
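
If you prefer scripting over the console, the empty application shell can also be created with the AWS CLI; this is a minimal sketch, and the stream connections and SQL code are still configured afterwards as described below.

    # Sketch: create an empty Kinesis Data Analytics (SQL) application
    aws kinesisanalytics create-application --application-name nam-analytics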

    1. Configure Kinesis Data Stream:



Click Connect Streaming Data -> create a new Kinesis stream called "nam-events" as shown in Figure 4 -> click Discover schema -> Save and continue. This completes connecting the streaming data.



NOTE: The Kinesis stream name must be the same as the one configured earlier in the Kinesis Agent configuration (agent.json) file. The number of shards was set to 1 in this setup; configure the number of shards based on your requirements. A CLI alternative is sketched after Figure 4.




Figure 4: Create a new Kinesis stream
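
As an alternative to the console, the "nam-events" stream can also be created with the AWS CLI; this is a sketch using the single-shard configuration from this setup.

    # Sketch: create the stream with one shard, then confirm it becomes ACTIVE
    aws kinesis create-stream --stream-name nam-events --shard-count 1
    aws kinesis describe-stream-summary --stream-name nam-events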



 

    2. SQL Code:



Write a SQL query to filter the Login consumed event data in the Kinesis stream. To create the SQL query, click "Go to SQL editor" in the application, use the SQL query below, and click "Save and run SQL".



NOTE: Before configuring the SQL code, a script was run that logged multiple users in and out several times, so that events were flowing. When you save and run the SQL query, you can see the results in the Real-time analytics tab.



The value 002E000A in the SQL query below is the event ID for the Login consumed event, which was enabled earlier on the Admin Console's Identity Server cluster auditing and logging page. If you have configured another event to analyze, use the corresponding event ID. The event ID can be found in the NIDP log file of the Identity Server.




CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" (eventId VARCHAR(128), stringValue2 VARCHAR(256), message VARCHAR(128));

CREATE OR REPLACE PUMP "STREAM_PUMP" AS INSERT INTO "DESTINATION_SQL_STREAM"
    SELECT "eventId", "stringValue2", "message" FROM "SOURCE_SQL_STREAM_001" WHERE "eventId" LIKE '2E000A%';



The SQL editor has the following tabs:

 

The Source data tab shows an in-application input stream that is mapped to the streaming source. Choose the in-application stream, and you can see data coming in.

 

The Real-time analytics tab shows all the other in-application streams created by your application code.

 

The Destination tab shows the external destination where Kinesis Data Analytics writes the query results. You haven't configured any external destination for your application output yet.




    3. Creating Delivery Stream



Go to Data Analytics -> choose the Application details button -> click Connect to delivery stream -> create a new delivery stream (Kinesis Firehose). For creating a delivery stream, refer to section 6.1, Creating a Kinesis Delivery Stream.

 

After the delivery stream is created successfully, connect it to the application as shown below.




Figure 5: Creating Delivery Stream



Figure 6: Creating Delivery Stream



Now the delivery stream contains the processed data, and it is ready for creating visualizations in the Kibana dashboard.

 

5.3. Data visualization using Kibana



    1. Access the Kibana dashboard. The Kibana URL can be found in the Elasticsearch domain "nam-elasticsearch-domain" created earlier.

      Figure 7: Elasticsearch Domain


 

    2. In Kibana, choose the Management tab -> click "Index patterns" -> Create new index pattern -> search for "realtim*". Here, realtime is the Amazon Elasticsearch Service destination index that was configured earlier while creating the Kinesis Firehose delivery stream, so the search results show an index named realtimev1-2018-w51 (where w51 is the week number).

      Figure 8: Create new index pattern

 

    3. Once the index pattern is created, create a visualization.

      Figure 9: Create a visualization

 

    4. Create a vertical bar chart using the index pattern created earlier.

      Figure 10: Create a vertical bar chart


 

    5. Click "realtime*" and configure the x-axis:



      Figure 11: Configure the x-axis



      Select "Term" as the aggregation and choose the "STRINGVALUE2.keyword" field (it currently holds the name of the contract executed by Access Manager).

 

    6. Similarly, other visualizations can be created.


       Figure 12


 

    7. Once all visualizations are created, create the dashboard.

      Figure 13: Creating Dashboard

 

    8. Click "Add", then Add New Visualization, select the visualization to add to the dashboard, and then click the Save button at the top.

      Figure 14: Add New Visualization


       Figure 15: Add new visualization



      The final Kibana dashboard will look like this:

      Figure 16: Final Dashboard



 

6. Additional Information


 

This section contains detailed steps for creating a Kinesis delivery stream and an Amazon Elasticsearch Service domain.

 

6.1. Creating a Kinesis Delivery Stream


 

Step 1: Name the delivery stream "nam-firehose" and choose the "nam-events" data stream as the source.

Figure 17: Step 1



Step 2: Process records: leave the default configuration and click Next.

Step 3: Choose "Amazon Elasticsearch Service" as the destination, create a new domain called "nam-elasticsearch-domain", and configure the index, index rotation, and S3 backup. Note that the index configured here is used in the data visualization. For creating a new Elasticsearch domain, refer to section 6.2, Creating Amazon Elasticsearch Service.

Figure 18: Step 3



Figure 19: Step 3 (continued)



Configuring S3 backup:

Select an S3 bucket to store the data. Create a new S3 bucket if you don't have one.

Figure 20: Step 3 (continued)



Step 4: Configure settings: configure the buffer conditions and the IAM role.

Figure 21: Step 4



Figure 22: Step 4 (continued)



Create a new IAM role or choose an existing one.

Figure 23: Step 4 (continued)



Step 5: Review and create the delivery stream.

This completes creating Kinesis Firehose delivery stream.
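
For reference, the console steps above can be approximated with the AWS CLI. This is only a sketch: REGION, ACCOUNT_ID, the IAM role, and the backup bucket are placeholders, and the index name and weekly rotation are assumed to match the values used in section 5.3.

    # Sketch: destination configuration for the delivery stream
    # (every ARN below is a placeholder; adjust to your account)
    cat > es-destination.json <<'EOF'
    {
        "RoleARN": "arn:aws:iam::ACCOUNT_ID:role/firehose-role",
        "DomainARN": "arn:aws:es:REGION:ACCOUNT_ID:domain/nam-elasticsearch-domain",
        "IndexName": "realtimev1",
        "TypeName": "logs",
        "IndexRotationPeriod": "OneWeek",
        "S3BackupMode": "AllDocuments",
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::ACCOUNT_ID:role/firehose-role",
            "BucketARN": "arn:aws:s3:::your-backup-bucket"
        }
    }
    EOF

    # Sketch: create the delivery stream with the nam-events stream as its source
    aws firehose create-delivery-stream \
        --delivery-stream-name nam-firehose \
        --delivery-stream-type KinesisStreamAsSource \
        --kinesis-stream-source-configuration KinesisStreamARN=arn:aws:kinesis:REGION:ACCOUNT_ID:stream/nam-events,RoleARN=arn:aws:iam::ACCOUNT_ID:role/firehose-role \
        --elasticsearch-destination-configuration file://es-destination.json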

6.2. Creating Amazon Elasticsearch Service


 

Create a new Elasticsearch domain called "nam-elasticsearch-domain" as per the sample domain cluster configuration shown below.

NOTE: The VPC network configuration is recommended; in this setup, public access was configured. Refer to the following link to configure a VPC for Elasticsearch: https://docs.aws.amazon.com/elasticsearch-service/latest/developerguide/es-vpc.html
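
The domain can likewise be created from the CLI; in this sketch, the Elasticsearch version, instance type, and volume size are placeholders to adjust to your needs.

    # Sketch: single-node Elasticsearch domain (all sizing values are placeholders)
    aws es create-elasticsearch-domain \
        --domain-name nam-elasticsearch-domain \
        --elasticsearch-version 6.3 \
        --elasticsearch-cluster-config InstanceType=m4.large.elasticsearch,InstanceCount=1 \
        --ebs-options EBSEnabled=true,VolumeType=gp2,VolumeSize=10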







Review the information and choose Confirm to create the domain. Creating a new domain might take some time. Once the domain is created successfully, you will see the screen below.



NOTE: To access the Kibana dashboard, use the link given in the Elasticsearch domain that you created.


 
