Access Manager 4.4 SP1 and later support public cloud deployment, and this capability can be used to integrate Access Manager with the managed services offered by public cloud vendors.
This article presents one such example: leveraging AWS managed services to build an analytics solution for an Access Manager deployment running on AWS.
It covers the use cases, the architecture, a sample scenario, and how to analyze and visualize the Access Manager audit events in a Kibana analytics dashboard with the help of various AWS managed services.
The architecture diagram in Figure 1 shows how the Access Manager event logs are analyzed and visualized in the Kibana analytics dashboard using AWS managed services.
The sections below describe how to configure AM and the various AWS managed services for this sample scenario.
There are three major steps involved in creating this data visualization in AWS.
Change LOGDEST=console to LOGDEST=syslog
Change FORMAT=json to FORMAT=CSV
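Both changes can be applied in one step with sed, as sketched below; the path /etc/Auditlogging.cfg is an assumption based on typical Identity Server installs, so verify the audit configuration file location in your deployment first.

# Switch audit output from console/JSON to syslog/CSV
# (the file path is an assumption; confirm it before running)
sed -i 's/LOGDEST=console/LOGDEST=syslog/; s/FORMAT=json/FORMAT=CSV/' /etc/Auditlogging.cfg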
$template ForwardFormat,"%syslogtag:1:32%%msg:::sp-if-no-1st-sp%%msg%\n"
local0.* -/var/log/NAM_audits.log;ForwardFormat
service aws-kinesis-agent restart
systemctl restart rsyslog
/etc/init.d/novell-ac restart
/etc/init.d/novell-idp restart
NOTE: The AM Administration Console syslog server is used for collecting the event logs, so the Administration Console IP address and syslog server port are configured here. If a different syslog server is used, configure its details accordingly.
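For reference, a minimal Kinesis Agent configuration might look like the following sketch. It forwards the audit log file produced above into the "nam-events" stream used later in this article; the endpoint region is an assumption, so set it to your own.

# Minimal /etc/aws-kinesis/agent.json sketch (the region is an assumption)
cat <<'EOF' > /etc/aws-kinesis/agent.json
{
  "kinesis.endpoint": "kinesis.us-east-1.amazonaws.com",
  "flows": [
    {
      "filePattern": "/var/log/NAM_audits.log",
      "kinesisStream": "nam-events"
    }
  ]
}
EOF
service aws-kinesis-agent restart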
Kinesis Data Analytics applications continuously read and process data from streaming sources in real time. This application uses SQL to process the incoming data stream and produce the output. The Kinesis Data Analytics application performs three major operations.
Section 5.2.1 below describes how to configure these three major operations of the Kinesis Data Analytics application using the AWS console.
First, create a new Kinesis Data Analytics application in the AWS Management Console: choose Kinesis Service -> Data Analytics -> Create application, and specify the application name "nam-analytics". Once the "nam-analytics" application has been created successfully, configure the three major operations: 1. connect a Kinesis data stream, 2. add the SQL code, 3. connect a Kinesis delivery stream.
Click on Connect Streaming Data -> create a new Kinesis stream called "nam-events" as shown in Figure 4 -> click Discover schema -> Save and continue. This completes connecting the streaming data.
NOTE: The Kinesis stream name must be the same as the one configured earlier in the Kinesis Agent configuration file (agent.json). The number of shards is set to 1 here; configure the number of shards based on your requirements.
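If you prefer to script this step, the same stream can be created with the AWS CLI, as in the sketch below (one shard, matching the note above):

# Create the stream that the Kinesis Agent writes to
aws kinesis create-stream --stream-name nam-events --shard-count 1
# Wait until the stream becomes ACTIVE before connecting it to the application
aws kinesis wait stream-exists --stream-name nam-events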
Write a SQL query to filter the Login Consumed event data in the Kinesis stream. To create the SQL query, click "Go to SQL editor" in the application, use the SQL query below, and click "Save and run SQL".
NOTE: Before configuring the SQL code, a script was created and run that logged multiple users in and out many times, so that live data was flowing through the stream. When you save and run the SQL query, you can see the results in the Real-time analytics tab.
The value 002E000A in the SQL query below is the event ID for the Login Consumed event, which was enabled earlier on the Admin Console IDP cluster Auditing and Logging page. If you have configured a different event to analyze, use the corresponding event ID. The event ID can be found in the NIDP log file of the Identity Server.
-- Output stream that Kinesis Data Analytics delivers to the configured destination
CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" (
    eventId      VARCHAR(128),
    stringValue2 VARCHAR(256),
    message      VARCHAR(128)
);
-- Pump that continuously copies matching rows from the source stream;
-- the LIKE pattern selects the Login Consumed event (ID 002E000A)
CREATE OR REPLACE PUMP "STREAM_PUMP" AS INSERT INTO "DESTINATION_SQL_STREAM"
SELECT "eventId", "stringValue2", "message"
FROM "SOURCE_SQL_STREAM_001"
WHERE "eventId" LIKE '2E000A%';
The SQL editor has the following tabs:
The Source data tab shows an in-application input stream that is mapped to the streaming source. Choose the in-application stream, and you can see data coming in.
The Real-time analytics tab shows all the other in-application streams created by your application code.
The Destination tab shows the external destination where Kinesis Data Analytics writes the query results. You haven't configured any external destination for your application output yet.
Go to Data Analytics -> choose the Application details button -> click Connect to delivery stream -> create a new delivery stream (Kinesis Data Firehose). To create the delivery stream, refer to section 6.1, Creating a Kinesis Delivery Stream.
After the delivery stream has been created successfully, connect it to the application as shown below.
Now the delivery stream contains the processed data, and it is ready for creating the visualization in the Kibana dashboard.
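The same connection can also be made from the AWS CLI, as in the sketch below; the ARNs, role name, and version ID are placeholders that must be replaced with your own values.

# Look up the current application version first (it changes with every update)
aws kinesisanalytics describe-application --application-name nam-analytics

# Attach DESTINATION_SQL_STREAM to the Firehose delivery stream (placeholders throughout)
aws kinesisanalytics add-application-output \
    --application-name nam-analytics \
    --current-application-version-id 1 \
    --application-output '{
        "Name": "DESTINATION_SQL_STREAM",
        "KinesisFirehoseOutput": {
            "ResourceARN": "arn:aws:firehose:<region>:<account-id>:deliverystream/nam-firehose",
            "RoleARN": "arn:aws:iam::<account-id>:role/<analytics-output-role>"
        },
        "DestinationSchema": { "RecordFormatType": "JSON" }
    }'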
This section contains detailed steps for creating a Kinesis delivery stream and an Amazon Elasticsearch Service domain.
Step 1: Name the delivery stream "nam-firehose" and choose the "nam-events" data stream as the source.
Step 2: Process records: keep the default configuration and click Next.
Step 3: Choose "Amazon Elasticsearch Service" as the destination, create a new domain called "nam-elasticsearch-domain", and configure the index, index rotation, and S3 backup. Note that the index configured here is used in the data visualization. To create the new Elasticsearch domain, refer to section 6.2, Creating Amazon Elasticsearch Service.
Configuring S3 backup:
Select an S3 bucket to store the data. Create a new S3 bucket if you don't have one.
Step 4: Configure settings: configure the buffer conditions and the IAM role.
Create a new IAM role or choose an existing one.
Step 5: Review the settings and create the delivery stream.
This completes the creation of the Kinesis Data Firehose delivery stream.
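Steps 1 through 5 can also be scripted with the AWS CLI, as in the sketch below; every ARN, the index name, and the backup bucket are placeholders or assumptions to adapt to your account.

# Create the Firehose delivery stream with the Kinesis stream as its source
# (all ARNs, the role, the index name, and the bucket are placeholders)
aws firehose create-delivery-stream \
    --delivery-stream-name nam-firehose \
    --delivery-stream-type KinesisStreamAsSource \
    --kinesis-stream-source-configuration KinesisStreamARN=arn:aws:kinesis:<region>:<account-id>:stream/nam-events,RoleARN=arn:aws:iam::<account-id>:role/<firehose-role> \
    --elasticsearch-destination-configuration RoleARN=arn:aws:iam::<account-id>:role/<firehose-role>,DomainARN=arn:aws:es:<region>:<account-id>:domain/nam-elasticsearch-domain,IndexName=nam-index,IndexRotationPeriod=OneDay,S3BackupMode=FailedDocumentsOnly,S3Configuration="{RoleARN=arn:aws:iam::<account-id>:role/<firehose-role>,BucketARN=arn:aws:s3:::<backup-bucket>}"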
Create a new Elasticsearch domain called "nam-elasticsearch-domain" using the sample domain cluster configuration shown below.
NOTE: The VPC network configuration is recommended, but Public access has been configured here. Refer to the following link to configure a VPC for Elasticsearch: https://docs.aws.amazon.com/elasticsearch-service/latest/developerguide/es-vpc.html
Review the information and choose Confirm to create the domain. It can take some time to create a new domain. Once the domain has been created successfully, you will see the screen below.
NOTE: To access the Kibana dashboard, use the link provided in the Elasticsearch domain that you created.
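For repeatable setups, the domain can also be created from the CLI, as in the sketch below; the Elasticsearch version, instance type, and volume size are assumptions to adjust.

# Small public-access test domain (version and sizing values are assumptions)
aws es create-elasticsearch-domain \
    --domain-name nam-elasticsearch-domain \
    --elasticsearch-version 7.1 \
    --elasticsearch-cluster-config InstanceType=m4.large.elasticsearch,InstanceCount=1 \
    --ebs-options EBSEnabled=true,VolumeType=gp2,VolumeSize=10

# Check progress; the Kibana link appears in the domain overview once it is active
aws es describe-elasticsearch-domain --domain-name nam-elasticsearch-domain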