Use DAS admin service to query using Spark SQL
  1. Introduction

    WSO2 DAS is the WSO2 analytics platform: events are published to DAS, which summarizes them for your environment. This document illustrates how to use the DAS admin service to retrieve data by running a Spark SQL query on WSO2 DAS. The common ways of retrieving data from DAS are the REST API or a summary store such as an RDBMS. The main advantages of the admin-service approach are that it is efficient and that it supports complex Spark queries that the Apache Lucene based REST search does not. It also needs no extra datasource such as an RDBMS, because data can be retrieved directly from the DAS internal tables.

  2. First, create a new Maven project and add the following dependencies to the project.
    <dependencies>
    	<dependency>
    		<groupId>org.wso2.carbon.analytics</groupId>
    		<artifactId>org.wso2.carbon.analytics.spark.stub</artifactId>
    		<version>${carbon.analytics.version}</version>
    	</dependency>
    	<dependency>
    		<groupId>org.json</groupId>
    		<artifactId>json</artifactId>
    		<version>20160212</version>
    	</dependency>
    	<dependency>
    		<groupId>com.google.code.gson</groupId>
    		<artifactId>gson</artifactId>
    		<version>2.2.4</version>
    	</dependency>
    </dependencies>
    <properties>
    	<carbon.analytics.version>1.0.6-alpha3</carbon.analytics.version>
    </properties>
    <repositories>
    	<repository>
    		<id>wso2-nexus</id>
    		<name>WSO2 internal Repository</name>
    		<url>http://maven.wso2.org/nexus/content/groups/wso2-public/</url>
    		<releases>
    			<enabled>true</enabled>
    			<updatePolicy>daily</updatePolicy>
    			<checksumPolicy>ignore</checksumPolicy>
    		</releases>
    	</repository>
    </repositories>
    
  3. The admin service that executes Spark SQL is AnalyticsProcessorAdminService. Create an admin service stub instance as follows.
    		String DAS_USERNAME = "admin";
    		String DAS_PASSWORD = "admin";
    		String DAS_ANALYTICS_PROCESSOR_SERVICE_URL = "https://localhost:9444/services/AnalyticsProcessorAdminService";
    		// initiating admin service stub for executing script
    		AnalyticsProcessorAdminServiceStub stub = new AnalyticsProcessorAdminServiceStub(
    				DAS_ANALYTICS_PROCESSOR_SERVICE_URL);
    		ServiceClient client = stub._getServiceClient();
    		Options client_options = client.getOptions();
    		HttpTransportProperties.Authenticator authenticator = new HttpTransportProperties.Authenticator();
    		authenticator.setUsername(DAS_USERNAME);
    		authenticator.setPassword(DAS_PASSWORD);
    		authenticator.setPreemptiveAuthentication(true);
    		client_options.setProperty(
    				org.apache.axis2.transport.http.HTTPConstants.AUTHENTICATE,
    				authenticator);
    		client.setOptions(client_options);
    
    
  4. For this example, the WSO2 APIM summary data table API_REQUEST_SUMMARY is used; the columns referenced by the query are shown in the sketch below. You can deploy the APIM Analytics CApp to generate data in that table, or use a table of your own and prepare the query accordingly.
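    The following is only a sketch of registering the summary table with an explicit schema: the column names and types are assumptions inferred from the query in step 5 (the real API_REQUEST_SUMMARY table contains additional columns), and if the DAS table already carries a schema the schema option can be omitted, as in the complete example below.
    // Sketch only: register dataTable over API_REQUEST_SUMMARY with an explicit schema.
    // The columns/types here are assumptions based on the query in step 5; the real
    // APIM summary table has more columns, so adjust this to your deployment.
    String createTempTableWithSchema = "create temporary table dataTable USING CarbonAnalytics "
    		+ "OPTIONS(tableName \"API_REQUEST_SUMMARY\", "
    		+ "schema \"api STRING, version STRING, userId STRING, apiPublisher STRING, "
    		+ "total_request_count INT, year INT, month INT, day INT\")";
    stub.executeQuery(createTempTableWithSchema);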
  5. To execute the SQL query and get the response, first register a temporary Spark table on top of the DAS table, then run the query:
    String createTempTable = "create temporary table dataTable USING CarbonAnalytics OPTIONS(tableName \"API_REQUEST_SUMMARY\")";
    stub.executeQuery(createTempTable);

    String executeQuery = "select api, version, userId, apiPublisher, year, month, day, sum(total_request_count) as total_count from dataTable"
    				+ " group by api, version, userId, apiPublisher, year, month, day";
    AnalyticsQueryResultDto dto = stub.executeQuery(executeQuery);
    
    
  6. Parse the result data into a map. The helper below keys values by column name, so with multi-row results only the last row is kept; a variant that preserves every row is sketched after the method.
    private static Map<String, Object> convertToMap(AnalyticsQueryResultDto dto)
    			throws JSONException {
    	Map<String, Object> map = new HashMap<String, Object>();
    	String[] cols = dto.getColumnNames();
    	AnalyticsRowResultDto[] rows = dto.getRowsResults();

    	// Map each column name to its value; if the result has several rows,
    	// later rows overwrite earlier ones and only the last row is kept.
    	for (int j = 0; j < rows.length; j++) {
    		AnalyticsRowResultDto val = rows[j];
    		for (int i = 0; i < cols.length; i++) {
    			map.put(cols[i], val.getColumnValues()[i]);
    		}
    	}
    	return map;
    }
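    If every row is needed, a minimal variant could look like the sketch below. It reuses the same stub DTO calls (getColumnNames, getRowsResults, getColumnValues), additionally needs the java.util.List and java.util.ArrayList imports, and the method name convertToRowMaps is just illustrative.
    // Sketch: build one map per result row instead of overwriting a single map.
    private static List<Map<String, Object>> convertToRowMaps(AnalyticsQueryResultDto dto) {
    	List<Map<String, Object>> rowMaps = new ArrayList<Map<String, Object>>();
    	String[] cols = dto.getColumnNames();
    	for (AnalyticsRowResultDto row : dto.getRowsResults()) {
    		Map<String, Object> rowMap = new HashMap<String, Object>();
    		for (int i = 0; i < cols.length; i++) {
    			rowMap.put(cols[i], row.getColumnValues()[i]);
    		}
    		rowMaps.add(rowMap);
    	}
    	return rowMaps;
    }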
    
  7. Complete example
    package com.rukspot.sample.wso2.das;
    
    import java.util.HashMap;
    import java.util.Map;
    
    import org.apache.axis2.client.Options;
    import org.apache.axis2.client.ServiceClient;
    import org.apache.axis2.transport.http.HttpTransportProperties;
    import org.json.JSONException;
    import org.wso2.carbon.analytics.spark.admin.stub.AnalyticsProcessorAdminServiceStub;
    import org.wso2.carbon.analytics.spark.admin.stub.AnalyticsProcessorAdminServiceStub.AnalyticsQueryResultDto;
    import org.wso2.carbon.analytics.spark.admin.stub.AnalyticsProcessorAdminServiceStub.AnalyticsRowResultDto;
    
    import com.google.gson.Gson;
    
    public class ExecuteQuery {
    
    	public static void main(String[] args) throws Exception {
    		System.setProperty("javax.net.ssl.trustStore",
    				"/home/rukshan/wso2-jks/wso2carbon.jks");
    		System.setProperty("javax.net.ssl.trustStorePassword", "wso2carbon");
    
    		executeScriptOnDas();
    	}
    
    	public static void executeScriptOnDas() throws Exception {
    		System.out.println("Starting APIM STAT Script");
    		String DAS_USERNAME = "admin";
    		String DAS_PASSWORD = "admin";
    		String DAS_ANALYTICS_PROCESSOR_SERVICE_URL = "https://localhost:9444/services/AnalyticsProcessorAdminService";
    		// initiating admin service stub for executing script
    		AnalyticsProcessorAdminServiceStub stub = new AnalyticsProcessorAdminServiceStub(
    				DAS_ANALYTICS_PROCESSOR_SERVICE_URL);
    		ServiceClient client = stub._getServiceClient();
    		Options client_options = client.getOptions();
    		HttpTransportProperties.Authenticator authenticator = new HttpTransportProperties.Authenticator();
    		authenticator.setUsername(DAS_USERNAME);
    		authenticator.setPassword(DAS_PASSWORD);
    		authenticator.setPreemptiveAuthentication(true);
    		client_options.setProperty(
    				org.apache.axis2.transport.http.HTTPConstants.AUTHENTICATE,
    				authenticator);
    		client.setOptions(client_options);
    
    		String createTempTable = "create temporary table dataTable USING CarbonAnalytics OPTIONS(tableName \"API_REQUEST_SUMMARY\")";
    		stub.executeQuery(createTempTable);
    
    		String executeQuery = "select api, version, userId, apiPublisher, year, month, day, sum(total_request_count) as total_count from dataTable"
    				+ " group by api, version, userId, apiPublisher, year, month, day";
    		AnalyticsQueryResultDto dto = stub.executeQuery(executeQuery);
    
    		Gson g = new Gson();
    		System.out.println(g.toJson(dto.getColumnNames()));
    		System.out.println(g.toJson(dto.getRowsResults()));
    
    		Map<String, Object> map = convertToMap(dto);
    		System.out.println(g.toJson(map));
    	}
    
    	private static Map<String, Object> convertToMap(AnalyticsQueryResultDto dto)
    			throws JSONException {
    		Map<String, Object> map = new HashMap<String, Object>();
    		String[] cols = dto.getColumnNames();
    		AnalyticsRowResultDto[] rows = dto.getRowsResults();

    		// Map each column name to its value; if the result has several rows,
    		// later rows overwrite earlier ones and only the last row is kept.
    		for (int j = 0; j < rows.length; j++) {
    			AnalyticsRowResultDto val = rows[j];
    			for (int i = 0; i < cols.length; i++) {
    				map.put(cols[i], val.getColumnValues()[i]);
    			}
    		}
    		return map;
    	}
    }
    
    
