Publishing APIM 1.10.0 Runtime Statistics to DAS

APIM 1.10.x publishes statistics to DAS by default and no longer supports BAM. Previous APIM versions used BAM, which wrote summarized data to a separate RDBMS from which APIM then fetched it. With DAS, APIM reads the data directly from DAS through the DAS REST API, so APIM 1.10.x does not need an RDBMS to generate statistics. However, APIM 1.10.x still supports an RDBMS-based setup like APIM 1.9.x did; the default simply generates summarized data without one. This blog therefore covers configuring APIM and DAS without an RDBMS. If you still want to use an RDBMS, that is possible, and a blog about it will follow.

  1. Prerequisites
Following are the prerequisites and the links to download them.
    • WSO2 API Manager 1.10.0 from here
    • WSO2 DAS 3.0.0 from here
  2. Configure WSO2 DAS

    If APIM and DAS run on the same machine, avoid port conflicts by increasing the default service ports of DAS with an offset value in <DAS_HOME>/repository/conf/carbon.xml

      <Offset>1</Offset>
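
      Note that the offset shifts every port DAS opens, not just the management console. With an offset of 1, for example:

        HTTPS management port : 9443 -> 9444
        Thrift event receiver : 7611 -> 7612

      These shifted values are what you enter later in the APIM analytics configuration.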
      
  3. Configure WSO2 API Manager
    • Since APIM 1.10.x is not released yet, you can build it from source.
    • Start APIM.
    • Go to the admin dashboard at https://localhost:9443/admin-dashboard/
    • Log in and click Configure Analytics under the Settings section.
    • Tick the Enable checkbox; the analytics settings will appear.
    • Set the Event Receiver Configurations according to your DAS setup.
    • Then click the Add URL group button to save them.
    • Enter the Data Analyzer Configurations according to your DAS setup.
    • Once done, click Save.
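
    A sample configuration, assuming DAS runs on the same machine with the offset of 1 set earlier (admin/admin are the WSO2 default credentials; change them in production):

      Event Receiver Configurations:
        URL group : tcp://localhost:7612
        Username  : admin
        Password  : admin

      Data Analyzer Configurations:
        URL      : https://localhost:9444
        Username : admin
        Password : admin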

    DAS configuration overview

  4. Invoke Sample API and Get the Statistics
      • Let's invoke an API to generate traffic and see the statistics

    Deploy Sample Weather API

      • Deploy the sample WeatherAPI by logging in to the APIM Publisher

    Sample Weather API

      • Then log in to the Store and subscribe to the API you created
      • Invoke the API using the Store's API Console or curl

    Invoke the API
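
    As a sketch, a curl invocation might look like the following; the gateway port (8243 by default), the API context, and the version shown here are assumptions, and <ACCESS_TOKEN> is a placeholder for the token generated when you subscribed in the Store. Use the exact endpoint shown in the Store's API Console.

      curl -k -H "Authorization: Bearer <ACCESS_TOKEN>" "https://localhost:8243/weather/1.0.0/"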

      • Then wait a few minutes (< 5 mins) for the analytics to be generated
      • Then navigate to the Publisher's Statistics section and click API Usage

    WeatherAPI usage

  5. Data Purge (Optional)

    Data purging is an option for removing historical data in DAS. Since DAS does not allow you to delete table data or drop tables, this option is very important. With data purging, you can keep data analysis fast without removing the analyzed summary data.

    Here we purge only the raw stream data fired by APIM. This data is contained in the following tables.

      ORG_WSO2_APIMGT_STATISTICS_DESTINATION
      ORG_WSO2_APIMGT_STATISTICS_FAULT
      ORG_WSO2_APIMGT_STATISTICS_REQUEST
      ORG_WSO2_APIMGT_STATISTICS_RESPONSE
      ORG_WSO2_APIMGT_STATISTICS_WORKFLOW
      ORG_WSO2_APIMGT_STATISTICS_THROTTLE
    

    Make sure not to purge data from any tables other than the above; doing so will wipe out your summarized historical data.

    There are two ways to purge data in DAS.

      1. Using the admin console
        • Go to the Data Explorer and select one of the above tables at a time.
        • Then click the Schedule Data Purge button.
        • Then set the time and the number of days of data to retain.
        • Repeat this for all of the above tables and wait for the purge to run.

    Data Purge Dialog box

      2. Global method (note that this affects all tenants)
        • Open the <DAS_HOME>/repository/conf/analytics/analytics-config.xml
        • Change the content of the <analytics-data-purging> tag as below
      <analytics-data-purging>
        <!-- Indicates whether purging is enabled. To enable data purging for a cluster, this property must be enabled on all nodes -->
        <purging-enable>true</purging-enable>
        <cron-expression>0 0 12 * * ?</cron-expression>
        <!-- Tables to include in purging. Use a regular expression to specify the table names. The pattern below matches only the APIM statistics tables listed above; do not use a broader pattern such as .* here, or the summarized data will be purged too -->
        <purge-include-table-patterns>
          <table>ORG_WSO2_APIMGT_STATISTICS_.*</table>
        </purge-include-table-patterns>
        <!-- All records inserted before the specified retention period become eligible for purging -->
        <data-retention-days>365</data-retention-days>
      </analytics-data-purging>
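
    The <cron-expression> element uses Quartz cron syntax (seconds, minutes, hours, day-of-month, month, day-of-week); the value 0 0 12 * * ? above means "every day at noon". A few illustrative examples:

      0 0 12 * * ?     every day at 12:00 noon
      0 0 2 ? * SUN    every Sunday at 02:00
      0 30 1 1 * ?     at 01:30 on the first day of each month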
    
