Create a cApp for WSO2 DAS
  1. Introduction

    A DAS cApp (Carbon Application) is used to define artifacts and ship them to DAS as a single archive. There are about seven artifact types in total. Here I will show how to create a cApp for analyzing stream data. For that, we need four artifact types:

    Event Streams, Event Stores, Event Receivers, and Analytics Scripts.

    A simple example of this scenario: say WSO2 APIM needs statistics about its API usage. In that case, APIM publishes an event stream to DAS, and DAS runs the analytics and provides the summarized data. The simplest way to support this is to define the required artifacts as a cApp and deploy it to DAS. The cApp contains the stream definition and the data formats to process.

  2. Type of Artifacts
    • Event Streams - define the data format of the incoming streaming data
    • Event Stores - define the format and schema required to persist the streaming data
    • Event Receivers - bind the streams to the stores
    • Analytics Scripts - Spark scripts used to summarize the stream data
  3. Create Event Stream Artifact

    Let's define the stream format as below. A stream definition is saved in JSON format, and you need to provide a unique name for the stream. There are sections named metaData, correlationData, and payloadData: leave metaData and correlationData at their defaults, and modify the payloadData section as you wish, since it contains the actual data format you are publishing.

    Create a file named org.wso2.sample.stream_1.0.0.json and put the following content in it.

    {
        "streamId": "org.wso2.capp.sample:1.0.0",
        "name": "org.wso2.capp.sample",
        "version": "1.0.0",
        "nickName": "",
        "description": "sample data stream",
        "metaData": [],
        "correlationData": [],
        "payloadData": [
            { "name": "api", "type": "STRING" },
            { "name": "user", "type": "STRING" },
            { "name": "application", "type": "STRING" },
            { "name": "request", "type": "STRING" }
        ]
    }
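
    Conceptually, each event published on this stream carries one value per payload attribute, in the order defined above. A purely illustrative example of a single event's payload (the values are made up):

    { "api": "CalculatorAPI", "user": "admin", "application": "DefaultApplication", "request": "1" }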
    
    

    Next, we define this file as an artifact. To form the stream artifact, move the above JSON file into a directory called Eventstream_sample_1.0.0. Then add an artifact.xml inside that directory to define the artifact information.

    Add the following to artifact.xml:

    <?xml version="1.0" encoding="UTF-8"?>
    <artifact name="Eventstream_sample" version="1.0.0" type="event/stream" serverRole="DataAnalyticsServer">
        <file>org.wso2.sample.stream_1.0.0.json</file>
    </artifact>
    

    Make sure the artifact name and file name in artifact.xml match the directory and JSON file you created.
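
    At this point the stream artifact directory should look like this:

    Eventstream_sample_1.0.0/
    ├── artifact.xml
    └── org.wso2.sample.stream_1.0.0.json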

  4. Create Event Store Artifact

    Next, we define how to persist the data to internal DAS tables by creating the store artifact. Create another directory called Eventstore_sample_1.0.0; inside it, create ORG_WSO2_CAPP_SAMPLE.xml and artifact.xml as below.

    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <EventStoreConfiguration>
        <Source>
            <StreamId>org.wso2.capp.sample:1.0.0</StreamId>
        </Source>
        <TableSchema>
            <ColumnDefinition>
                <Name>api</Name>
                <EnableIndexing>true</EnableIndexing>
                <IsPrimaryKey>false</IsPrimaryKey>
                <EnableScoreParam>false</EnableScoreParam>
                <Type>STRING</Type>
            </ColumnDefinition>
            <ColumnDefinition>
                <Name>user</Name>
                <EnableIndexing>true</EnableIndexing>
                <IsPrimaryKey>false</IsPrimaryKey>
                <EnableScoreParam>false</EnableScoreParam>
                <Type>STRING</Type>
            </ColumnDefinition>
            <ColumnDefinition>
                <Name>application</Name>
                <EnableIndexing>true</EnableIndexing>
                <IsPrimaryKey>false</IsPrimaryKey>
                <EnableScoreParam>false</EnableScoreParam>
                <Type>STRING</Type>
            </ColumnDefinition>
            <ColumnDefinition>
                <Name>request</Name>
                <EnableIndexing>true</EnableIndexing>
                <IsPrimaryKey>false</IsPrimaryKey>
                <EnableScoreParam>false</EnableScoreParam>
                <Type>STRING</Type>
            </ColumnDefinition>
        </TableSchema>
    </EventStoreConfiguration>
    

    ORG_WSO2_CAPP_SAMPLE.xml contains ColumnDefinition tags, which define the schema of the table created for the stream. Note that the column types should match the attribute types in the stream definition. Each column definition has five child elements, as below; use them as appropriate.

    	<Name>consumerKey</Name> - the column name
    	<EnableIndexing>true</EnableIndexing> - whether to index the column; this should be true if you hope to query the table through the REST API
    	<IsPrimaryKey>false</IsPrimaryKey> - whether the column is part of the primary key
    	<EnableScoreParam>false</EnableScoreParam> - whether the column can be used as a score parameter in search queries
    	<Type>STRING</Type> - the data type of the column

    The Type element supports the primitive types plus a special type called FACET. Use FACET if you are hoping to use drill-down operations.
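
    For example, a facet column definition would look like the sketch below (the apiContext column is made up for illustration, and the FACET type is assumed to be available in your DAS version):

    <ColumnDefinition>
        <Name>apiContext</Name>
        <EnableIndexing>true</EnableIndexing>
        <IsPrimaryKey>false</IsPrimaryKey>
        <EnableScoreParam>false</EnableScoreParam>
        <Type>FACET</Type>
    </ColumnDefinition>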

    Then define the artifact.xml similar to this:

    <?xml version="1.0" encoding="UTF-8"?>
    <artifact name="Eventstore_sample" version="1.0.0" type="analytics/eventstore" serverRole="DataAnalyticsServer">
        <file>ORG_WSO2_CAPP_SAMPLE.xml</file>
    </artifact>
    
  5. Create Event Receiver Artifact

    Next, we need to bind the stream to the table using a receiver. Create another directory, Eventreceiver_sample_1.0.0, for the receiver. Then create EventReceiver_sample.xml and artifact.xml with the following content.

    EventReceiver_sample.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <eventReceiver name="EventReceiver_sample" statistics="disable" trace="disable" xmlns="http://wso2.org/carbon/eventreceiver">
        <from eventAdapterType="wso2event">
            <property name="events.duplicated.in.cluster">false</property>
        </from>
        <mapping customMapping="disable" type="wso2event"/>
        <to streamName="org.wso2.capp.sample" version="1.0.0"/>
    </eventReceiver>
    
    artifact.xml
    <?xml version="1.0" encoding="UTF-8"?>
    <artifact name="Eventreceiver_sample" version="1.0.0" type="event/receiver" serverRole="DataAnalyticsServer">
    	<file>EventReceiver_sample.xml</file>
    </artifact>
    
  6. Create Analytics Script Artifact

    Finally, you need to define the Spark script that does the actual summarization, along with its artifact. Create a Sparkscripts_1.0.0 directory and create sample_script.xml and artifact.xml as below.

    sample_script.xml
    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <Analytics>
        <Name>sample_script</Name>
        <Script>
            create temporary table sampleData USING CarbonAnalytics OPTIONS(tableName "ORG_WSO2_CAPP_SAMPLE");
            select * from sampleData;
        </Script>
        <CronExpression>0 0/5 * 1/1 * ? *</CronExpression>
    </Analytics>
    

    artifact.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <artifact name="Sparkscripts" version="1.0.0" type="analytics/spark" serverRole="DataAnalyticsServer">
    	<file>sample_script.xml</file>
    </artifact>
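
    The script above simply selects the raw data, and the CronExpression schedules it to run every five minutes. In practice the script would write summarized results into a second table. A minimal sketch under that assumption (the summary table ORG_WSO2_CAPP_SAMPLE_SUMMARY, its schema, and the per-API count are made up for illustration):

    create temporary table sampleData USING CarbonAnalytics OPTIONS(tableName "ORG_WSO2_CAPP_SAMPLE");
    create temporary table sampleSummary USING CarbonAnalytics OPTIONS(tableName "ORG_WSO2_CAPP_SAMPLE_SUMMARY", schema "api STRING, total_requests INT", primaryKeys "api");
    insert into table sampleSummary select api, cast(count(*) as int) as total_requests from sampleData group by api;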
    
  7. Create Deployable Artifact

    We have now defined all the artifacts for the cApp, so we can create the deployable archive from them. Create an artifacts.xml file in the same directory that contains all the other artifact directories, then define it as below.

    artifacts.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <artifacts>
        <artifact name="SAMPLE_CAPP" version="v1.0.0" type="carbon/application">
            <dependency artifact="Eventstream_sample" version="1.0.0" include="true" serverRole="DataAnalyticsServer"/>
            <dependency artifact="Eventstore_sample" version="1.0.0" include="true" serverRole="DataAnalyticsServer"/>
            <dependency artifact="Eventreceiver_sample" version="1.0.0" include="true" serverRole="DataAnalyticsServer"/>
            <dependency artifact="Sparkscripts" version="1.0.0" include="true" serverRole="DataAnalyticsServer"/>
        </artifact>
    </artifacts>
    

    A cApp is a normal .zip file with a .car extension. To create the archive, select all the artifact directories together with artifacts.xml and zip them, so that the artifact directories and artifacts.xml sit at the root level of the zip. After zipping, rename the extension from .zip to .car.
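
    For reference, the finished archive (built from the directories created above) should look like this:

    SAMPLE_CAPP.car
    ├── artifacts.xml
    ├── Eventstream_sample_1.0.0/
    │   ├── artifact.xml
    │   └── org.wso2.sample.stream_1.0.0.json
    ├── Eventstore_sample_1.0.0/
    │   ├── artifact.xml
    │   └── ORG_WSO2_CAPP_SAMPLE.xml
    ├── Eventreceiver_sample_1.0.0/
    │   ├── artifact.xml
    │   └── EventReceiver_sample.xml
    └── Sparkscripts_1.0.0/
        ├── artifact.xml
        └── sample_script.xml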

  8. Deploy the Artifact

    Next, you can deploy the archive from the DAS web console. For more about cApp deployment, follow this blog.
  9. Download the sample SAMPLE_CAPP.car from here
