DAS 3.x Parse system variables to Spark Context
  1. Introduction

    In some cases, dynamic values need to be passed to the Spark context so that they can be used in Spark processing. The most convenient way to do this is to pass them as system variables using the -D option.

    However, the Spark context is initiated differently in a single-node and a clustered environment, so the following approaches can be taken to pass the variables to the Spark context.

  2. Single node environment

    Provide the system variables when starting the DAS Carbon server.
    Ex: bin/wso2server.sh -Dkey1=value1 -Dkey2=value2
    
    
  3. Distributed Environment

    In a distributed environment, Spark is spawned as a separate JVM, so properties provided to the Carbon server will not be passed on to the Spark JVM. Hence, you need to pass them to Spark explicitly. To do that, open

    <DAS_HOME>/repository/conf/analytics/spark/spark-defaults.conf and add the following property.
    spark.executor.extraJavaOptions -Dkey1=value1 -Dkey2=value2
    
    

    spark-defaults.conf is used to provide external properties and to override the default properties of the Spark context. With the above entry, these properties are passed to every executor spawned by the parent JVM.
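    Once the options reach the Spark JVM (the Carbon JVM in single-node mode, or the executor JVMs in a distributed setup), they can be read like any other Java system property. A minimal sketch, assuming a hypothetical property name key1 (the class name is also illustrative, not part of DAS):

    ```java
    // Minimal sketch: reading a -D system property inside the JVM that runs
    // the Spark code. "key1" and the class name are hypothetical examples.
    public class SparkPropertyReader {

        // Returns the property value, or the given fallback if it was not set.
        static String readKey(String key, String fallback) {
            return System.getProperty(key, fallback);
        }

        public static void main(String[] args) {
            System.out.println("key1 = " + readKey("key1", "not-set"));
        }
    }
    ```

    Running this JVM with -Dkey1=value1 prints key1 = value1; without the flag it falls back to not-set, which is a convenient way to verify whether the options actually reached the executor side.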

    Also note that these configurations work on other DAS-flavored servers such as APIM Analytics, EI Analytics, etc.

  4. References
    1. https://spark.apache.org/docs/latest/configuration.html
