
Webview support in the Zabaan SDK

This document is a guide for developers implementing the Zabaan SDK inside WebViews.

For a standard Android Activity/Fragment based integration, this document can be ignored.

Overview of steps required by developers

  • Calculating the X,Y position at which the finger animation should appear, using the getComputedPosition() function.
  • Monitoring for screen changes using doUpdateVisitedHistory.
  • Invoking the JS function to get the X,Y coordinates.
  • Creating a CoordinateInteractionRequest and playing the interaction.
  • Detecting when a screen change occurs and calling Zabaan.getInstance().stopZabaanInteraction(); if an interaction is active.
  • Chaining interaction requests for a screen.

If you are looking to implement this within a WebView, there are certain responsibilities that shift to the app developer. They are listed below:

Calculating Finger animation position at runtime.

You will need to inject this function into the page's JavaScript code.

function getComputedPosition(id) {
    var el = document.getElementById(id);
    if (el === null) {
        return null; // element not loaded yet
    }
    const rect = el.getBoundingClientRect();

    // account for page scroll so the coordinates are relative to the document
    const scrollLeft = window.pageXOffset;
    const scrollTop = window.pageYOffset;

    return {
        centreX: rect.left + rect.width / 2 + scrollLeft,
        centreY: rect.top + rect.height / 2 + scrollTop
    };
}

Screen change triggers

In an Activity based architecture, the SDK can detect screen changes. However, in a webview context, it is the responsibility of the app developer to inform the SDK of a change in screen.

webView.setWebViewClient(new WebViewClient() {
    @Override
    public void doUpdateVisitedHistory(WebView view, String url, boolean isReload) {
        // Detect when the screen changes and set the appropriate state.
        if (url.contains("<link_for_home_page>")) {
            getValuesFromJS("id_of_div_to_point_finger", "state_for_homepage", 0);
        }
        super.doUpdateVisitedHistory(view, url, isReload);
    }
});
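The overview above also asks you to stop any active interaction when the screen changes. A minimal sketch of how this could fit into the same callback; the `interactionActive` flag is a hypothetical helper field you would maintain yourself (for example, set it when submitting a request and clear it in assistantSpeechEnd):

```java
// Sketch: stop a still-playing interaction before navigating to a new screen.
// Zabaan.getInstance().stopZabaanInteraction() is the call named in the
// overview; interactionActive is a hypothetical bookkeeping field.
private boolean interactionActive = false;

@Override
public void doUpdateVisitedHistory(WebView view, String url, boolean isReload) {
    if (interactionActive) {
        Zabaan.getInstance().stopZabaanInteraction();
        interactionActive = false;
    }
    // ...then request the interaction for the new screen, as shown above.
    super.doUpdateVisitedHistory(view, url, isReload);
}
```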

// This function retrieves the x,y position for an element in the HTML.
private void getValuesFromJS(String divName, String state, Integer indexPosition) {

    /* A slight delay is required here because doUpdateVisitedHistory is called
       as soon as the screen switches. However, navigation within JS is handled
       slightly differently: there is a parent "screen" -- index.html -- that,
       onLoad, subsequently loads snippets of HTML as found in the
       main/assets/client/views folder. As a result, if you immediately query a
       position by id, those ids will not be found yet. */
    new android.os.Handler(Looper.getMainLooper()).postDelayed(
        new Runnable() {
            @Override
            public void run() {
                webview.evaluateJavascript("getComputedPosition('" + divName + "');", response -> {
                    double x = 0.0;
                    double y = 0.0;
                    try {
                        JSONObject jsonObject = new JSONObject(response);
                        x = jsonObject.getDouble("centreX");
                        y = jsonObject.getDouble("centreY");
                    } catch (JSONException e) {
                        // element not found yet, or response was not valid JSON
                    }
                    createInteractionRequest(x, y, state, indexPosition);
                });
            }
        }, 150);
}

Invoking JS function to get X,Y coordinates.

To invoke the JS function defined above, use the getValuesFromJS() function shown in the previous section from the Java class that hosts the WebView.
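Before it can be invoked, the getComputedPosition() helper must actually be present in the page. One way to inject it (a sketch; the JS_HELPER constant is an assumption, holding the JavaScript source shown earlier) is from onPageFinished on the same WebViewClient that overrides doUpdateVisitedHistory:

```java
// Add to the same WebViewClient that overrides doUpdateVisitedHistory.
// JS_HELPER is a hypothetical String constant containing the
// getComputedPosition() function source shown earlier.
@Override
public void onPageFinished(WebView view, String url) {
    view.evaluateJavascript(JS_HELPER, null);
    super.onPageFinished(view, url);
}
```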

Creating Interaction Requests

On trigger of the assistant (use AssistantStateListener to capture the event on the assistant image), submit an interaction request to the SDK as depicted in the Creating InteractionRequests section. The audio file is identified by a combination of the screen name and the index; this bookkeeping needs to be done in the Activity that hosts the WebView. Once the x,y position has been calculated and you want Zabaan to play audio while pointing the finger, create a CoordinateInteractionRequest.

Below is example code that creates a CoordinateInteractionRequest and submits it to Zabaan.

public void createInteractionRequest(double x, double y, String state, Integer index) {
        // Convert to dp, as the interaction request positions the finger
        // using dp-based values.
        float logicalDensity = getResources().getDisplayMetrics().density;
        int myX = (int) (x / logicalDensity);
        int myY = (int) (y / logicalDensity);

        CoordinateInteractionRequest cRequest = new CoordinateInteractionRequest.Builder()
                .setIndex(index) // index value from the Zabaan CMS
                .setX(myX)       // X value for the finger animation (dp)
                .setY(myY)       // Y value for the finger animation (dp)
                .setState(state) // state defined in the CMS for the message to be played
                .build();

        // Submitting this request plays the audio while showing the finger
        // animation at the given X,Y position; see the Creating
        // InteractionRequests section of the main integration guide.
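The px-to-dp conversion above is pure arithmetic (dp = px / density) and can be isolated into a small helper; the class and method names here are hypothetical, for illustration only:

```java
// Hypothetical helper isolating the px -> dp conversion used above:
// dp = px / density, truncated to an int as in createInteractionRequest.
public class DpConverter {
    public static int pxToDp(double px, float density) {
        return (int) (px / density);
    }
}
```

For example, an element centred at 540 px on a density-2.0 screen maps to 270 dp.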

Chaining interaction requests for a screen.

Zabaan provides you with appropriate callbacks for when audio and animations have stopped (i.e. assistantSpeechEnd) or run into error (assistantSpeechError).

The ZabaanSpeakable object passed to assistantSpeechEnd has a getIndex() function, which returns the index sent along with the CoordinateInteractionRequest whose audio has just finished playing. So, to chain the next CoordinateInteractionRequest, you have to keep track of the index value.

public void assistantSpeechEnd(ZabaanSpeakable interaction) {

    // remember - the base index starts from 0
    int currentInteractionIndex = interaction.getIndex();

    // incrementing the index to play the next interaction
    int newInteractionIndex = interaction.getIndex() + 1;

    if (currentInteractionIndex == 1) {
        if (interaction.getState().equals(<state x>)) {
            // CREATE INTERACTION REQUEST for state x
        } else if (interaction.getState().equals(<state y>)) {
            // CREATE INTERACTION REQUEST for state y
        }
    }
}

For more detailed information about Zabaan features, please refer to the main document below.

Zabaan Integration guide
