External Data Sources


Through DataFlux Func, external data sources such as MySQL can be quickly connected to TrueWatch, enabling seamless data querying and visualization.

Features

  • Native Query: Directly use the native query statements of the data source in charts without any additional conversion;
  • Data Protection: For data security and privacy, all data source information is stored only in your local Func, never on the platform, preventing data leaks;
  • Custom Management: Easily add and manage various external data sources according to actual needs;
  • Real-time Data: Directly connect to external data sources, obtain real-time data, and make immediate decisions.

Two Integration Paths

Adding Data Sources on the TrueWatch Side

On the TrueWatch side, you can add or view the connected DataFlux Func directly under Extensions and manage all connected external data sources from there.

Note

This path is more user-friendly than the second one and is recommended.

  1. From the drop-down list, select DataFlux Func;
  2. Select the supported data source type;
  3. Define the connection properties, including ID, data source title, associated host, port, database, user, and password (see the example values after this list);
  4. Test the connection as needed;
  5. Save.
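
The exact properties depend on the data source type. For a MySQL data source, the connection properties might look like the following sketch; every value below is hypothetical and should be replaced with your own environment's settings.

  • Example Connection Properties (hypothetical)
# Hypothetical connection properties for a MySQL data source;
# replace each value with your own environment's settings.
mysql_datasource = {
    'id'      : 'mysql',           # ID later referenced in scripts via DFF.CONN('mysql')
    'title'   : 'Business MySQL',  # Title displayed in the TrueWatch workspace
    'host'    : '192.168.0.10',    # Host of the MySQL instance
    'port'    : 3306,              # MySQL port
    'database': 'my_database',     # Database to query
    'user'    : 'reader',          # A read-only account is recommended
    'password': '********',        # Password (stored only in your local Func)
}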

Using Func to Query External Data Sources

Note

The term "external data sources" here has a broad definition, including common external data storage systems (such as MySQL, Redis, etc.) and third-party systems (e.g., the TrueWatch console).

Prerequisites

You need to download the corresponding installation package and deploy the Func platform by following the Quick Start guide.

After deployment, wait for the initialization to complete and log in to the platform.

Associating Func with TrueWatch

Through connectors, developers can connect Func to the TrueWatch system.

Go to Development > Connectors > Add Connector page:

  1. Select the connector type;
  2. Customize the ID of the connector;
  3. Add a title, which will be displayed in the TrueWatch workspace;
  4. Optionally, enter a description for the connector;
  5. Select the TrueWatch node;
  6. Add API Key ID and API Key;
  7. Optionally, test the connectivity;
  8. Save.

How to Get API Key

  1. Go to TrueWatch Workspace > Management > API Key Management;
  2. Click Create Key on the right side of the page;
  3. Enter a name;
  4. Click Confirm. The system automatically creates an API Key for you, which you can view in the API Key section.

For more details, refer to API Key Management.

After association, you can query external data sources from the Func platform in the following two ways: by using connectors or by writing scripts manually.

Using Connectors

After successfully adding a connector, you can use the connector ID in scripts to get the corresponding connector object.

Taking a MySQL connector with the ID mysql as an example, the code to get the connector object is:

mysql = DFF.CONN('mysql')
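
The connector object can then be used to run queries directly. A minimal sketch, assuming the mysql connector is configured and the database contains the my_table table described in the next section:

# Minimal usage sketch; assumes the 'mysql' connector exists and the
# database contains the my_table table used in the examples below.
mysql  = DFF.CONN('mysql')
sql    = 'SELECT id, username, reqCost FROM my_table WHERE reqCost > ? LIMIT 3'
db_res = mysql.query(sql, [100])   # Returns a list of row dicts
print(db_res)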

Writing Scripts Manually

In addition to using connectors, you can also write functions manually to query data.

Assume that a user has correctly created a MySQL connector (with the ID mysql) and that the MySQL database contains a table named my_table with the following data:

id  userId  username  reqMethod  reqRoute                          reqCost  createTime
1   u-001   admin     POST       /api/v1/scripts/:id/do/modify     23       1730840906
2   u-001   admin     POST       /api/v1/scripts/:id/do/publish    99       1730840906
3   u-002   zhang3    POST       /api/v1/scripts/:id/do/publish    3941     1730863223
4   u-002   zhang3    POST       /api/v1/scripts/:id/do/publish    159      1730863244
5   u-003   li4       POST       /api/v1/scripts/:id/do/publish    44       1730863335
...

Assume that you need to query this table data through a data query function, and the field extraction rules are as follows:

Original Field  Extracted As
createTime      Time: time
reqCost         Column: req_cost
reqMethod       Column: req_method
reqRoute        Column: req_route
userId          Tag: user_id
username        Tag: username

Then, the complete reference code is as follows:

  • Data Query Function Example
import json

@DFF.API('Query Data from my_table', category='dataPlatform.dataQueryFunc')
def query_from_my_table(time_range):
    # Get the connector object
    mysql = DFF.CONN('mysql')

    # MySQL Query Statement
    sql = '''
      SELECT
        createTime, userId, username, reqMethod, reqRoute, reqCost
      FROM
        my_table
      WHERE
        createTime     > ?
        AND createTime < ?
      LIMIT 5
    '''

    # time_range is in milliseconds, but the createTime field in MySQL
    # stores seconds, so the values need to be converted
    sql_params = [
      int(time_range[0] / 1000),
      int(time_range[1] / 1000),
    ]

    # Execute the query
    db_res = mysql.query(sql, sql_params)

    # Convert to DQL-like return result

    # Depending on the tags, multiple data series may need to be generated
    # Use the data series tags as Key to create a mapping table
    series_map = {}

    # Traverse the original data, convert the structure, and store it in the mapping table
    for d in db_res:
        # Collect tags
        tags = {
            'user_id' : d.get('userId'),
            'username': d.get('username'),
        }

        # Serialize the tags (tag keys need to be sorted to ensure consistent output)
        tags_dump = json.dumps(tags, sort_keys=True, ensure_ascii=True)

        # If the data series for this tag has not been established, create one
        if tags_dump not in series_map:
            # Basic structure of the data series
            series_map[tags_dump] = {
                'columns': [ 'time', 'req_cost', 'req_method', 'req_route' ], # Columns (the first column is fixed as time)
                'tags'   : tags,                                              # Tags
                'values' : [],                                                # Value list
            }

        # Extract time, columns, and append value
        series = series_map[tags_dump]
        value = [
            d.get('createTime') * 1000, # Time (output unit needs to be milliseconds, conversion is needed here)
            d.get('reqCost'),           # Column req_cost
            d.get('reqMethod'),         # Column req_method
            d.get('reqRoute'),          # Column req_route
        ]
        series['values'].append(value)

    # Add the outer structure of DQL
    dql_like_res = {
        # Data series
        'series': [ list(series_map.values()) ] # Note that an extra layer of array is needed here
    }
    return dql_like_res

If you only want to understand data conversion and processing, and are not concerned with the query process (or temporarily do not have an actual database to query), you can refer to the following code:

  • Data Query Function Example (Without MySQL Query Part)
import json

@DFF.API('Query Data from Somewhere', category='dataPlatform.dataQueryFunc')
def query_from_somewhere(time_range):
    # Assume that the original data has been obtained through some method
    db_res = [
        {'createTime': 1730840906, 'reqCost': 23,   'reqMethod': 'POST', 'reqRoute': '/api/v1/scripts/:id/do/modify',  'username': 'admin',  'userId': 'u-001'},
        {'createTime': 1730840906, 'reqCost': 99,   'reqMethod': 'POST', 'reqRoute': '/api/v1/scripts/:id/do/publish', 'username': 'admin',  'userId': 'u-001'},
        {'createTime': 1730863223, 'reqCost': 3941, 'reqMethod': 'POST', 'reqRoute': '/api/v1/scripts/:id/do/publish', 'username': 'zhang3', 'userId': 'u-002'},
        {'createTime': 1730863244, 'reqCost': 159,  'reqMethod': 'POST', 'reqRoute': '/api/v1/scripts/:id/do/publish', 'username': 'zhang3', 'userId': 'u-002'},
        {'createTime': 1730863335, 'reqCost': 44,   'reqMethod': 'POST', 'reqRoute': '/api/v1/scripts/:id/do/publish', 'username': 'li4',    'userId': 'u-003'}
    ]

    # Convert to DQL-like return result

    # Depending on the tags, multiple data series may need to be generated
    # Use the data series tags as Key to create a mapping table
    series_map = {}

    # Traverse the original data, convert the structure, and store it in the mapping table
    for d in db_res:
        # Collect tags
        tags = {
            'user_id' : d.get('userId'),
            'username': d.get('username'),
        }

        # Serialize the tags (tag keys need to be sorted to ensure consistent output)
        tags_dump = json.dumps(tags, sort_keys=True, ensure_ascii=True)

        # If the data series for this tag has not been established, create one
        if tags_dump not in series_map:
            # Basic structure of the data series
            series_map[tags_dump] = {
                'columns': [ 'time', 'req_cost', 'req_method', 'req_route' ], # Columns (the first column is fixed as time)
                'tags'   : tags,                                              # Tags
                'values' : [],                                                # Value list
            }

        # Extract time, columns, and append value
        series = series_map[tags_dump]
        value = [
            d.get('createTime') * 1000, # Time (output unit needs to be milliseconds, conversion is needed here)
            d.get('reqCost'),           # Column req_cost
            d.get('reqMethod'),         # Column req_method
            d.get('reqRoute'),          # Column req_route
        ]
        series['values'].append(value)

    # Add the outer structure of DQL
    dql_like_res = {
        # Data series
        'series': [ list(series_map.values()) ] # Note that an extra layer of array is needed here
    }
    return dql_like_res
  • Return Result Example
{
  "series": [
    [
      {
        "columns": ["time", "req_cost", "req_method", "req_route"],
        "tags": {"user_id": "u-001", "username": "admin"},
        "values": [
          [1730840906000, 23, "POST", "/api/v1/scripts/:id/do/modify" ],
          [1730840906000, 99, "POST", "/api/v1/scripts/:id/do/publish"]
        ]
      },
      {
        "columns": ["time", "req_cost", "req_method", "req_route"],
        "tags": {"user_id": "u-002", "username": "zhang3"},
        "values": [
          [1730863223000, 3941, "POST", "/api/v1/scripts/:id/do/publish"],
          [1730863244000,  159, "POST", "/api/v1/scripts/:id/do/publish"]
        ]
      },
      {
        "columns": ["time", "req_cost", "req_method", "req_route"],
        "tags": {"user_id": "u-003", "username": "li4"},
        "values": [
          [1730863335000, 44, "POST", "/api/v1/scripts/:id/do/publish"]
        ]
      }
    ]
  ]
}
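
Since the conversion logic does not depend on an actual database query, you can sanity-check the structure by calling the function directly with a millisecond time range inside the Func script editor; a minimal sketch (the time range values are arbitrary and only need to cover the sample data):

# Minimal check of the DQL-like structure; query_from_somewhere ignores
# the time range in this example, so any millisecond values work.
result = query_from_somewhere([1730840000000, 1730870000000])
print(json.dumps(result, indent=2, ensure_ascii=False))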

Management List

All connected data sources can be viewed in Integration > External Data Sources > Connected Data Sources.

In the list, you can perform the following operations:

  • View the type, ID, status, creation information, and update information of the data source;
  • Edit a data source; all configurations can be modified except the DataFlux Func, data source type, and ID;
  • Delete a data source.

Use Cases

In TrueWatch, a typical use case for querying external data sources is Charts > Chart Query.

Different Chart Data Returns

The structure of the returned data differs by chart type: line chart, pie chart, and table chart.