Integration Testing

This document describes how to use the TuGraph integration testing framework.

1.The Significance of TuGraph Integration Testing

In unit tests and functional tests, some test cases use Galaxy or Statemachine directly to perform the test, which does not exercise the complete request path. In the complete client-server (CS) architecture, user requests are sent to the server through the client, so network communication is essential. To catch bugs that incomplete unit testing cannot cover, TuGraph uses an integration testing framework to perform end-to-end testing.

2.TuGraph Integration Testing Framework

TuGraph uses Pytest as its integration testing framework. Pytest is one of the most widely used frameworks for client-server integration testing, known for its flexibility, ease of use, and support for parameterization. On top of the functionality provided by Pytest, TuGraph abstracts the different tools it needs and controls the processing logic of each tool through parameters, so that test code can be developed efficiently.

For more information on Pytest, please refer to the official website: https://docs.pytest.org/en/7.2.x/getting-started.html
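
The per-tool parameters reach the fixtures through Pytest's indirect parametrization: the dictionary passed to @pytest.mark.parametrize(..., indirect=True) is delivered to the fixture as request.param. The following minimal sketch only illustrates this mechanism; my_tool, MYTOOLOPT and test_my_tool are hypothetical names, and the snippet is not the actual TuGraph fixture code.

import pytest

@pytest.fixture
def my_tool(request):
    # request.param receives the option dictionary supplied through
    # @pytest.mark.parametrize(..., indirect=True).
    opt = request.param
    print("setup:", opt["cmd"])   # set up the tool here (child process, client, ...)
    yield opt                     # the test function runs at this point
    print("teardown")             # stop the tool and clean up afterwards

MYTOOLOPT = {"cmd": "echo hello"}

@pytest.mark.parametrize("my_tool", [MYTOOLOPT], indirect=True)
def test_my_tool(my_tool):
    assert my_tool["cmd"] == "echo hello"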

2.1.Component Description

| Component Name | Component Function | Implementation Method |
|----------------|--------------------|------------------------|
| server | TuGraph standalone service | Start a child process and launch the service |
| client | TuGraph RPC Client | Open the TuGraph Python RPC Client in the current process and send requests |
| importor | TuGraph import tool | Start a child process to process import requests |
| exportor | TuGraph export tool | Start a child process to process export requests |
| backup_binlog | TuGraph binlog backup | Start a child process to process binlog backup requests |
| backup_copy_dir | TuGraph full backup | Start a child process to process full db backup requests |
| build_so | Compile C++ dynamic libraries | Start a child process to handle the GCC compilation logic |
| copy_snapshot | TuGraph snapshot copy | Handle backup snapshot requests in the current process |
| copy_dir | Folder copy | Handle folder copy requests in the current process |
| exec | Execute C++/Java executable files | Start a child process to launch the executable file |
| algo | Execute algorithm | Start a child process to run the algorithm |
| bash | Execute bash commands | Start a child process to execute bash commands |
| rest | TuGraph Python REST Client | Open the TuGraph Python REST Client in the current process and send requests |

2.2.Component Usage

2.2.1.server

2.2.1.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • cmd is the startup command

  • cleanup_dir lists the directories to be cleaned up after execution; it is passed in as a Python list and may contain multiple directories

SERVEROPT = {"cmd":"./lgraph_server -c lgraph_standalone.json --directory ./testdb --license _FMA_IGNORE_LICENSE_CHECK_SALTED_ --port 7072 --rpc_port 9092",
             "cleanup_dir":["./testdb"]}
2.2.1.2.Startup Command

Import the tool through its Pytest fixture and control the processing logic through the startup parameters. The server is started before the test function begins executing; after the function finishes, the server is stopped and the directories specified in cleanup_dir are cleaned up.

@pytest.mark.parametrize("server", [SERVEROPT], indirect=True)
def test_server(self, server):
    pass
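
As a rough illustration of this pattern (and not the actual TuGraph fixture implementation), such a fixture could launch the server as a child process, yield to the test, and afterwards stop the process and remove the cleanup_dir directories, for example with subprocess and shutil:

import shutil
import subprocess
import time

import pytest

@pytest.fixture
def server(request):
    opt = request.param
    # Launch the server command in a child process before the test runs.
    proc = subprocess.Popen(opt["cmd"].split())
    time.sleep(3)                              # naive wait for the service to come up
    yield proc                                 # the test function executes here
    # Stop the server and clean up the configured directories afterwards.
    proc.terminate()
    proc.wait()
    for d in opt.get("cleanup_dir", []):
        shutil.rmtree(d, ignore_errors=True)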

2.2.2.client

2.2.2.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • host is the IP and port of the TuGraph Server

  • user is the username of the TuGraph Server

  • password is the password corresponding to the user in the TuGraph Server

CLIENTOPT = {"host":"127.0.0.1:9092", "user":"admin", "password":"73@TuGraph"}
2.2.2.2.Startup Command

Import the tool through its Pytest fixture and control the processing logic through the startup parameters. The client is started before the test function begins executing and stopped after the function finishes.

@pytest.mark.parametrize("server", [SERVEROPT], indirect=True)
@pytest.mark.parametrize("client", [CLIENTOPT], indirect=True)
def test_client(self, server, client):
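    # Create the same edge label twice: the first call should succeed and the second should fail.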
    ret = client.callCypher("CALL db.createEdgeLabel('followed', '[]', 'address', string, false, 'date', int32, false)", "default")
    assert ret[0]
    ret = client.callCypher("CALL db.createEdgeLabel('followed', '[]', 'address', string, false, 'date', int32, false)", "default")
    assert ret[0] == False

2.2.3.importor

2.2.3.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • cmd is the startup command

  • cleanup_dir lists the directories to be cleaned up after execution; it is passed in as a Python list and may contain multiple directories

IMPORTOPT = {"cmd":"./lgraph_import --config_file ./data/yago/yago.conf --dir ./testdb --user admin --password 73@TuGraph --graph default --overwrite 1",
             "cleanup_dir":["./testdb", "./.import_tmp"]}
2.2.3.2.Startup Command

Import the tool through its Pytest fixture and control which data is imported through the startup parameters. The data is imported into the specified directory before the test function begins executing, and the directories specified in cleanup_dir are cleaned up after the function finishes.

@pytest.mark.parametrize("importor", [IMPORTOPT], indirect=True)
def test_importor(self, importor):
    pass

2.2.4.exportor

2.2.4.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • cmd is the startup command

  • cleanup_dir lists the directories to be cleaned up after execution; it is passed in as a Python list and may contain multiple directories

EXPORT_DEF_OPT = {"cmd":"./lgraph_export -d ./testdb -e ./export/default -g default -u admin -p 73@TuGraph",
                  "cleanup_dir":["./export"]}
2.2.4.2.Startup Command

Import the tool through its Pytest fixture and control which data is exported through the startup parameters. The data is exported to the specified directory before the test function begins executing, and the directories specified in cleanup_dir are cleaned up after the function finishes.

@pytest.mark.parametrize("exportor", [EXPORT_DEF_OPT], indirect=True)
def test_exportor(self, exportor):
    pass

2.2.5.backup_binlog

2.2.5.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • cmd is the startup command

  • cleanup_dir lists the directories to be cleaned up after execution; it is passed in as a Python list and may contain multiple directories

BINLOGOPT = {"cmd" : "./lgraph_binlog -a restore --host 127.0.0.1 --port 9093 -u admin -p 73@TuGraph -f ./testdb/binlog/*",
             "cleanup_dir":[]}
2.2.5.2.Startup Command

Import the tool through its Pytest fixture and control which binlogs are backed up through the startup parameters. The binlogs are copied to the specified directory before the test function begins executing, and the directories specified in cleanup_dir are cleaned up after the function finishes.

@pytest.mark.parametrize("backup_binlog", [BINLOGOPT], indirect=True)
def test_backup_binlog(self, backup_binlog):
    pass

2.2.6.backup_copy_dir

2.2.6.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • cmd is the startup command

  • cleanup_dir lists the directories to be cleaned up after execution; it is passed in as a Python list and may contain multiple directories

BACKUPOPT = {"cmd" : "./lgraph_backup --src ./testdb -dst ./testdb1",
             "cleanup_dir":[]}
2.2.6.2.Startup Command

Import the tool through its Pytest fixture and control which database is backed up through the startup parameters. The database is copied to the specified directory before the test function begins executing, and the directories specified in cleanup_dir are cleaned up after the function finishes.

@pytest.mark.parametrize("backup_copy_dir", [BACKUPOPT], indirect=True)
def test_backup_copy_dir(self, backup_copy_dir):
    pass

2.2.7.build_so

2.2.7.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • cmd is the startup command, passed in as a Python list; multiple .so files can be compiled at once

  • so_name lists the dynamic libraries to be cleaned up after execution; it is passed in as a Python list and may contain multiple libraries

BUILDOPT = {"cmd":["g++ -fno-gnu-unique -fPIC -g --std=c++17 -I ../../include -I ../../deps/install/include -rdynamic -O3 -fopenmp -DNDEBUG -o ./scan_graph.so ../../test/test_procedures/scan_graph.cpp ./liblgraph.so -shared",
                   "g++ -fno-gnu-unique -fPIC -g --std=c++17 -I ../../include -I ../../deps/install/include -rdynamic -O3 -fopenmp -DNDEBUG -o ./sortstr.so ../../test/test_procedures/sortstr.cpp ./liblgraph.so -shared"],
            "so_name":["./scan_graph.so", "./sortstr.so"]}
2.2.7.2.Startup Command

Import the tool through its Pytest fixture and control which dynamic libraries are compiled through the startup parameters. The dynamic libraries are generated in the specified directory before the test function begins executing, and the libraries listed in so_name are cleaned up after the function finishes.

@pytest.mark.parametrize("build_so", [BUILDOPT], indirect=True)
def test_build_so(self, build_so):
    pass

2.2.8.copy_snapshot

2.2.8.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • src is the original database

  • dst is the destination directory that the backup snapshot is copied to

COPYSNAPOPT = {"src" : "./testdb", "dst" : "./testdb1"}
2.2.8.2.Startup Command

Import the tool through its Pytest fixture and control which snapshot is copied through the startup parameters. The snapshot in src is copied to the directory specified in dst before the test function begins executing.

@pytest.mark.parametrize("copy_snapshot", [COPYSNAPOPT], indirect=True)
def test_copy_snapshot(self, copy_snapshot):
    pass

2.2.9.copy_dir

2.2.9.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • src is the original directory

  • dst is the destination directory of the copy

COPYDIR = {"src" : "./testdb", "dst" : "./testdb1"}
2.2.9.2.Startup Command

Import the tool through its Pytest fixture and control which directory is copied through the startup parameters. The directory in src is copied to the directory specified in dst before the test function begins executing.

@pytest.mark.parametrize("copy_dir", [COPYDIR], indirect=True)
def test_copy_dir(self, copy_dir):
    pass

2.2.10.exec

2.2.10.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • cmd is the startup command

EXECOPT = {
        "cmd" : "test_rpc_client/cpp/CppClientTest/build/clienttest"
    }
2.2.10.2.Startup Command

Import the tool through its Pytest fixture and control which logic is executed through the startup parameters. A child process is started to execute the command passed in through the cmd parameter before the test function begins executing.

@pytest.mark.parametrize("exec", [EXECOPT], indirect=True)
def test_exec(self, exec):
    pass

2.2.11.algo

2.2.11.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • cmd is the startup command

  • result is the expected execution result of the algorithm. After execution is completed, the actual result will be compared with the expected result. If they are different, the test will fail.

BFSEMBEDOPT = {
        "cmd" : "algo/bfs_embed ./testdb",
        "result" : ["found_vertices = 3829"]
    }
2.2.11.2.Startup Command

Import the tool through its Pytest fixture and control which algorithm is run through the startup parameters. A child process is started to run the algorithm passed in through the cmd parameter before the test function begins executing; the fixture waits for the algorithm to complete and compares its output with the expected result.

@pytest.mark.parametrize("algo", [BFSEMBEDOPT], indirect=True)
def test_exec_bfs_embed(self, algo):
    pass
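
A minimal sketch of how this output check could be performed is shown below, assuming the algorithm's stdout/stderr is captured with subprocess; run_algo_and_check is an illustrative helper, not the actual fixture code.

import subprocess

def run_algo_and_check(opt):
    # Run the algorithm command and capture its output.
    proc = subprocess.run(opt["cmd"].split(), capture_output=True, text=True)
    output = proc.stdout + proc.stderr
    # Every expected fragment must appear in the output, otherwise the check fails.
    for expected in opt["result"]:
        assert expected in output, f"missing expected output: {expected}"

For example, run_algo_and_check(BFSEMBEDOPT) fails unless "found_vertices = 3829" appears in the output of bfs_embed.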

2.2.12.bash

2.2.12.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • cmd is the startup command

BASHOPT = {
        "cmd" : "sh ./test_rpc_client/cpp/CppClientTest/compile.sh"
    }
2.2.12.2.Startup Command

Import the tool through its Pytest fixture and control which bash command is executed through the startup parameters. A child process is started to execute the bash command passed in through the cmd parameter before the test function begins executing, and the fixture waits for the command to complete.

@pytest.mark.parametrize("bash", [BASHOPT], indirect=True)
def test_bash(self, bash):
    pass

2.2.13.rest

2.2.13.1.Startup Parameters

Use a Python dictionary to pass in the parameters:

  • port is the port of the TuGraph Server

  • user is the username of the TuGraph Server

  • password is the password corresponding to user in TuGraph Server

RESTTOPT = {"port":"7073", "user":"admin", "password":"73@TuGraph"}
2.2.13.2.Startup Command

Import the tool through its Pytest fixture and connect to different TuGraph REST servers through the startup parameters. The client is started before the test function begins executing and stopped after the function finishes.

@pytest.mark.parametrize("rest", [RESTTOPT], indirect=True)
def test_get_info(self, server, rest):
    pass

2.3.Test Cases

2.3.1.rest

In the sample code, the server is started before test_get_info is executed, and a REST client is created once the server is up. Inside test_get_info, information about the server is obtained, and assert is used to check that CPU information is present.

SERVEROPT = {"cmd":"./lgraph_server -c lgraph_standalone.json --directory ./testdb --license _FMA_IGNORE_LICENSE_CHECK_SALTED_ --port 7073 --rpc_port 9093",
               "cleanup_dir":["./testdb"]}
RESTTOPT = {"port":"7073", "user":"admin", "password":"73@TuGraph"}
@pytest.mark.parametrize("server", [SERVEROPT], indirect=True)
@pytest.mark.parametrize("rest", [RESTTOPT], indirect=True)
def test_get_info(self, server, rest):
    res = rest.get_server_info()
    log.info("res : %s", res)
    assert('cpu' in res)

2.3.2.client

In the sample code, before test_flushdb is executed, the offline data import is performed and the server is started, after which a connection is created through the client. Inside test_flushdb, the number of vertices is queried to verify that the import succeeded; the flushDB operation is then executed, and assert is used again to check that the db can be emptied normally.

SERVEROPT = {"cmd":"./lgraph_server -c lgraph_standalone.json --directory ./testdb --license _FMA_IGNORE_LICENSE_CHECK_SALTED_ --port 7072 --rpc_port 9092",
             "cleanup_dir":["./testdb"]}

CLIENTOPT = {"host":"127.0.0.1:9092", "user":"admin", "password":"73@TuGraph"}

IMPORTOPT = {"cmd":"./lgraph_import --config_file ./data/yago/yago.conf --dir ./testdb --user admin --password 73@TuGraph --graph default --overwrite 1",
             "cleanup_dir":["./testdb", "./.import_tmp"]}

@pytest.mark.parametrize("importor", [IMPORTOPT], indirect=True)
@pytest.mark.parametrize("server", [SERVEROPT], indirect=True)
@pytest.mark.parametrize("client", [CLIENTOPT], indirect=True)
def test_flushdb(self, importor, server, client):
    ret = client.callCypher("MATCH (n) RETURN n LIMIT 100", "default")
    assert ret[0]
    res = json.loads(ret[1])
    assert len(res) == 21
    ret = client.callCypher("CALL db.flushDB()", "default")
    assert ret[0]
    res = json.loads(ret[1])
    assert res == None

2.3.3.exportor/importor

In the sample code, before test_export_default is executed, the offline data import is performed first. After the import succeeds, the data of the current db is exported, and the offline import is run again to load the exported data into a new directory. The db is then started on the newly imported data and a connection is created. In the body of test_export_default, the data after export and re-import is checked for consistency with the original data.

SERVEROPT = {"cmd":"./lgraph_server -c lgraph_standalone.json --directory ./testdb1 --license _FMA_IGNORE_LICENSE_CHECK_SALTED_ --port 7073 --rpc_port 9093",
             "cleanup_dir":["./testdb1"]}

CLIENTOPT = {"host":"127.0.0.1:9093", "user":"admin", "password":"73@TuGraph"}

IMPORT_YAGO_OPT = {"cmd":"./lgraph_import --config_file ./data/yago/yago.conf --dir ./testdb --user admin --password 73@TuGraph --graph default --overwrite 1",
             "cleanup_dir":["./.import_tmp", "./testdb"]}

IMPORT_DEF_OPT = {"cmd":"./lgraph_import -c ./export/default/import.config -d ./testdb1",
             "cleanup_dir":["./.import_tmp", "./testdb1"]}

EXPORT_DEF_OPT = {"cmd":"./lgraph_export -d ./testdb -e ./export/default -g default -u admin -p 73@TuGraph",
                  "cleanup_dir":["./export"]}

@pytest.mark.parametrize("importor", [IMPORT_YAGO_OPT], indirect=True)
@pytest.mark.parametrize("exportor", [EXPORT_DEF_OPT], indirect=True)
@pytest.mark.parametrize("importor_1", [IMPORT_DEF_OPT], indirect=True)
@pytest.mark.parametrize("server", [SERVEROPT], indirect=True)
@pytest.mark.parametrize("client", [CLIENTOPT], indirect=True)
def test_export_default(self, importor, exportor, importor_1, server, client):
    ret = client.callCypher("MATCH (n) RETURN n LIMIT 100", "default")
    assert ret[0]
    res = json.loads(ret[1])
    log.info("res : %s", res)
    assert len(res) == 21

2.3.4.Other Tests

For more test cases, please refer to the integration test code: https://github.com/TuGraph-db/tugraph-db/tree/master/test/integration