
Commit d1501bb (parent 37880be)

Add the script for automatic benchmarking

1 file changed: README.md (11 additions, 2 deletions)
@@ -95,15 +95,16 @@ If you experience runtime errors, indicating that the libpbc cannot be found in
```bash
echo $LD_LIBRARY_PATH
```
to ensure the path `/usr/local/lib` is in that environment variable. If it is missing and you encounter the corresponding runtime error, you may need to add it manually.
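If the path is missing, it can be appended as follows (a minimal sketch; adjust the path if libpbc was installed somewhere other than `/usr/local/lib`):

```shell
# Append /usr/local/lib to the dynamic linker's search path for the
# current shell session. Add this line to ~/.bashrc or ~/.profile to
# make the change persistent across sessions.
export LD_LIBRARY_PATH="/usr/local/lib:${LD_LIBRARY_PATH}"
```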

## Artifact Evaluation

### Parameters
As mentioned, the current implementation is a proof-of-concept prototype. To evaluate the proposed protocol, we also implement two test programs that generate synthetic datasets and run our proposed DSSE protocol over them.

#### Dataset Size
The source code of those test programs can be found in the root path of the project, namely `SDSSECQ.cpp` and `SDSSECQS.cpp`. The code in this repository inserts files with two keywords "Alice" and "Bob", deletes a specified number of the inserted files, and then executes a single-keyword query ("Alice") and a conjunctive query ("Alice" AND "Bob").
The parameters that specify the number of insertions/deletions are passed as program arguments when running those programs, as shown in the above section.

Besides, as the number of keyword-id pairs increases, a larger Bloom filter is needed to keep the XSet for conjunctive queries. Hence, `MAX_DB_SIZE`, `XSET_FP`, and `XSET_HASH` in `Util/CommonUtil.h` should be updated accordingly. Note that the current parameters `MAX_DB_SIZE=100000`, `XSET_FP=0.0000001`, and `XSET_HASH=20` can support conjunctive queries against a dataset with 100k keyword-id pairs with a false positive rate below 10^-7.
@@ -114,6 +115,14 @@ The current parameters are `GGM_FP=0.0001` and `HASH_SIZE=5`, which is designed

### Main Results and Claims

#### Automatic Benchmarking Script
We provide a script for automatic benchmarking. It can be executed with the following command under the `Data` folder:
```bash
./Evaluation
```
It automatically executes the required commands to reproduce the following results.

#### Main Result 1: Constant and Small Time For Insertion/Deletion
As shown in `Table 3`, once the parameters are fixed, the insertion and deletion time for each `(keyword, id)` pair is constant.
Please refer to `Experiment 1` to see how to reproduce the result.
