[Scummvm-git-logs] scummvm-sites integrity -> 96f19b176bf8c29ee1e6df13e391e7bdf264726f

sev- noreply at scummvm.org
Thu Nov 7 12:39:28 UTC 2024


This automated email contains information about 120 new commits which have been
pushed to the 'scummvm-sites' repo located at https://github.com/scummvm/scummvm-sites .

Summary:
309b045512 INTEGRITY: Create fileset.py, initiate the flask route
473890e490 INTEGRITY: Rewrite the fileset page in fileset.py
a2d623983c INTEGRITY: Rewrite db_functions.py, pagination.py,
5189cc2698 INTEGRITY: Implementation of db_functions.py
ab9096560f INTEGRITY: Implementation of user_fileset_functions.py
a16498927b INTEGRITY: Fix fileset page in fileset.py, fix create page func in pagination.py
bffdb5046f INTEGRITY: Add schema.py to generate db tables
47edb3085f INTEGRITY: Add validate page in fileset.py
7aa7b1f61c INTEGRITY: Add CSS file in static folder
e075d66cc7 INTEGRITY: Fix the rendering of pagination.py
46928b4fb3 INTEGRITY: Add page turning in pagination.py
0097a8d34c INTEGRITY: Add the dat loading function in dat_parser.py
51eb58f363 INTEGRITY: Add boundary check in dat_parser.py
566914a80d INTEGRITY: Add logs page in fileset.py
f3ab4dd402 INTEGRITY: Remove the commit of read operation
7ac0958b10 INTEGRITY: Fix create_log in db_functions
41541b38b4 INTEGRITY: Add index page
a4564b1515 INTEGRITY: Delete original php files
55526bee15 INTEGRITY: Fix db_insert func
6a37e435ac INTEGRITY: Add Megadata class
6c2277df48 INTEGRITY: Fix errors in dat_parser
5d2ebc4bcf INTEGRITY: Print num of pages
a5fd396eec INTEGRITY: Add overriding via command line
5f56ff3978 INTEGRITY: Print usage info
462cf99e4f INTEGRITY: Refactoring DB operations
0c00813179 INTEGRITY: Use calc_megakey
369fa216a0 INTEGRITY: Create merge button and merge page
c3c5c3c579 INTEGRITY: Fix merge page
0233dbc831 INTEGRITY: Fix SQL of the merge page
a855acefe3 INTEGRITY: Revert changes of user overriding
899f21b6e6 INTEGRITY: Fix Encoding of dat parser
0689208cdb INTEGRITY: Revert incorrect DB operations
71835b0ef6 INTEGRITY: Fix merge confirm and merge execute page
ef8785aca2 INTEGRITY: Add 'platform' and 'language' to megakey's calc
8b4c270456 INTEGRITY: Improve the query of fileset page
def33f824e INTEGRITY: Add more info when comparing at confirm page
0ddaa1bd98 INTEGRITY: Highlight difference in the data
f897ee2e33 INTEGRITY: Update metadata when megakey matching
962b6b900f INTEGRITY: Add argparse to dat parser
412301dc2f INTEGRITY: Fix auto merging
12ae44b1e8 INTEGRITY: Fix manual merging
a629f0f90e INTEGRITY: Overriding user via command line
e3ba290aa3 INTEGRITY: Add more info at select page
8d93163fb7 INTEGRITY: Remove redundant bar
b36e57aaea INTEGRITY: Fix the parser of scan dat
92d3dfff1c INTEGRITY: Fix the parser of scan dat
e3a7044a3f INTEGRITY: Handle the dups of scan
afdbbe867a INTEGRITY: Fix the missing bar of widetable
9c8fced591 INTEGRITY: Manual merge into full fileset
5a1fb63dc9 INTEGRITY: Add check to the topbar of "Files in the fileset"
d599894b86 INTEGRITY: Update more info while manual merging
5db72f3e21 INTEGRITY: Remove redundant caption
832914e809 INTEGRITY: Fix bugs when merging scan into detection
3f7fb0e688 INTEGRITY: Start the automatic merge for the scan (unfinished)
2a15a6d03d INTEGRITY: Handle file dups during automatic merging
9ab382695f INTEGRITY: Improve the regex
db34d079d4 INTEGRITY: Add skiplog option to the dat_parser
9b09e2dae5 INTEGRITY: Handle the automatic merge of scan
7e1c4f8ad6 INTEGRITY: Add clear.py for testing
3424b932a8 INTEGRITY: Handle special cases for dat_parser
1e2ba0fa57 INTEGRITY: Fix bugs of auto merging
f48eb94f4a INTEGRITY: Update the detection_type and detection when merging
19f667cb08 INTEGRITY: Add 'detection_type' column to 'file' table
e75afd9985 INTEGRITY: Clear the counters of DB
3f57f0990e INTEGRITY: Add --skiplog option to dat parser
56c370c8c7 INTEGRITY: Insert set.dat to DB
9d6b4ed5d7 INTEGRITY: Show candidates and add merge button to fileset page
7cab1a4d92 INTEGRITY: Add history func
06271a9ff6 INTEGRITY: Implement history table
c1fbcb85c4 INTEGRITY: Add hyperlinks to the log content
a01002860b INTEGRITY: Recursively query the fileset logs
e4e861d3d2 INTEGRITY: Fix duplicate count when displaying matched list
d97df31fe0 INTEGRITY: Add user data check
95bfa16077 INTEGRITY: Add user integrity check interface
5191685df5 INTEGRITY: Fix wrong structure in matched_dict
5dc4045edd INTEGRITY: Add extra_map and missing_map
c8c8c58da9 INTEGRITY: Improve the page of user upload
2e5388dfc3 INTEGRITY: Improve the user_integrity_check func
ee7bbb374a INTEGRITY: Enhance the user integrity check page rendering
cf57605913 INTEGRITY: Add timestamp and user_count column
d3bb6ae594 INTEGRITY: Insert the current time into the file table
e6c285484f INTEGRITY: Handle different scenarios of user uploads
1b4b65e1b2 INTEGRITY: Add user_count when matching with 'user'
641a78cc70 INTEGRITY: Implementation of validate page
c2423eb613 INTEGRITY: Change the relative path of mysql config
056a04bd72 INTEGRITY: Change the parameter passing to the user_insert_fileset
a052822916 INTEGRITY: Add user_count when handling duplicate inserts
ee42693336 INTEGRITY: Refactor the logic for user integrity check
b6657c2897 INTEGRITY: Improve the matching of user's json
ba18ccd1d5 INTEGRITY: Create a new fileset first before the matching
44b91b5d4c INTEGRITY: Add hyperlinks to fileset table and game table
7c4d879a41 INTEGRITY: Fix searching error in fileset search page
b61bc99352 INTEGRITY: Change apache2 conf
48299ac8b4 INTEGRITY: Return Unknown when no file matching
97e2371851 INTEGRITY: Delete redundant upload route
efa299e308 INTEGRITY: Redirect user_games_list to fileset_search page
56e5cbcc8e INTEGRITY: Insert metadata when inserting a user fileset
97aa67970a INTEGRITY: Add ready_for_review page
e37205a4d6 INTEGRITY: Add "mark as full" button at fileset page
d384584540 INTEGRITY: improve the matching between `set` and `detection`
1eb60bce71 INTEGRITY: Fix the calculation of result page
9dbb60ccd5 INTEGRITY: Fix the fileset_search page
8db2c9195c INTEGRITY: Fix bugs of matching
a4a67f111d INTEGRITY: Fix dups of log
cdb7c4c480 INTEGRITY: Highlight the detection checksums
09d4d06c77 INTEGRITY: Add sorting to the fileset details
337d92db26 INTEGRITY: Add checkbox next to each file
9e062fb227 INTEGRITY: Add checkbox next to each file
895b393b04 INTEGRITY: Fix bugs of widetable
7b1ffb090c INTEGRITY: Remove the delete button
ca0de3904e INTEGRITY: Fix the delete func of fileset
0e5b7c458f INTEGRITY: Delete original fileset after merging
572f8fd3d4 INTEGRITY: Change the text of widetable
d2a7f99c74 INTEGRITY: Improve the connection of history search
420c9cf6fa INTEGRITY: Update year in README
3f0f18d533 INTEGRITY: Add punycode column
066e615be5 INTEGRITY: Add punycode_need_encode func
5a09b11723 INTEGRITY: Add encode_punycode func
6a4f0e68ff INTEGRITY: Improve the check of non-ASCII
96f19b176b Merge branch 'InariInDream-integrity' into integrity


Commit: 309b0455120dd4568c467f645a24b3e11e4ebf28
    https://github.com/scummvm/scummvm-sites/commit/309b0455120dd4568c467f645a24b3e11e4ebf28
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:40:10+08:00

Commit Message:
INTEGRITY: Create fileset.py, initiate the flask route

Changed paths:
  A fileset.py


diff --git a/fileset.py b/fileset.py
new file mode 100644
index 0000000..870b22d
--- /dev/null
+++ b/fileset.py
@@ -0,0 +1,68 @@
+from flask import Flask, request, render_template, redirect, url_for
+import pymysql.cursors
+import json
+import re
+
+app = Flask(__name__)
+
+# Load MySQL credentials
+with open('mysql_config.json') as f:
+    mysql_cred = json.load(f)
+
+# Connect to the database
+connection = pymysql.connect(host=mysql_cred["servername"],
+                             user=mysql_cred["username"],
+                             password=mysql_cred["password"],
+                             db=mysql_cred["dbname"],
+                             charset='utf8mb4',
+                             cursorclass=pymysql.cursors.DictCursor,
+                             autocommit=False)
+
+@app.route('/fileset', methods=['GET', 'POST'])
+def fileset():
+    try:
+        with connection.cursor() as cursor:
+            # Check connection
+            cursor.execute("SELECT 1")
+            
+            # Get min and max id
+            cursor.execute("SELECT MIN(id) FROM fileset")
+            min_id = cursor.fetchone()['MIN(id)']
+            cursor.execute("SELECT MAX(id) FROM fileset")
+            max_id = cursor.fetchone()['MAX(id)']
+            
+            # Get id from GET parameters or use min_id
+            id = request.args.get('id', min_id)
+            id = max(min_id, min(int(id), max_id))
+            
+            # Check if id exists in fileset
+            cursor.execute(f"SELECT id FROM fileset WHERE id = {id}")
+            if cursor.rowcount == 0:
+                cursor.execute(f"SELECT fileset FROM history WHERE oldfileset = {id}")
+                id = cursor.fetchone()['fileset']
+            
+            # Get fileset details
+            cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
+            result = cursor.fetchone()
+            
+            # Get files in the fileset
+            cursor.execute(f"SELECT file.id, name, size, checksum, detection FROM file WHERE fileset = {id}")
+            files = cursor.fetchall()
+            
+            # Get history and logs
+            cursor.execute(f"SELECT `timestamp`, oldfileset, log FROM history WHERE fileset = {id} ORDER BY `timestamp`")
+            history = cursor.fetchall()
+            
+            cursor.execute(f"SELECT `timestamp`, category, `text`, id FROM log WHERE `text` REGEXP 'Fileset:{id}' ORDER BY `timestamp` DESC, id DESC")
+            logs = cursor.fetchall()
+            
+            # Commit the transaction
+            connection.commit()
+            
+            # Render the results in a template
+            return render_template('fileset.html', result=result, files=files, history=history, logs=logs)
+    finally:
+        connection.close()
+
+if __name__ == '__main__':
+    app.run()
\ No newline at end of file
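
Note: this first version interpolates the request id straight into its SQL via f-strings, which is open to SQL injection; the rewrite in the next commit moves to parameterized queries. A minimal sketch of that pattern, assuming the same pymysql DictCursor connection as above:

# pymysql escapes the bound value itself, so the id is never
# spliced into the SQL text as raw string data.
with connection.cursor() as cursor:
    cursor.execute("SELECT * FROM fileset WHERE id = %s", (id,))
    result = cursor.fetchone()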


Commit: 473890e4905f52970b3bab68e02b76c86c8df570
    https://github.com/scummvm/scummvm-sites/commit/473890e4905f52970b3bab68e02b76c86c8df570
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:40:20+08:00

Commit Message:
INTEGRITY: Rewrite the fileset page in fileset.py

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 870b22d..90b6852 100644
--- a/fileset.py
+++ b/fileset.py
@@ -1,68 +1,227 @@
-from flask import Flask, request, render_template, redirect, url_for
+from flask import Flask, request, render_template, redirect, url_for, render_template_string
 import pymysql.cursors
 import json
 import re
+import os
+from user_fileset_functions import user_calc_key, file_json_to_array, user_insert_queue, user_insert_fileset, match_and_merge_user_filesets
 
 app = Flask(__name__)
 
-# Load MySQL credentials
 with open('mysql_config.json') as f:
     mysql_cred = json.load(f)
 
-# Connect to the database
-connection = pymysql.connect(host=mysql_cred["servername"],
-                             user=mysql_cred["username"],
-                             password=mysql_cred["password"],
-                             db=mysql_cred["dbname"],
-                             charset='utf8mb4',
-                             cursorclass=pymysql.cursors.DictCursor,
-                             autocommit=False)
+conn = pymysql.connect(
+    host=mysql_cred["servername"],
+    user=mysql_cred["username"],
+    password=mysql_cred["password"],
+    db=mysql_cred["dbname"],
+    charset='utf8mb4',
+    cursorclass=pymysql.cursors.DictCursor,
+    autocommit=False
+)
 
 @app.route('/fileset', methods=['GET', 'POST'])
 def fileset():
-    try:
-        with connection.cursor() as cursor:
-            # Check connection
-            cursor.execute("SELECT 1")
-            
-            # Get min and max id
-            cursor.execute("SELECT MIN(id) FROM fileset")
-            min_id = cursor.fetchone()['MIN(id)']
-            cursor.execute("SELECT MAX(id) FROM fileset")
-            max_id = cursor.fetchone()['MAX(id)']
-            
-            # Get id from GET parameters or use min_id
-            id = request.args.get('id', min_id)
+    id = request.args.get('id')
+    with conn.cursor() as cursor:
+        cursor.execute("SELECT MIN(id) AS min_id FROM fileset")
+        min_id = cursor.fetchone()['min_id']
+        
+        if not id:
+            id = min_id
+        else:
+            cursor.execute("SELECT MAX(id) AS max_id FROM fileset")
+            max_id = cursor.fetchone()['max_id']
             id = max(min_id, min(int(id), max_id))
-            
-            # Check if id exists in fileset
-            cursor.execute(f"SELECT id FROM fileset WHERE id = {id}")
+            cursor.execute("SELECT id FROM fileset WHERE id = %s", (id,))
             if cursor.rowcount == 0:
-                cursor.execute(f"SELECT fileset FROM history WHERE oldfileset = {id}")
+                cursor.execute("SELECT fileset FROM history WHERE oldfileset = %s", (id,))
                 id = cursor.fetchone()['fileset']
-            
-            # Get fileset details
-            cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
-            result = cursor.fetchone()
-            
-            # Get files in the fileset
-            cursor.execute(f"SELECT file.id, name, size, checksum, detection FROM file WHERE fileset = {id}")
-            files = cursor.fetchall()
-            
-            # Get history and logs
-            cursor.execute(f"SELECT `timestamp`, oldfileset, log FROM history WHERE fileset = {id} ORDER BY `timestamp`")
-            history = cursor.fetchall()
-            
-            cursor.execute(f"SELECT `timestamp`, category, `text`, id FROM log WHERE `text` REGEXP 'Fileset:{id}' ORDER BY `timestamp` DESC, id DESC")
-            logs = cursor.fetchall()
-            
-            # Commit the transaction
-            connection.commit()
-            
-            # Render the results in a template
-            return render_template('fileset.html', result=result, files=files, history=history, logs=logs)
-    finally:
-        connection.close()
+
+        cursor.execute("SELECT * FROM fileset WHERE id = %s", (id,))
+        result = cursor.fetchone()
+
+        if result['game']:
+            cursor.execute("""
+                SELECT game.name AS 'game name', engineid, gameid, extra, platform, language
+                FROM fileset
+                JOIN game ON game.id = fileset.game
+                JOIN engine ON engine.id = game.engine
+                WHERE fileset.id = %s
+            """, (id,))
+            result.update(cursor.fetchone())
+        else:
+            result.pop('key', None)
+            result.pop('status', None)
+            result.pop('delete', None)
+
+        fileset_details = result
+
+        cursor.execute("SELECT file.id, name, size, checksum, detection FROM file WHERE fileset = %s", (id,))
+        files = cursor.fetchall()
+
+        if request.args.get('widetable') == 'true':
+            for file in files:
+                cursor.execute("SELECT checksum, checksize, checktype FROM filechecksum WHERE file = %s", (file['id'],))
+                checksums = cursor.fetchall()
+                for checksum in checksums:
+                    if checksum['checksize'] != 0:
+                        file[f"{checksum['checktype']}-{checksum['checksize']}"] = checksum['checksum']
+
+        cursor.execute("""
+            SELECT `timestamp`, oldfileset, log
+            FROM history
+            WHERE fileset = %s
+            ORDER BY `timestamp`
+        """, (id,))
+        history = cursor.fetchall()
+
+        cursor.execute("""
+            SELECT `timestamp`, category, `text`, id
+            FROM log
+            WHERE `text` REGEXP %s
+            ORDER BY `timestamp` DESC, id DESC
+        """, (f'Fileset:{id}',))
+        logs = cursor.fetchall()
+
+        for history_row in history:
+            cursor.execute("""
+                SELECT `timestamp`, category, `text`, id
+                FROM log
+                WHERE `text` REGEXP %s
+                AND `category` NOT REGEXP 'merge'
+                ORDER BY `timestamp` DESC, id DESC
+            """, (f'Fileset:{history_row["oldfileset"]}',))
+            logs.extend(cursor.fetchall())
+
+    if request.method == 'POST':
+        if 'delete' in request.form:
+            with conn.cursor() as cursor:
+                cursor.execute("UPDATE fileset SET `delete` = TRUE WHERE id = %s", (request.form['delete'],))
+                conn.commit()
+        if 'match' in request.form:
+            match_and_merge_user_filesets(request.form['match'])
+            return redirect(url_for('fileset', id=request.form['match']))
+
+    return render_template_string("""
+    <!DOCTYPE html>
+    <html>
+    <head>
+        <link rel="stylesheet" href="{{ stylesheet }}">
+        <script type="text/javascript" src="{{ jquery_file }}"></script>
+        <script type="text/javascript" src="{{ js_file }}"></script>
+    </head>
+    <body>
+        <h2><u>Fileset: {{ id }}</u></h2>
+        <h3>Fileset details</h3>
+        <table>
+            {% for key, value in fileset_details.items() %}
+                {% if key not in ['id', 'game'] %}
+                    <tr><th>{{ key }}</th><td>{{ value }}</td></tr>
+                {% endif %}
+            {% endfor %}
+        </table>
+        <h3>Files in the fileset</h3>
+        <form method="get">
+            {% for key, value in request.args.items() %}
+                {% if key != 'widetable' %}
+                    <input type="hidden" name="{{ key }}" value="{{ value }}">
+                {% endif %}
+            {% endfor %}
+            {% if request.args.get('widetable') == 'true' %}
+                <input type="hidden" name="widetable" value="false">
+                <input type="submit" value="Hide extra checksums">
+            {% else %}
+                <input type="hidden" name="widetable" value="true">
+                <input type="submit" value="Expand Table">
+            {% endif %}
+        </form>
+        <table>
+            {% if files %}
+                <tr>
+                    <th>#</th>
+                    {% for key in files[0].keys() %}
+                        {% if key != 'id' %}
+                            <th>{{ key }}</th>
+                        {% endif %}
+                    {% endfor %}
+                </tr>
+                {% for i, file in enumerate(files, 1) %}
+                    <tr>
+                        <td>{{ i }}</td>
+                        {% for key, value in file.items() %}
+                            {% if key != 'id' %}
+                                <td>{{ value }}</td>
+                            {% endif %}
+                        {% endfor %}
+                    </tr>
+                {% endfor %}
+            {% endif %}
+        </table>
+        <h3>Developer Actions</h3>
+        <form method="post">
+            <button type="submit" name="delete" value="{{ id }}">Mark Fileset for Deletion</button>
+            <button type="submit" name="match" value="{{ id }}">Match and Merge Fileset</button>
+        </form>
+        <h3>Fileset history</h3>
+        <table>
+            <tr>
+                <th>Timestamp</th>
+                <th>Category</th>
+                <th>Description</th>
+                <th>Log ID</th>
+            </tr>
+            {% for log in logs %}
+                <tr>
+                    <td>{{ log.timestamp }}</td>
+                    <td>{{ log.category }}</td>
+                    <td>{{ log.text }}</td>
+                    <td><a href="logs.php?id={{ log.id }}">{{ log.id }}</a></td>
+                </tr>
+            {% endfor %}
+        </table>
+    </body>
+    </html>
+    """, id=id, fileset_details=fileset_details, files=files, logs=logs, stylesheet='style.css', jquery_file='https://code.jquery.com/jquery-3.7.0.min.js', js_file='js_functions.js')
+
+
+def get_join_columns(table1, table2, mapping):
+    for primary, foreign in mapping.items():
+        primary = primary.split('.')
+        foreign = foreign.split('.')
+        if (primary[0] == table1 and foreign[0] == table2) or (primary[0] == table2 and foreign[0] == table1):
+            return f"{primary[0]}.{primary[1]} = {foreign[0]}.{foreign[1]}"
+    return "No primary-foreign key mapping provided. Filter is invalid"
+
+@app.route('/create_page', methods=['GET'])
+def create_page():
+    filename = 'filename'
+    results_per_page = 10
+    records_table = 'records_table'
+    select_query = 'select_query'
+    order = 'order'
+    filters = {}
+    mapping = {}
+
+    mysql_cred = json.load(open(os.path.join(os.path.dirname(__file__), '../mysql_config.json')))
+    connection = pymysql.connect(host=mysql_cred["servername"],
+                                 user=mysql_cred["username"],
+                                 password=mysql_cred["password"],
+                                 db=mysql_cred["dbname"],
+                                 charset='utf8mb4',
+                                 cursorclass=pymysql.cursors.DictCursor)
+
+    with connection.cursor() as cursor:
+        # TODO: Implement the logic to handle the GET parameters and construct the SQL query
+        # similar logic as the PHP code to handle the GET parameters, construct the SQL query, execute it and fetch the results
+        # ...
+        pass
+    
+    # TODO: Implement the logic to construct the HTML table and pagination elements
+    # similar logic as the PHP code to construct the HTML table and pagination elements
+    # ...
+
+    return render_template("fileset.html")
 
 if __name__ == '__main__':
     app.run()
\ No newline at end of file
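
The get_join_columns helper added above turns a "table.column" primary/foreign mapping into a JOIN condition. A hypothetical usage sketch (the mapping below is illustrative, not taken from the repo):

# Mapping format assumed by get_join_columns: primary key -> foreign key,
# both written as "table.column" strings.
mapping = {"fileset.id": "file.fileset"}

# The order of the two table arguments does not matter.
print(get_join_columns("fileset", "file", mapping))
# fileset.id = file.fileset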


Commit: a2d623983c7401b469295d43bbfef7a51c4412e4
    https://github.com/scummvm/scummvm-sites/commit/a2d623983c7401b469295d43bbfef7a51c4412e4
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:40:20+08:00

Commit Message:
INTEGRITY: Rewrite db_functions.py, pagination.py,
user_fileset_functions.py

Changed paths:
  A db_functions.py
  A pagination.py
  A user_fileset_functions.py


diff --git a/db_functions.py b/db_functions.py
new file mode 100644
index 0000000..b595dc8
--- /dev/null
+++ b/db_functions.py
@@ -0,0 +1,89 @@
+import pymysql
+import json
+
+def db_connect():
+    with open('mysql_config.json') as f:
+        mysql_cred = json.load(f)
+    
+    conn = pymysql.connect(
+        host=mysql_cred["servername"],
+        user=mysql_cred["username"],
+        password=mysql_cred["password"],
+        db=mysql_cred["dbname"],
+        charset='utf8mb4',
+        cursorclass=pymysql.cursors.DictCursor,
+        autocommit=False
+    )
+    
+    return conn
+
+def insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, ip):
+    query = """
+        INSERT INTO fileset (source, detection, `key`, megakey, `transaction`, log, ip)
+        VALUES (%s, %s, %s, %s, %s, %s, %s)
+    """
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute(query, (src, int(detection), key, megakey, transaction_id, log_text, ip))
+            conn.commit()
+            cursor.execute("SET @fileset_last = LAST_INSERT_ID()")
+        return True
+    except pymysql.MySQLError as e:
+        print(f"Insert fileset failed: {e}")
+        return False
+
+def insert_file(file, detection, src, conn):
+    query = """
+        INSERT INTO file (name, size, detection, source, fileset)
+        VALUES (%s, %s, %s, %s, @fileset_last)
+    """
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute(query, (file['name'], file['size'], int(detection), src))
+            conn.commit()
+            cursor.execute("SET @file_last = LAST_INSERT_ID()")
+        return True
+    except pymysql.MySQLError as e:
+        print(f"Insert file failed: {e}")
+        return False
+
+def insert_filechecksum(file, key, conn):
+    query = """
+        INSERT INTO filechecksum (file, checksum, checktype)
+        VALUES (@file_last, %s, %s)
+    """
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute(query, (file[key], key))
+            conn.commit()
+        return True
+    except pymysql.MySQLError as e:
+        print(f"Insert file checksum failed: {e}")
+        return False
+
+def find_matching_game(fileset):
+    # TODO: Implement logic to find matching game for a fileset
+    pass
+
+def merge_filesets(fileset1, fileset2):
+    # TODO: Implement logic to merge two filesets
+    pass
+
+def create_log(category, user, text, conn):
+    query = """
+        INSERT INTO log (category, user, text)
+        VALUES (%s, %s, %s)
+    """
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute(query, (category, user, text))
+            conn.commit()
+            cursor.execute("SET @log_last = LAST_INSERT_ID()")
+        return True
+    except pymysql.MySQLError as e:
+        print(f"Insert log failed: {e}")
+        return False
+
+def get_current_user():
+    # Implement logic to get current user
+    pass
\ No newline at end of file
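
The insert helpers above chain rows together through MySQL session variables (@fileset_last, @file_last) set from LAST_INSERT_ID(), so a child insert can reference its parent row without fetching the id back into Python. A sketch of the pattern, with illustrative column values:

with conn.cursor() as cursor:
    cursor.execute("INSERT INTO fileset (src) VALUES (%s)", ("dat",))
    # Remember the new fileset id inside this MySQL session
    cursor.execute("SET @fileset_last = LAST_INSERT_ID()")
    # The child row resolves @fileset_last entirely server-side
    cursor.execute("INSERT INTO file (name, fileset) VALUES (%s, @fileset_last)",
                   ("game.exe",))
conn.commit()
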
diff --git a/pagination.py b/pagination.py
new file mode 100644
index 0000000..e0ef77a
--- /dev/null
+++ b/pagination.py
@@ -0,0 +1,152 @@
+from urllib.parse import urlencode
+import pymysql
+import json
+import math
+from flask import Flask, request, render_template_string
+
+with open('mysql_config.json') as f:
+    mysql_cred = json.load(f)
+
+conn = pymysql.connect(
+    host=mysql_cred["servername"],
+    user=mysql_cred["username"],
+    password=mysql_cred["password"],
+    db=mysql_cred["dbname"],
+    charset='utf8mb4',
+    cursorclass=pymysql.cursors.DictCursor,
+    autocommit=False
+)
+
+
+def get_join_columns(table1, table2, mapping):
+    for primary, foreign in mapping.items():
+        primary = primary.split('.')
+        foreign = foreign.split('.')
+        if (primary[0] == table1 and foreign[0] == table2) or (primary[0] == table2 and foreign[0] == table1):
+            return f"{primary[0]}.{primary[1]} = {foreign[0]}.{foreign[1]}"
+    raise ValueError("No primary-foreign key mapping provided. Filter is invalid")
+
+def create_page(filename, results_per_page, records_table, select_query, order, filters=None, mapping=None):
+    if filters is None:
+        filters = {}
+    if mapping is None:
+        mapping = {}
+
+    with conn.cursor() as cursor:
+        # If there exist get variables that are for filtering
+        get_params = {k: v for k, v in request.args.items() if v}
+        if 'sort' in get_params:
+            column = get_params.pop('sort')
+            order = f"ORDER BY {column.split('-')[0]}"
+            if 'desc' in column:
+                order += " DESC"
+
+        tables = list(set(filters.values()))
+        condition = "WHERE " + " AND ".join([f"{filters[k]}.{k} REGEXP '{v}'" for k, v in get_params.items() if k != 'page']) if get_params else ""
+        
+        from_query = records_table
+        if len(tables) > 1 or (tables and tables[0] != records_table):
+            for table in tables:
+                if table != records_table:
+                    from_query += f" JOIN {table} ON {get_join_columns(records_table, table, mapping)}"
+
+        count_query = f"SELECT COUNT({records_table}.id) FROM {from_query} {condition}"
+        cursor.execute(count_query)
+        num_of_results = cursor.fetchone()['COUNT({records_table}.id)']
+        num_of_pages = math.ceil(num_of_results / results_per_page)
+
+        page = max(1, min(int(get_params.pop('page', 1)), num_of_pages))
+        offset = (page - 1) * results_per_page
+
+        query = f"{select_query} {condition} {order} LIMIT {results_per_page} OFFSET {offset}"
+        cursor.execute(query)
+        results = cursor.fetchall()
+
+    return render_template_string("""
+    <!DOCTYPE html>
+    <html>
+    <head>
+        <link rel="stylesheet" href="{{ stylesheet }}">
+        <script type="text/javascript" src="{{ jquery_file }}"></script>
+        <script type="text/javascript" src="{{ js_file }}"></script>
+    </head>
+    <body>
+        <form id='filters-form' method='GET' onsubmit='remove_empty_inputs()'>
+        <table>
+            {% if results %}
+                <tr class="filter">
+                    <td></td>
+                    {% for key in results[0].keys() %}
+                        {% if key in filters %}
+                            <td class="filter">
+                                <input type="text" class="filter" placeholder="{{ key }}" name="{{ key }}" value="{{ request.args.get(key, '') }}"/>
+                            </td>
+                        {% else %}
+                            <td class="filter"></td>
+                        {% endif %}
+                    {% endfor %}
+                </tr>
+                <tr class="filter">
+                    <td></td>
+                    <td class="filter"><input type="submit" value="Submit"></td>
+                </tr>
+                <tr>
+                    <th></th>
+                    {% for key in results[0].keys() %}
+                        {% if key != 'fileset' %}
+                            <th><a href="{{ url_for('create_page', **{**request.args, 'sort': key}) }}">{{ key }}</a></th>
+                        {% endif %}
+                    {% endfor %}
+                </tr>
+                {% for i, row in enumerate(results, start=offset+1) %}
+                    <tr>
+                        <td>{{ i }}</td>
+                        {% for key, value in row.items() %}
+                            {% if key != 'fileset' %}
+                                <td>{{ value }}</td>
+                            {% endif %}
+                        {% endfor %}
+                    </tr>
+                {% endfor %}
+            {% endif %}
+        </table>
+        </form>
+
+        <div class="pagination">
+            {% if num_of_pages > 1 %}
+                <form method="GET">
+                    {% for key, value in request.args.items() %}
+                        {% if key != 'page' %}
+                            <input type="hidden" name="{{ key }}" value="{{ value }}">
+                        {% endif %}
+                    {% endfor %}
+                    {% if page > 1 %}
+                        <a href="{{ url_for('create_page', **{**request.args, 'page': 1}) }}">❮❮</a>
+                        <a href="{{ url_for('create_page', **{**request.args, 'page': page-1}) }}">❮</a>
+                    {% endif %}
+                    {% if page - 2 > 1 %}
+                        <div class="more">...</div>
+                    {% endif %}
+                    {% for i in range(max(1, page-2), min(num_of_pages+1, page+3)) %}
+                        {% if i == page %}
+                            <a class="active" href="{{ url_for('create_page', **{**request.args, 'page': i}) }}">{{ i }}</a>
+                        {% else %}
+                            <a href="{{ url_for('create_page', **{**request.args, 'page': i}) }}">{{ i }}</a>
+                        {% endif %}
+                    {% endfor %}
+                    {% if page + 2 < num_of_pages %}
+                        <div class="more">...</div>
+                    {% endif %}
+                    {% if page < num_of_pages %}
+                        <a href="{{ url_for('create_page', **{**request.args, 'page': page+1}) }}">❯</a>
+                        <a href="{{ url_for('create_page', **{**request.args, 'page': num_of_pages}) }}">❯❯</a>
+                    {% endif %}
+                    <input type="text" name="page" placeholder="Page No">
+                    <input type="submit" value="Submit">
+                </form>
+            {% endif %}
+        </div>
+    </body>
+    </html>
+    """, results=results, filters=filters, request=request.args, offset=offset, num_of_pages=num_of_pages, page=page, filename=filename, stylesheet='style.css', jquery_file='https://code.jquery.com/jquery-3.7.0.min.js', js_file='js_functions.js')
+
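
Note: in create_page above, the count lookup key 'COUNT({records_table}.id)' is a plain string rather than an f-string, so it cannot match the column name the cursor actually returns; aliasing the aggregate avoids the problem. A sketch of the corrected lookup:

count_query = f"SELECT COUNT({records_table}.id) AS count FROM {from_query} {condition}"
cursor.execute(count_query)
num_of_results = cursor.fetchone()['count']
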
diff --git a/user_fileset_functions.py b/user_fileset_functions.py
new file mode 100644
index 0000000..6ff10f9
--- /dev/null
+++ b/user_fileset_functions.py
@@ -0,0 +1,128 @@
+import hashlib
+import time
+from db_functions import db_connect, insert_fileset, insert_file, insert_filechecksum, find_matching_game, merge_filesets, create_log, get_current_user
+
+def user_calc_key(user_fileset):
+    key_string = ""
+    for file in user_fileset:
+        for key, value in file.items():
+            if key != 'checksums':
+                key_string += ':' + str(value)
+                continue
+            for checksum_pair in value:
+                key_string += ':' + checksum_pair['checksum']
+    key_string = key_string.strip(':')
+    return hashlib.md5(key_string.encode()).hexdigest()
+
+def file_json_to_array(file_json_object):
+    res = {}
+    for key, value in file_json_object.items():
+        if key != 'checksums':
+            res[key] = value
+            continue
+        for checksum_pair in value:
+            res[checksum_pair['type']] = checksum_pair['checksum']
+    return res
+
+def user_insert_queue(user_fileset, conn):
+    query = "INSERT INTO queue (time, notes, fileset, ticketid, userid, commit) VALUES (%s, NULL, @fileset_last, NULL, NULL, NULL)"
+    with conn.cursor() as cursor:
+        cursor.execute(query, (int(time.time()),))
+    conn.commit()
+
+def user_insert_fileset(user_fileset, ip, conn):
+    src = 'user'
+    detection = False
+    key = ''
+    megakey = user_calc_key(user_fileset)
+    with conn.cursor() as cursor:
+        cursor.execute("SELECT MAX(`transaction`) FROM transactions")
+        transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
+        log_text = "from user submitted files"
+        cursor.execute("SET @fileset_time_last = %s", (int(time.time()),))
+        if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, ip):
+            for file in user_fileset:
+                file = file_json_to_array(file)
+                insert_file(file, detection, src, conn)
+                for key, value in file.items():
+                    if key not in ["name", "size"]:
+                        insert_filechecksum(file, key, conn)
+        cursor.execute("SELECT @fileset_last")
+        fileset_id = cursor.fetchone()['@fileset_last']
+    conn.commit()
+    return fileset_id
+
+def match_and_merge_user_filesets(id, conn):
+    with conn.cursor() as cursor:
+        cursor.execute("""
+            SELECT fileset.id, filechecksum.checksum, src, status
+            FROM fileset
+            JOIN file ON file.fileset = fileset.id
+            JOIN filechecksum ON file.id = filechecksum.file
+            WHERE status = 'user' AND fileset.id = %s
+        """, (id,))
+        unmatched_files = cursor.fetchall()
+
+    unmatched_filesets = []
+    cur_fileset = None
+    temp = []
+    for file in unmatched_files:
+        if cur_fileset is None or cur_fileset != file['id']:
+            if temp:
+                unmatched_filesets.append(temp)
+            cur_fileset = file['id']
+            temp = []
+        temp.append(file)
+    if temp:
+        unmatched_filesets.append(temp)
+
+    for fileset in unmatched_filesets:
+        matching_games = find_matching_game(fileset)
+        if len(matching_games) != 1:
+            continue
+        matched_game = matching_games[0]
+        status = 'fullmatch'
+        matched_game = {k: ("NULL" if v is None else v) for k, v in matched_game.items()}
+        category_text = f"Matched from {fileset[0]['src']}"
+        log_text = f"Matched game {matched_game['engineid']}: {matched_game['gameid']}-{matched_game['platform']}-{matched_game['language']} variant {matched_game['key']}. State {status}. Fileset:{fileset[0]['id']}."
+        query = """
+            UPDATE fileset
+            SET game = %s, status = %s, `key` = %s
+            WHERE id = %s
+        """
+        history_last = merge_filesets(matched_game["fileset"], fileset[0]['id'])
+        with conn.cursor() as cursor:
+            cursor.execute(query, (matched_game["id"], status, matched_game["key"], fileset[0]['id']))
+            user = 'cli:' + get_current_user()
+            create_log("Fileset merge", user, f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0]['id']}")
+            log_last = create_log(category_text, user, log_text)
+            cursor.execute("UPDATE history SET log = %s WHERE id = %s", (log_last, history_last))
+        conn.commit()
+
+def insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, ip):
+    
+    pass
+
+def insert_file(file, detection, src, conn):
+    
+    pass
+
+def insert_filechecksum(file, key, conn):
+    
+    pass
+
+def find_matching_game(fileset):
+    
+    pass
+
+def merge_filesets(fileset1, fileset2):
+    
+    pass
+
+def create_log(category, user, text):
+    
+    pass
+
+def get_current_user():
+    
+    pass
\ No newline at end of file
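
user_calc_key and file_json_to_array both flatten the submitted JSON by joining every value (and every checksum) with ':' separators. A small worked example, assuming the module above imports cleanly and that this is the JSON shape it expects:

from user_fileset_functions import user_calc_key, file_json_to_array

# One uploaded file entry: plain keys plus a 'checksums' list
user_fileset = [{
    "name": "game.exe",
    "size": 1024,
    "checksums": [{"type": "md5-5000", "checksum": "abcd1234"}],
}]

# Flattens to "game.exe:1024:abcd1234" and returns its md5 hex digest
print(user_calc_key(user_fileset))

# Lifts each checksum pair to a top-level key:
# {'name': 'game.exe', 'size': 1024, 'md5-5000': 'abcd1234'}
print(file_json_to_array(user_fileset[0]))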


Commit: 5189cc2698cf020a81c49be429003f29f3460e61
    https://github.com/scummvm/scummvm-sites/commit/5189cc2698cf020a81c49be429003f29f3460e61
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:40:20+08:00

Commit Message:
INTEGRITY: Implementation of db_functions.py

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index b595dc8..496d0c6 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -1,5 +1,10 @@
 import pymysql
 import json
+from collections import Counter
+import getpass
+import time
+import hashlib
+import os
 
 def db_connect():
     with open('mysql_config.json') as f:
@@ -17,73 +22,428 @@ def db_connect():
     
     return conn
 
-def insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, ip):
-    query = """
-        INSERT INTO fileset (source, detection, `key`, megakey, `transaction`, log, ip)
-        VALUES (%s, %s, %s, %s, %s, %s, %s)
-    """
-    try:
+def get_checksum_props(checkcode, checksum):
+    checksize = 0
+    checktype = checkcode
+
+    if '-' in checkcode:
+        exploded_checkcode = checkcode.split('-')
+        last = exploded_checkcode.pop()
+        if last == '1M' or last.isdigit():
+            checksize = last
+
+        checktype = '-'.join(exploded_checkcode)
+
+    # Detection entries have checktypes as part of the checksum prefix
+    if ':' in checksum:
+        prefix = checksum.split(':')[0]
+        checktype += "-" + prefix
+
+        checksum = checksum.split(':')[1]
+
+    return checksize, checktype, checksum
+
+def insert_game(engine_name, engineid, title, gameid, extra, platform, lang, conn):
+    # Set @engine_last if engine already present in table
+    exists = False
+    with conn.cursor() as cursor:
+        cursor.execute(f"SELECT id FROM engine WHERE engineid = '{engineid}'")
+        res = cursor.fetchone()
+        if res is not None:
+            exists = True
+            cursor.execute(f"SET @engine_last = '{res[0]}'")
+
+    # Insert into table if not present
+    if not exists:
         with conn.cursor() as cursor:
-            cursor.execute(query, (src, int(detection), key, megakey, transaction_id, log_text, ip))
-            conn.commit()
-            cursor.execute("SET @fileset_last = LAST_INSERT_ID()")
-        return True
-    except pymysql.MySQLError as e:
-        print(f"Insert fileset failed: {e}")
-        return False
+            cursor.execute(f"INSERT INTO engine (name, engineid) VALUES ('{pymysql.escape_string(engine_name)}', '{engineid}')")
+            cursor.execute("SET @engine_last = LAST_INSERT_ID()")
 
-def insert_file(file, detection, src, conn):
-    query = """
-        INSERT INTO file (name, size, detection, source, fileset)
-        VALUES (%s, %s, %s, %s, @fileset_last)
-    """
-    try:
+    # Insert into game
+    with conn.cursor() as cursor:
+        cursor.execute(f"INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES ('{pymysql.escape_string(title)}', @engine_last, '{gameid}', '{pymysql.escape_string(extra)}', '{platform}', '{lang}')")
+        cursor.execute("SET @game_last = LAST_INSERT_ID()")
+
+def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip=''):
+    status = "detection" if detection else src
+    game = "NULL"
+    key = "NULL" if key == "" else f"'{key}'"
+    megakey = "NULL" if megakey == "" else f"'{megakey}'"
+
+    if detection:
+        status = "detection"
+        game = "@game_last"
+
+    # Check if key/megakey already exists, if so, skip insertion (no quotes on purpose)
+    with conn.cursor() as cursor:
+        if detection:
+            cursor.execute(f"SELECT id FROM fileset WHERE `key` = {key}")
+        else:
+            cursor.execute(f"SELECT id FROM fileset WHERE megakey = {megakey}")
+
+        existing_entry = cursor.fetchone()
+
+    if existing_entry is not None:
+        existing_entry = existing_entry[0]
         with conn.cursor() as cursor:
-            cursor.execute(query, (file['name'], file['size'], int(detection), src))
-            conn.commit()
-            cursor.execute("SET @file_last = LAST_INSERT_ID()")
-        return True
-    except pymysql.MySQLError as e:
-        print(f"Insert file failed: {e}")
-        return False
+            cursor.execute(f"SET @fileset_last = {existing_entry}")
+
+        category_text = f"Uploaded from {src}"
+        log_text = f"Duplicate of Fileset:{existing_entry}, {log_text}"
+        if src == 'user':
+            log_text = f"Duplicate of Fileset:{existing_entry}, from user IP {ip}, {log_text}"
+
+        user = f'cli:{getpass.getuser()}'
+        create_log(pymysql.escape_string(category_text), user, pymysql.escape_string(log_text))
+
+        if not detection:
+            return False
 
-def insert_filechecksum(file, key, conn):
-    query = """
-        INSERT INTO filechecksum (file, checksum, checktype)
-        VALUES (@file_last, %s, %s)
-    """
-    try:
         with conn.cursor() as cursor:
-            cursor.execute(query, (file[key], key))
-            conn.commit()
-        return True
-    except pymysql.MySQLError as e:
-        print(f"Insert file checksum failed: {e}")
+            cursor.execute(f"UPDATE fileset SET `timestamp` = FROM_UNIXTIME(@fileset_time_last) WHERE id = {existing_entry}")
+            cursor.execute("UPDATE fileset SET status = 'detection' WHERE id = {existing_entry} AND status = 'obsolete'")
+            cursor.execute("DELETE FROM game WHERE id = @game_last")
         return False
 
-def find_matching_game(fileset):
-    # TODO: Implement logic to find matching game for a fileset
-    pass
+    # $game and $key should not be parsed as a mysql string, hence no quotes
+    query = f"INSERT INTO fileset (game, status, src, `key`, megakey, `timestamp`) VALUES ({game}, '{status}', '{src}', {key}, {megakey}, FROM_UNIXTIME(@fileset_time_last))"
+    with conn.cursor() as cursor:
+        cursor.execute(query)
+        cursor.execute("SET @fileset_last = LAST_INSERT_ID()")
+
+    category_text = f"Uploaded from {src}"
+    with conn.cursor() as cursor:
+        cursor.execute("SELECT @fileset_last")
+        fileset_last = cursor.fetchone()[0]
+
+    log_text = f"Created Fileset:{fileset_last}, {log_text}"
+    if src == 'user':
+        log_text = f"Created Fileset:{fileset_last}, from user IP {ip}, {log_text}"
+
+    user = f'cli:{getpass.getuser()}'
+    create_log(pymysql.escape_string(category_text), user, pymysql.escape_string(log_text))
+    with conn.cursor() as cursor:
+        cursor.execute(f"INSERT INTO transactions (`transaction`, fileset) VALUES ({transaction}, {fileset_last})")
+
+    return True
+
+def insert_file(file, detection, src, conn):
+    # Find full md5, or else use first checksum value
+    checksum = ""
+    checksize = 5000
+    if "md5" in file:
+        checksum = file["md5"]
+    else:
+        for key, value in file.items():
+            if "md5" in key:
+                checksize, checktype, checksum = get_checksum_props(key, value)
+                break
+
+    query = f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{pymysql.escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection})"
+    with conn.cursor() as cursor:
+        cursor.execute(query)
+
+    if detection:
+        with conn.cursor() as cursor:
+            cursor.execute(f"UPDATE fileset SET detection_size = {checksize} WHERE id = @fileset_last AND detection_size IS NULL")
+    with conn.cursor() as cursor:
+        cursor.execute("SET @file_last = LAST_INSERT_ID()")
+
+def insert_filechecksum(file, checktype, conn):
+    if checktype not in file:
+        return
+
+    checksum = file[checktype]
+    checksize, checktype, checksum = get_checksum_props(checktype, checksum)
+
+    query = f"INSERT INTO filechecksum (file, checksize, checktype, checksum) VALUES (@file_last, '{checksize}', '{checktype}', '{checksum}')"
+    with conn.cursor() as cursor:
+        cursor.execute(query)
+
+def delete_filesets(conn):
+    query = "DELETE FROM fileset WHERE `delete` = TRUE"
+    with conn.cursor() as cursor:
+        cursor.execute(query)
 
-def merge_filesets(fileset1, fileset2):
-    # TODO: Implement logic to merge two filesets
-    pass
 
 def create_log(category, user, text, conn):
-    query = """
-        INSERT INTO log (category, user, text)
-        VALUES (%s, %s, %s)
-    """
+    query = f"INSERT INTO log (`timestamp`, category, user, `text`) VALUES (FROM_UNIXTIME({int(time.time())}), '{pymysql.escape_string(category)}', '{pymysql.escape_string(user)}', '{pymysql.escape_string(text)}')"
+    with conn.cursor() as cursor:
+        cursor.execute(query)
+        cursor.execute("SELECT LAST_INSERT_ID()")
+        log_last = cursor.fetchone()[0]
+
     try:
+        conn.commit()
+    except:
+        print("Creating log failed")
+
+    return log_last
+
+def calc_key(fileset):
+    key_string = ""
+
+    for key, value in fileset.items():
+        if key in ['engineid', 'gameid', 'rom']:
+            continue
+        key_string += ':' + str(value)
+
+    files = fileset['rom']
+    for file in files:
+        for key, value in file.items():
+            key_string += ':' + str(value)
+
+    key_string = key_string.strip(':')
+    return hashlib.md5(key_string.encode()).hexdigest()
+
+def calc_megakey(files):
+    key_string = ""
+
+    for file in files:
+        for key, value in file.items():
+            key_string += ':' + str(value)
+
+    key_string = key_string.strip(':')
+    return hashlib.md5(key_string.encode()).hexdigest()
+
+def db_insert(data_arr):
+    header = data_arr[0]
+    game_data = data_arr[1]
+    resources = data_arr[2]
+    filepath = data_arr[3]
+
+    conn = db_connect()
+
+    author = header["author"]
+    version = header["version"]
+
+    src = "dat" if author not in ["scan", "scummvm"] else author
+
+    detection = (src == "scummvm")
+    status = "detection" if detection else src
+
+    conn.cursor().execute(f"SET @fileset_time_last = {int(time.time())}")
+
+    transaction_id = conn.cursor().execute("SELECT MAX(`transaction`) FROM transactions").fetchone()[0] + 1
+
+    category_text = f"Uploaded from {src}"
+    log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Transaction: {transaction_id}"
+
+    user = f'cli:{getpass.getuser()}'
+    create_log(pymysql.escape_string(conn, category_text), user, pymysql.escape_string(conn, log_text))
+
+    for fileset in game_data:
+        if detection:
+            engine_name = fileset["engine"]
+            engineid = fileset["sourcefile"]
+            gameid = fileset["name"]
+            title = fileset["title"]
+            extra = fileset["extra"]
+            platform = fileset["platform"]
+            lang = fileset["language"]
+
+            insert_game(engine_name, engineid, title, gameid, extra, platform, lang, conn)
+        elif src == "dat":
+            if 'romof' in fileset and fileset['romof'] in resources:
+                fileset["rom"] = fileset["rom"] + resources[fileset["romof"]]["rom"]
+
+        key = calc_key(fileset) if detection else ""
+        megakey = calc_megakey(fileset['rom']) if not detection else ""
+        log_text = f"size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'."
+
+        if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn):
+            for file in fileset["rom"]:
+                insert_file(file, detection, src, conn)
+                for key, value in file.items():
+                    if key not in ["name", "size"]:
+                        insert_filechecksum(file, key, conn)
+
+    if detection:
+        conn.cursor().execute("UPDATE fileset SET status = 'obsolete' WHERE `timestamp` != FROM_UNIXTIME(@fileset_time_last) AND status = 'detection'")
+
+    fileset_insertion_count = conn.cursor().execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}").fetchone()[0]
+    category_text = f"Uploaded from {src}"
+    log_text = f"Completed loading DAT file, filename '{filepath}', size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
+
+    if not conn.commit():
+        print("Inserting failed")
+    else:
+        user = f'cli:{getpass.getuser()}'
+        create_log(pymysql.escape_string(conn, category_text), user, pymysql.escape_string(conn, log_text))
+
+def compare_filesets(id1, id2, conn):
+    with conn.cursor() as cursor:
+        cursor.execute(f"SELECT name, size, checksum FROM file WHERE fileset = '{id1}'")
+        fileset1 = cursor.fetchall()
+        cursor.execute(f"SELECT name, size, checksum FROM file WHERE fileset = '{id2}'")
+        fileset2 = cursor.fetchall()
+
+    # Sort filesets on checksum
+    fileset1.sort(key=lambda x: x[2])
+    fileset2.sort(key=lambda x: x[2])
+
+    if len(fileset1) != len(fileset2):
+        return False
+
+    for i in range(len(fileset1)):
+        # If checksums do not match
+        if fileset1[i][2] != fileset2[i][2]:
+            return False
+
+    return True
+
+def status_to_match(status):
+    order = ["detection", "dat", "scan", "partialmatch", "fullmatch", "user"]
+    return order[:order.index(status)]
+
+def find_matching_game(game_files):
+    matching_games = []  # All matching games
+    matching_filesets = []  # All filesets containing one file from game_files
+    matches_count = 0  # Number of files with a matching detection entry
+
+    conn = db_connect()
+
+    for file in game_files:
+        checksum = file[1]
+
+        query = f"SELECT file.fileset as file_fileset FROM filechecksum JOIN file ON filechecksum.file = file.id WHERE filechecksum.checksum = '{checksum}' AND file.detection = TRUE"
         with conn.cursor() as cursor:
-            cursor.execute(query, (category, user, text))
+            cursor.execute(query)
+            records = cursor.fetchall()
+
+        # If file is not part of detection entries, skip it
+        if len(records) == 0:
+            continue
+
+        matches_count += 1
+        for record in records:
+            matching_filesets.append(record[0])
+
+    # Check if there is a fileset_id that is present in all results
+    for key, value in Counter(matching_filesets).items():
+        with conn.cursor() as cursor:
+            cursor.execute(f"SELECT COUNT(file.id) FROM file JOIN fileset ON file.fileset = fileset.id WHERE fileset.id = '{key}'")
+            count_files_in_fileset = cursor.fetchone()[0]
+
+        # We use < instead of != since one file may have more than one entry in the fileset
+        # We see this in Drascula English version, where one entry is duplicated
+        if value < matches_count or value < count_files_in_fileset:
+            continue
+
+        with conn.cursor() as cursor:
+            cursor.execute(f"SELECT engineid, game.id, gameid, platform, language, `key`, src, fileset.id as fileset FROM game JOIN fileset ON fileset.game = game.id JOIN engine ON engine.id = game.engine WHERE fileset.id = '{key}'")
+            records = cursor.fetchall()
+
+        matching_games.append(records[0])
+
+    if len(matching_games) != 1:
+        return matching_games
+
+    # Check the current fileset priority with that of the match
+    with conn.cursor() as cursor:
+        cursor.execute(f"SELECT id FROM fileset, ({query}) AS res WHERE id = file_fileset AND status IN ({', '.join(['%s']*len(game_files[3]))})", status_to_match(game_files[3]))
+        records = cursor.fetchall()
+
+    # If priority order is correct
+    if len(records) != 0:
+        return matching_games
+
+    if compare_filesets(matching_games[0]['fileset'], game_files[0][0], conn):
+        with conn.cursor() as cursor:
+            cursor.execute(f"UPDATE fileset SET `delete` = TRUE WHERE id = {game_files[0][0]}")
+        return []
+
+    return matching_games
+
+def merge_filesets(detection_id, dat_id):
+    conn = db_connect()
+
+    with conn.cursor() as cursor:
+        cursor.execute(f"SELECT DISTINCT(filechecksum.checksum), checksize, checktype FROM filechecksum JOIN file on file.id = filechecksum.file WHERE fileset = '{detection_id}'")
+        detection_files = cursor.fetchall()
+
+        for file in detection_files:
+            checksum = file[0]
+            checksize = file[1]
+            checktype = file[2]
+
+            cursor.execute(f"DELETE FROM file WHERE checksum = '{checksum}' AND fileset = {detection_id} LIMIT 1")
+
+            cursor.execute(f"UPDATE file JOIN filechecksum ON filechecksum.file = file.id SET detection = TRUE, checksize = {checksize}, checktype = '{checktype}' WHERE fileset = '{dat_id}' AND filechecksum.checksum = '{checksum}'")
+
+        cursor.execute(f"INSERT INTO history (`timestamp`, fileset, oldfileset) VALUES (FROM_UNIXTIME({int(time.time())}), {dat_id}, {detection_id})")
+        cursor.execute("SELECT LAST_INSERT_ID()")
+        history_last = cursor.fetchone()[0]
+
+        cursor.execute(f"UPDATE history SET fileset = {dat_id} WHERE fileset = {detection_id}")
+
+        cursor.execute(f"DELETE FROM fileset WHERE id = {detection_id}")
+
+        try:
             conn.commit()
-            cursor.execute("SET @log_last = LAST_INSERT_ID()")
-        return True
-    except pymysql.MySQLError as e:
-        print(f"Insert log failed: {e}")
-        return False
+        except:
+            print("Error merging filesets")
+
+    return history_last
+
+
+def populate_matching_games():
+    conn = db_connect()
+
+    # Getting unmatched filesets
+    unmatched_filesets = []
+
+    with conn.cursor() as cursor:
+        cursor.execute("SELECT fileset.id, filechecksum.checksum, src, status FROM fileset JOIN file ON file.fileset = fileset.id JOIN filechecksum ON file.id = filechecksum.file WHERE fileset.game IS NULL AND status != 'user'")
+        unmatched_files = cursor.fetchall()
+
+    # Splitting them into different filesets
+    i = 0
+    while i < len(unmatched_files):
+        cur_fileset = unmatched_files[i][0]
+        temp = []
+        while i < len(unmatched_files) and cur_fileset == unmatched_files[i][0]:
+            temp.append(unmatched_files[i])
+            i += 1
+        unmatched_filesets.append(temp)
+
+    for fileset in unmatched_filesets:
+        matching_games = find_matching_game(fileset)
+
+        if len(matching_games) != 1: # If there is no match/non-unique match
+            continue
+
+        matched_game = matching_games[0]
+
+        # Update status depending on $matched_game["src"] (dat -> partialmatch, scan -> fullmatch)
+        status = fileset[0][2]
+        if fileset[0][2] == "dat":
+            status = "partialmatch"
+        elif fileset[0][2] == "scan":
+            status = "fullmatch"
+
+        # Convert NULL values to string with value NULL for printing
+        matched_game = {k: 'NULL' if v is None else v for k, v in matched_game.items()}
+
+        category_text = f"Matched from {fileset[0][2]}"
+        log_text = f"Matched game {matched_game['engineid']}:\n{matched_game['gameid']}-{matched_game['platform']}-{matched_game['language']}\nvariant {matched_game['key']}. State {status}. Fileset:{fileset[0][0]}."
+
+        # Updating the fileset.game value to be $matched_game["id"]
+        query = f"UPDATE fileset SET game = {matched_game['id']}, status = '{status}', `key` = '{matched_game['key']}' WHERE id = {fileset[0][0]}"
+
+        history_last = merge_filesets(matched_game["fileset"], fileset[0][0])
+
+        # Reopen a cursor here: the one from the earlier `with` block is already closed
+        cursor = conn.cursor()
+        if cursor.execute(query):
+            user = f'cli:{getpass.getuser()}'
+
+            # Merge log
+            create_log("Fileset merge", user, pymysql.escape_string(conn, f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0][0]}"))
+
+            # Matching log
+            log_last = create_log(conn.escape_string(category_text), user, conn.escape_string(log_text))
+
+            # Add log id to the history table
+            cursor.execute(f"UPDATE history SET log = {log_last} WHERE id = {history_last}")
 
-def get_current_user():
-    # Implement logic to get current user
-    pass
\ No newline at end of file
+        try:
+            conn.commit()
+        except pymysql.Error:
+            print("Updating matched games failed")
\ No newline at end of file
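
The index-walking loops above group query rows that share a fileset id. A
sketch of the same grouping with itertools.groupby, assuming (as the loop
already does) that rows for the same fileset arrive adjacent to each other,
since groupby only merges neighbouring rows:

    from itertools import groupby

    def group_by_fileset(rows):
        # rows are (fileset_id, checksum, src, status) tuples, grouped on fileset_id
        return [list(group) for _, group in groupby(rows, key=lambda row: row[0])]

    # unmatched_filesets = group_by_fileset(unmatched_files)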


Commit: ab9096560f6bd4389f867269d6a033885691a771
    https://github.com/scummvm/scummvm-sites/commit/ab9096560f6bd4389f867269d6a033885691a771
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:40:20+08:00

Commit Message:
INTEGRITY: Implement of user_fileset_functions.py

Changed paths:
    user_fileset_functions.py


diff --git a/user_fileset_functions.py b/user_fileset_functions.py
index 6ff10f9..de4dfd3 100644
--- a/user_fileset_functions.py
+++ b/user_fileset_functions.py
@@ -1,6 +1,8 @@
 import hashlib
 import time
 from db_functions import db_connect, insert_fileset, insert_file, insert_filechecksum, find_matching_game, merge_filesets, create_log, get_current_user
+import getpass
+import pymysql
 
 def user_calc_key(user_fileset):
     key_string = ""
@@ -25,10 +27,11 @@ def file_json_to_array(file_json_object):
     return res
 
 def user_insert_queue(user_fileset, conn):
-    query = "INSERT INTO queue (time, notes, fileset, ticketid, userid, commit) VALUES (%s, NULL, @fileset_last, NULL, NULL, NULL)"
+    query = f"INSERT INTO queue (time, notes, fileset, ticketid, userid, commit) VALUES ({int(time.time())}, NULL, @fileset_last, NULL, NULL, NULL)"
+
     with conn.cursor() as cursor:
-        cursor.execute(query, (int(time.time()),))
-    conn.commit()
+        cursor.execute(query)
+        conn.commit()
 
 def user_insert_fileset(user_fileset, ip, conn):
     src = 'user'
@@ -52,7 +55,61 @@ def user_insert_fileset(user_fileset, ip, conn):
     conn.commit()
     return fileset_id
 
-def match_and_merge_user_filesets(id, conn):
+def match_and_merge_user_filesets(id):
+    conn = db_connect()
+
+    # Getting unmatched filesets
+    unmatched_filesets = []
+
+    with conn.cursor() as cursor:
+        cursor.execute(f"SELECT fileset.id, filechecksum.checksum, src, status FROM fileset JOIN file ON file.fileset = fileset.id JOIN filechecksum ON file.id = filechecksum.file WHERE status = 'user' AND fileset.id = {id}")
+        unmatched_files = cursor.fetchall()
+
+    # Splitting them into different filesets
+    i = 0
+    while i < len(unmatched_files):
+        cur_fileset = unmatched_files[i][0]
+        temp = []
+        while i < len(unmatched_files) and cur_fileset == unmatched_files[i][0]:
+            temp.append(unmatched_files[i])
+            i += 1
+        unmatched_filesets.append(temp)
+
+    for fileset in unmatched_filesets:
+        matching_games = find_matching_game(fileset)
+
+        if len(matching_games) != 1: # If there is no match/non-unique match
+            continue
+
+        matched_game = matching_games[0]
+
+        status = 'fullmatch'
+
+        # Convert None values to the string 'NULL' for printing
+        matched_game = {k: 'NULL' if v is None else v for k, v in matched_game.items()}
+
+        category_text = f"Matched from {fileset[0][2]}"
+        log_text = f"Matched game {matched_game['engineid']}:\n{matched_game['gameid']}-{matched_game['platform']}-{matched_game['language']}\nvariant {matched_game['key']}. State {status}. Fileset:{fileset[0][0]}."
+
+        # Update fileset.game to the matched game's id
+        query = f"UPDATE fileset SET game = {matched_game['id']}, status = '{status}', `key` = '{matched_game['key']}' WHERE id = {fileset[0][0]}"
+
+        history_last = merge_filesets(matched_game["fileset"], fileset[0][0])
+
+        # Reopen a cursor here: the one from the earlier `with` block is already closed
+        cursor = conn.cursor()
+        if cursor.execute(query):
+            user = f'cli:{getpass.getuser()}'
+
+            # Merge log
+            create_log("Fileset merge", user, pymysql.escape_string(conn, f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0][0]}"))
+
+            # Matching log
+            log_last = create_log(conn.escape_string(category_text), user, conn.escape_string(log_text))
+
+            # Add log id to the history table
+            cursor.execute(f"UPDATE history SET log = {log_last} WHERE id = {history_last}")
+
+        try:
+            conn.commit()
+        except pymysql.Error:
+            print("Updating matched games failed")
     with conn.cursor() as cursor:
         cursor.execute("""
             SELECT fileset.id, filechecksum.checksum, src, status
@@ -98,31 +155,3 @@ def match_and_merge_user_filesets(id, conn):
             log_last = create_log(category_text, user, log_text)
             cursor.execute("UPDATE history SET log = %s WHERE id = %s", (log_last, history_last))
         conn.commit()
-
-def insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, ip):
-    
-    pass
-
-def insert_file(file, detection, src, conn):
-    
-    pass
-
-def insert_filechecksum(file, key, conn):
-    
-    pass
-
-def find_matching_game(fileset):
-    
-    pass
-
-def merge_filesets(fileset1, fileset2):
-    
-    pass
-
-def create_log(category, user, text):
-    
-    pass
-
-def get_current_user():
-    
-    pass
\ No newline at end of file
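
The UPDATE above splices values straight into the SQL string. A minimal
parameterized sketch of the same statement, letting the driver bind the
values instead (pymysql uses %s placeholders):

    update_query = (
        "UPDATE fileset SET game = %s, status = %s, `key` = %s WHERE id = %s"
    )
    with conn.cursor() as cursor:
        # The driver escapes each bound value, so quoting bugs in keys or
        # checksums cannot break the statement
        cursor.execute(update_query,
                       (matched_game['id'], status, matched_game['key'], fileset[0][0]))
    conn.commit()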


Commit: a16498927bb779f3dd6567b72c5335457bdb7234
    https://github.com/scummvm/scummvm-sites/commit/a16498927bb779f3dd6567b72c5335457bdb7234
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:40:20+08:00

Commit Message:
INTEGRITY: Fix fileset page in fileset.py, fix create page func in pagination.py

Changed paths:
    fileset.py
    pagination.py
    user_fileset_functions.py


diff --git a/fileset.py b/fileset.py
index 90b6852..62b34d0 100644
--- a/fileset.py
+++ b/fileset.py
@@ -22,188 +22,13 @@ conn = pymysql.connect(
 
 @app.route('/fileset', methods=['GET', 'POST'])
 def fileset():
-    id = request.args.get('id')
-    with conn.cursor() as cursor:
-        cursor.execute("SELECT MIN(id) AS min_id FROM fileset")
-        min_id = cursor.fetchone()['min_id']
-        
-        if not id:
-            id = min_id
-        else:
-            cursor.execute("SELECT MAX(id) AS max_id FROM fileset")
-            max_id = cursor.fetchone()['max_id']
-            id = max(min_id, min(int(id), max_id))
-            cursor.execute("SELECT id FROM fileset WHERE id = %s", (id,))
-            if cursor.rowcount == 0:
-                cursor.execute("SELECT fileset FROM history WHERE oldfileset = %s", (id,))
-                id = cursor.fetchone()['fileset']
+    id = request.args.get('id', default=1, type=int)
+    widetable = request.args.get('widetable', default='false', type=str)
+    # Load MySQL credentials from a JSON file
+    with open('mysql_config.json') as f:
+        mysql_cred = json.load(f)
 
-        cursor.execute("SELECT * FROM fileset WHERE id = %s", (id,))
-        result = cursor.fetchone()
-
-        if result['game']:
-            cursor.execute("""
-                SELECT game.name AS 'game name', engineid, gameid, extra, platform, language
-                FROM fileset
-                JOIN game ON game.id = fileset.game
-                JOIN engine ON engine.id = game.engine
-                WHERE fileset.id = %s
-            """, (id,))
-            result.update(cursor.fetchone())
-        else:
-            result.pop('key', None)
-            result.pop('status', None)
-            result.pop('delete', None)
-
-        fileset_details = result
-
-        cursor.execute("SELECT file.id, name, size, checksum, detection FROM file WHERE fileset = %s", (id,))
-        files = cursor.fetchall()
-
-        if request.args.get('widetable') == 'true':
-            for file in files:
-                cursor.execute("SELECT checksum, checksize, checktype FROM filechecksum WHERE file = %s", (file['id'],))
-                checksums = cursor.fetchall()
-                for checksum in checksums:
-                    if checksum['checksize'] != 0:
-                        file[f"{checksum['checktype']}-{checksum['checksize']}"] = checksum['checksum']
-
-        cursor.execute("""
-            SELECT `timestamp`, oldfileset, log
-            FROM history
-            WHERE fileset = %s
-            ORDER BY `timestamp`
-        """, (id,))
-        history = cursor.fetchall()
-
-        cursor.execute("""
-            SELECT `timestamp`, category, `text`, id
-            FROM log
-            WHERE `text` REGEXP %s
-            ORDER BY `timestamp` DESC, id DESC
-        """, (f'Fileset:{id}',))
-        logs = cursor.fetchall()
-
-        for history_row in history:
-            cursor.execute("""
-                SELECT `timestamp`, category, `text`, id
-                FROM log
-                WHERE `text` REGEXP %s
-                AND `category` NOT REGEXP 'merge'
-                ORDER BY `timestamp` DESC, id DESC
-            """, (f'Fileset:{history_row["oldfileset"]}',))
-            logs.extend(cursor.fetchall())
-
-    if request.method == 'POST':
-        if 'delete' in request.form:
-            with conn.cursor() as cursor:
-                cursor.execute("UPDATE fileset SET `delete` = TRUE WHERE id = %s", (request.form['delete'],))
-                conn.commit()
-        if 'match' in request.form:
-            match_and_merge_user_filesets(request.form['match'])
-            return redirect(url_for('fileset', id=request.form['match']))
-
-    return render_template_string("""
-    <!DOCTYPE html>
-    <html>
-    <head>
-        <link rel="stylesheet" href="{{ stylesheet }}">
-        <script type="text/javascript" src="{{ jquery_file }}"></script>
-        <script type="text/javascript" src="{{ js_file }}"></script>
-    </head>
-    <body>
-        <h2><u>Fileset: {{ id }}</u></h2>
-        <h3>Fileset details</h3>
-        <table>
-            {% for key, value in fileset_details.items() %}
-                {% if key not in ['id', 'game'] %}
-                    <tr><th>{{ key }}</th><td>{{ value }}</td></tr>
-                {% endif %}
-            {% endfor %}
-        </table>
-        <h3>Files in the fileset</h3>
-        <form method="get">
-            {% for key, value in request.args.items() %}
-                {% if key != 'widetable' %}
-                    <input type="hidden" name="{{ key }}" value="{{ value }}">
-                {% endif %}
-            {% endfor %}
-            {% if request.args.get('widetable') == 'true' %}
-                <input type="hidden" name="widetable" value="false">
-                <input type="submit" value="Hide extra checksums">
-            {% else %}
-                <input type="hidden" name="widetable" value="true">
-                <input type="submit" value="Expand Table">
-            {% endif %}
-        </form>
-        <table>
-            {% if files %}
-                <tr>
-                    <th>#</th>
-                    {% for key in files[0].keys() %}
-                        {% if key != 'id' %}
-                            <th>{{ key }}</th>
-                        {% endif %}
-                    {% endfor %}
-                </tr>
-                {% for i, file in enumerate(files, 1) %}
-                    <tr>
-                        <td>{{ i }}</td>
-                        {% for key, value in file.items() %}
-                            {% if key != 'id' %}
-                                <td>{{ value }}</td>
-                            {% endif %}
-                        {% endfor %}
-                    </tr>
-                {% endfor %}
-            {% endif %}
-        </table>
-        <h3>Developer Actions</h3>
-        <form method="post">
-            <button type="submit" name="delete" value="{{ id }}">Mark Fileset for Deletion</button>
-            <button type="submit" name="match" value="{{ id }}">Match and Merge Fileset</button>
-        </form>
-        <h3>Fileset history</h3>
-        <table>
-            <tr>
-                <th>Timestamp</th>
-                <th>Category</th>
-                <th>Description</th>
-                <th>Log ID</th>
-            </tr>
-            {% for log in logs %}
-                <tr>
-                    <td>{{ log.timestamp }}</td>
-                    <td>{{ log.category }}</td>
-                    <td>{{ log.text }}</td>
-                    <td><a href="logs.php?id={{ log.id }}">{{ log.id }}</a></td>
-                </tr>
-            {% endfor %}
-        </table>
-    </body>
-    </html>
-    """, id=id, fileset_details=fileset_details, files=files, logs=logs, stylesheet='style.css', jquery_file='https://code.jquery.com/jquery-3.7.0.min.js', js_file='js_functions.js')
-
-
-def get_join_columns(table1, table2, mapping):
-    for primary, foreign in mapping.items():
-        primary = primary.split('.')
-        foreign = foreign.split('.')
-        if (primary[0] == table1 and foreign[0] == table2) or (primary[0] == table2 and foreign[0] == table1):
-            return f"{primary[0]}.{primary[1]} = {foreign[0]}.{foreign[1]}"
-    return "No primary-foreign key mapping provided. Filter is invalid"
-
-@app.route('/create_page', methods=['GET'])
-def create_page():
-    filename = 'filename'
-    results_per_page = 10
-    records_table = 'records_table'
-    select_query = 'select_query'
-    order = 'order'
-    filters = {}
-    mapping = {}
-
-    mysql_cred = json.load(open(os.path.join(os.path.dirname(__file__), '../mysql_config.json')))
+    # Create a connection to the MySQL server
     connection = pymysql.connect(host=mysql_cred["servername"],
                                  user=mysql_cred["username"],
                                  password=mysql_cred["password"],
@@ -211,17 +36,142 @@ def create_page():
                                  charset='utf8mb4',
                                  cursorclass=pymysql.cursors.DictCursor)
 
-    with connection.cursor() as cursor:
-        # TODO: Implement the logic to handle the GET parameters and construct the SQL query
-        # similar logic as the PHP code to handle the GET parameters, construct the SQL query, execute it and fetch the results
-        # ...
-        pass
-    
-    # TODO: Implement the logic to construct the HTML table and pagination elements
-    # similar logic as the PHP code to construct the HTML table and pagination elements
-    # ...
-
-    return render_template("fileset.html")
+    try:
+        with connection.cursor() as cursor:
+            # Get the minimum id from the fileset table
+            cursor.execute("SELECT MIN(id) FROM fileset")
+            min_id = cursor.fetchone()['MIN(id)']
+
+            # Get the id from the GET parameters, or use the minimum id if it's not provided
+            id = request.args.get('id', default=min_id, type=int)
+
+            # Get the maximum id from the fileset table
+            cursor.execute("SELECT MAX(id) FROM fileset")
+            max_id = cursor.fetchone()['MAX(id)']
+
+            # Ensure the id is between the minimum and maximum id
+            id = max(min_id, min(id, max_id))
+
+            # Check if the id exists in the fileset table
+            cursor.execute(f"SELECT id FROM fileset WHERE id = {id}")
+            if cursor.rowcount == 0:
+                # If the id doesn't exist, get a new id from the history table
+                cursor.execute(f"SELECT fileset FROM history WHERE oldfileset = {id}")
+                id = cursor.fetchone()['fileset']
+
+            # Get the history for the current id
+            cursor.execute(f"SELECT `timestamp`, oldfileset, log FROM history WHERE fileset = {id} ORDER BY `timestamp`")
+            history = cursor.fetchall()
+
+            # Display fileset details
+            html = f"<h2><u>Fileset: {id}</u></h2>"
+
+            cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
+            result = cursor.fetchone()
+
+            html += "<h3>Fileset details</h3>"
+            html += "<table>\n"
+            if result['game']:
+                cursor.execute(f"SELECT game.name as 'game name', engineid, gameid, extra, platform, language FROM fileset JOIN game ON game.id = fileset.game JOIN engine ON engine.id = game.engine WHERE fileset.id = {id}")
+                result = {**result, **cursor.fetchone()}
+            else:
+                result.pop('key', None)
+                result.pop('status', None)
+                result.pop('delete', None)
+
+            for column in result.keys():
+                if column != 'id' and column != 'game':
+                    html += f"<th>{column}</th>\n"
+
+            html += "<tr>\n"
+            for column, value in result.items():
+                if column != 'id' and column != 'game':
+                    html += f"<td>{value}</td>"
+            html += "</tr>\n"
+            html += "</table>\n"
+
+            # Files in the fileset
+            html += "<h3>Files in the fileset</h3>"
+            html += "<form>"
+            for k, v in request.args.items():
+                if k != 'widetable':
+                    html += f"<input type='hidden' name='{k}' value='{v}'>"
+            if widetable == 'true':
+                html += "<input class='hidden' type='text' name='widetable' value='false' />"
+                html += "<input type='submit' value='Hide extra checksums' />"
+            else:
+                html += "<input class='hidden' type='text' name='widetable' value='true' />"
+                html += "<input type='submit' value='Expand Table' />"
+            html += "</form>"
+
+            # Table
+            html += "<table>\n"
+
+            cursor.execute(f"SELECT file.id, name, size, checksum, detection FROM file WHERE fileset = {id}")
+            result = cursor.fetchall()
+
+            if widetable == 'true':
+                for index, file in enumerate(result):
+                    cursor.execute(f"SELECT checksum, checksize, checktype FROM filechecksum WHERE file = {file['id']}")
+                    while True:
+                        spec_checksum = cursor.fetchone()
+                        if spec_checksum is None:
+                            break
+                        if spec_checksum['checksize'] == 0:
+                            continue
+                        result[index][f"{spec_checksum['checktype']}-{spec_checksum['checksize']}"] = spec_checksum['checksum']
+
+            counter = 1
+            for row in result:
+                if counter == 1:
+                    html += "<th/>\n" # Numbering column
+                    for key in row.keys():
+                        if key != 'id':
+                            html += f"<th>{key}</th>\n"
+                html += "<tr>\n"
+                html += f"<td>{counter}.</td>\n"
+                for key, value in row.items():
+                    if key != 'id':
+                        html += f"<td>{value}</td>\n"
+                html += "</tr>\n"
+                counter += 1
+            html += "</table>\n"
+
+            # Generate the HTML for the developer actions
+            html += "<h3>Developer Actions</h3>"
+            html += f"<button id='delete-button' type='button' onclick='delete_id({id})'>Mark Fileset for Deletion</button>"
+            html += f"<button id='match-button' type='button' onclick='match_id({id})'>Match and Merge Fileset</button>"
+
+            if 'delete' in request.form:
+                cursor.execute(f"UPDATE fileset SET `delete` = TRUE WHERE id = {request.form['delete']}")
+                connection.commit()
+                html += "<p id='delete-confirm'>Fileset marked for deletion</p>"
+
+            if 'match' in request.form:
+                match_and_merge_user_filesets(request.form['match'])
+                return redirect(url_for('fileset', id=request.form['match']))
+
+            # Generate the HTML for the fileset history
+            cursor.execute(f"SELECT `timestamp`, category, `text`, id FROM log WHERE `text` REGEXP 'Fileset:{id}' ORDER BY `timestamp` DESC, id DESC")
+            logs = cursor.fetchall()
+
+            html += "<h3>Fileset history</h3>"
+            html += "<table>\n"
+            html += "<th>Timestamp</th>\n"
+            html += "<th>Category</th>\n"
+            html += "<th>Description</th>\n"
+            html += "<th>Log ID</th>\n"
+            for log in logs:
+                html += "<tr>\n"
+                html += f"<td>{log['timestamp']}</td>\n"
+                html += f"<td>{log['category']}</td>\n"
+                html += f"<td>{log['text']}</td>\n"
+                html += f"<td><a href='logs.php?id={log['id']}'>{log['id']}</a></td>\n"
+                html += "</tr>\n"
+            html += "</table>\n"
+            return render_template_string(html)
+    finally:
+        connection.close()
 
 if __name__ == '__main__':
     app.run()
\ No newline at end of file
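
The route above assembles its page by string concatenation, so any database
value containing '<' or '&' would be interpreted as markup. A small sketch of
escaping cell values with the standard library before embedding them (cell()
is a hypothetical helper, not part of the code above):

    import html

    def cell(value):
        # Escape a raw database value before wrapping it in <td> tags
        return "<td>{}</td>\n".format(html.escape(str(value)))

    # row_html = "".join(cell(v) for k, v in row.items() if k != 'id')
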
diff --git a/pagination.py b/pagination.py
index e0ef77a..f5adcb0 100644
--- a/pagination.py
+++ b/pagination.py
@@ -1,8 +1,19 @@
 from urllib.parse import urlencode
 import pymysql
 import json
-import math
 from flask import Flask, request, render_template_string
+import os
+import re
+from math import ceil
+import html
+
+stylesheet = 'style.css'
+jquery_file = 'https://code.jquery.com/jquery-3.7.0.min.js'
+js_file = 'js_functions.js'
+print(f"<link rel='stylesheet' href='{stylesheet}'>\n")
+print(f"<script type='text/javascript' src='{jquery_file}'></script>\n")
+print(f"<script type='text/javascript' src='{js_file}'></script>\n")
+
 
 with open('mysql_config.json') as f:
     mysql_cred = json.load(f)
@@ -26,127 +37,202 @@ def get_join_columns(table1, table2, mapping):
             return f"{primary[0]}.{primary[1]} = {foreign[0]}.{foreign[1]}"
     raise ValueError("No primary-foreign key mapping provided. Filter is invalid")
 
-def create_page(filename, results_per_page, records_table, select_query, order, filters=None, mapping=None):
-    if filters is None:
-        filters = {}
-    if mapping is None:
-        mapping = {}
-
-    with conn.cursor() as cursor:
-        # If there exist get variables that are for filtering
-        get_params = {k: v for k, v in request.args.items() if v}
-        if 'sort' in get_params:
-            column = get_params.pop('sort')
-            order = f"ORDER BY {column.split('-')[0]}"
-            if 'desc' in column:
-                order += " DESC"
-
-        tables = list(set(filters.values()))
-        condition = "WHERE " + " AND ".join([f"{filters[k]}.{k} REGEXP '{v}'" for k, v in get_params.items() if k != 'page']) if get_params else ""
-        
+def create_page(filename, results_per_page, records_table, select_query, order, filters = {}, mapping = {}):
+    with open(os.path.join(os.path.dirname(__file__), '../mysql_config.json')) as f:
+        mysql_cred = json.load(f)
+
+    conn = pymysql.connect(
+        host=mysql_cred["servername"],
+        user=mysql_cred["username"],
+        password=mysql_cred["password"],
+        db=mysql_cred["dbname"],
+        charset='utf8mb4',
+        cursorclass=pymysql.cursors.DictCursor,
+        autocommit=False
+    )
+
+    # Check connection
+    if not conn.open:
+        print("Connect failed.")
+        return
+
+    # Collect the non-empty GET parameters used for filtering
+    get = {k: v for k, v in request.args.items() if v != ''}
+    if 'sort' in get:
+        column = get['sort'].split('-')
+        order = "ORDER BY {}".format(column[0])
+
+        if 'desc' in get['sort']:
+            order += " DESC"
+
+    if set(get.keys()) - set(['page', 'sort']):
+        condition = "WHERE "
+        tables = []
+        for key, value in get.items():
+            # Skip paging/sorting params and anything without a filter mapping
+            if key in ['page', 'sort'] or key not in filters:
+                continue
+
+            tables.append(filters[key])
+            condition += " AND {}.{} REGEXP '{}'".format(filters[key], key, value) if condition != "WHERE " else "{}.{} REGEXP '{}'".format(filters[key], key, value)
+        if condition == "WHERE ":
+            condition = ""
+
+        # If more than one table is to be searched
         from_query = records_table
-        if len(tables) > 1 or (tables and tables[0] != records_table):
-            for table in tables:
-                if table != records_table:
-                    from_query += f" JOIN {table} ON {get_join_columns(records_table, table, mapping)}"
-
-        count_query = f"SELECT COUNT({records_table}.id) FROM {from_query} {condition}"
-        cursor.execute(count_query)
-        num_of_results = cursor.fetchone()['COUNT({records_table}.id)']
-        num_of_pages = math.ceil(num_of_results / results_per_page)
-
-        page = max(1, min(int(get_params.pop('page', 1)), num_of_pages))
-        offset = (page - 1) * results_per_page
-
-        query = f"{select_query} {condition} {order} LIMIT {results_per_page} OFFSET {offset}"
-        cursor.execute(query)
-        results = cursor.fetchall()
-
-    return render_template_string("""
-    <!DOCTYPE html>
-    <html>
-    <head>
-        <link rel="stylesheet" href="{{ stylesheet }}">
-        <script type="text/javascript" src="{{ jquery_file }}"></script>
-        <script type="text/javascript" src="{{ js_file }}"></script>
-    </head>
-    <body>
-        <form id='filters-form' method='GET' onsubmit='remove_empty_inputs()'>
-        <table>
-            {% if results %}
-                <tr class="filter">
-                    <td></td>
-                    {% for key in results[0].keys() %}
-                        {% if key in filters %}
-                            <td class="filter">
-                                <input type="text" class="filter" placeholder="{{ key }}" name="{{ key }}" value="{{ request.args.get(key, '') }}"/>
-                            </td>
-                        {% else %}
-                            <td class="filter"></td>
-                        {% endif %}
-                    {% endfor %}
-                </tr>
-                <tr class="filter">
-                    <td></td>
-                    <td class="filter"><input type="submit" value="Submit"></td>
-                </tr>
-                <tr>
-                    <th></th>
-                    {% for key in results[0].keys() %}
-                        {% if key != 'fileset' %}
-                            <th><a href="{{ url_for('create_page', **{**request.args, 'sort': key}) }}">{{ key }}</a></th>
-                        {% endif %}
-                    {% endfor %}
-                </tr>
-                {% for i, row in enumerate(results, start=offset+1) %}
-                    <tr>
-                        <td>{{ i }}</td>
-                        {% for key, value in row.items() %}
-                            {% if key != 'fileset' %}
-                                <td>{{ value }}</td>
-                            {% endif %}
-                        {% endfor %}
-                    </tr>
-                {% endfor %}
-            {% endif %}
-        </table>
-        </form>
-
-        <div class="pagination">
-            {% if num_of_pages > 1 %}
-                <form method="GET">
-                    {% for key, value in request.args.items() %}
-                        {% if key != 'page' %}
-                            <input type="hidden" name="{{ key }}" value="{{ value }}">
-                        {% endif %}
-                    {% endfor %}
-                    {% if page > 1 %}
-                        <a href="{{ url_for('create_page', **{**request.args, 'page': 1}) }}">❮❮</a>
-                        <a href="{{ url_for('create_page', **{**request.args, 'page': page-1}) }}">❮</a>
-                    {% endif %}
-                    {% if page - 2 > 1 %}
-                        <div class="more">...</div>
-                    {% endif %}
-                    {% for i in range(max(1, page-2), min(num_of_pages+1, page+3)) %}
-                        {% if i == page %}
-                            <a class="active" href="{{ url_for('create_page', **{**request.args, 'page': i}) }}">{{ i }}</a>
-                        {% else %}
-                            <a href="{{ url_for('create_page', **{**request.args, 'page': i}) }}">{{ i }}</a>
-                        {% endif %}
-                    {% endfor %}
-                    {% if page + 2 < num_of_pages %}
-                        <div class="more">...</div>
-                    {% endif %}
-                    {% if page < num_of_pages %}
-                        <a href="{{ url_for('create_page', **{**request.args, 'page': page+1}) }}">❯</a>
-                        <a href="{{ url_for('create_page', **{**request.args, 'page': num_of_pages}) }}">❯❯</a>
-                    {% endif %}
-                    <input type="text" name="page" placeholder="Page No">
-                    <input type="submit" value="Submit">
-                </form>
-            {% endif %}
-        </div>
-    </body>
-    </html>
-    """, results=results, filters=filters, request=request.args, offset=offset, num_of_pages=num_of_pages, page=page, filename=filename, stylesheet='style.css', jquery_file='https://code.jquery.com/jquery-3.7.0.min.js', js_file='js_functions.js')
+        if tables and (len(tables) > 1 or tables[0] != records_table):
+            for i in range(len(tables)):
+                if tables[i] == records_table:
+                    continue
+
+                from_query += " JOIN {} ON {}".format(tables[i], get_join_columns(records_table, tables[i], mapping))
+
+        cursor = conn.cursor()
+        cursor.execute("SELECT COUNT({}.id) FROM {} {}".format(records_table, from_query, condition))
+        num_of_results = cursor.fetchone()[0]
+    # If records_table itself contains a JOIN (multiple tables)
+    elif re.search("JOIN", records_table):
+        first_table = records_table.split(" ")[0]
+        cursor = conn.cursor()
+        cursor.execute("SELECT COUNT({}.id) FROM {}".format(first_table, records_table))
+        num_of_results = cursor.fetchone()[0]
+    else:
+        cursor = conn.cursor()
+        cursor.execute("SELECT COUNT(id) FROM {}".format(records_table))
+        num_of_results = cursor.fetchone()[0]
+    num_of_pages = ceil(num_of_results / results_per_page)
+    if num_of_results == 0:
+        print("No results for given filters")
+        return
+
+    if 'page' not in get:
+        page = 1
+    else:
+        page = max(1, min(int(get['page']), num_of_pages))
+
+    offset = (page - 1) * results_per_page
+
+    # Rebuild the filter condition, escaping values before they reach the query
+    if set(get.keys()) - set(['page']):
+        condition = "WHERE "
+        for key, value in get.items():
+            value = conn.escape_string(value)
+            if key not in filters:
+                continue
+
+            condition += "AND {}.{} REGEXP '{}'".format(filters[key], key, value) if condition != "WHERE " else "{}.{} REGEXP '{}'".format(filters[key], key, value)
+        if condition == "WHERE ":
+            condition = ""
+
+        query = "{} {} {} LIMIT {} OFFSET {}".format(select_query, condition, order, results_per_page, offset)
+    else:
+        query = "{} {} LIMIT {} OFFSET {}".format(select_query, order, results_per_page, offset)
+    cursor = conn.cursor()
+    cursor.execute(query)
+
+    # Table
+    print("<form id='filters-form' method='GET' onsubmit='remove_empty_inputs()'>")
+    print("<table>")
+
+    counter = offset + 1
+    for row in cursor.fetchall():
+        if counter == offset + 1: # If it is the first run of the loop
+            if len(filters) > 0:
+                print("<tr class=filter><td></td>")
+                for key in row.keys():
+                    if key not in filters:
+                        print("<td class=filter />")
+                        continue
+
+                    # Filter textbox
+                    filter_value = get[key] if key in get else ""
+
+                    print("<td class=filter><input type=text class=filter placeholder='{}' name='{}' value='{}'/></td>".format(key, key, filter_value))
+                print("</tr>")
+                print("<tr class=filter><td></td><td class=filter><input type=submit value='Submit'></td></tr>")
+
+            print("<th/>") # Numbering column
+            for key in row.keys():
+                if key == 'fileset':
+                    continue
+
+                # Preserve GET variables
+                vars = ""
+                for k, v in get.items():
+                    if k == 'sort' and v == key:
+                        vars += "&{}={}-desc".format(k, v)
+                    elif k != 'sort':
+                        vars += "&{}={}".format(k, v)
+
+                if "&sort={}".format(key) not in vars:
+                    print("<th><a href='{}?{}&sort={}'>{}</th>".format(filename, vars, key, key))
+                else:
+                    print("<th><a href='{}?{}'>{}</th>".format(filename, vars, key))
+
+        if filename in ['games_list.php', 'user_games_list.php']:
+            print("<tr class=games_list onclick='hyperlink(\"fileset.php?id={}\")'>".format(row['fileset']))
+        else:
+            print("<tr>")
+        print("<td>{}.</td>".format(counter))
+        for key, value in row.items():
+            if key == 'fileset':
+                continue
+
+            # Add links to fileset in logs table
+            value = str(value)
+            matches = re.search(r"Fileset:(\d+)", value)
+            if matches:
+                value = value[:matches.start()] + "<a href='fileset.php?id={}'>{}</a>".format(matches.group(1), matches.group(0)) + value[matches.end():]
+
+            print("<td>{}</td>".format(value))
+        print("</tr>")
+
+        counter += 1
+
+    print("</table>")
+    print("</form>")
+
+    # Preserve GET variables
+    vars = ""
+    for key, value in get.items():
+        if key == 'page':
+            continue
+        vars += "&{}={}".format(key, value)
+
+    # Navigation elements
+    if num_of_pages > 1:
+        print("<form method='GET'>")
+
+        # Preserve GET variables on form submit
+        for key, value in get.items():
+            if key == 'page':
+                continue
+
+            key = html.escape(key)
+            value = html.escape(value)
+            if value != "":
+                print("<input type='hidden' name='{}' value='{}'>".format(key, value))
+
+        print("<div class=pagination>")
+        if page > 1:
+            print("<a href={}{}>❮❮</a>".format(filename, vars))
+            print("<a href={}page={}{}>❮</a>".format(filename, page - 1, vars))
+        if page - 2 > 1:
+            print("<div class=more>...</div>")
+
+        for i in range(page - 2, page + 3):
+            if i >= 1 and i <= num_of_pages:
+                if i == page:
+                    print("<a class=active href={}page={}{}>{}</a>".format(filename, i, vars, i))
+                else:
+                    print("<a href={}page={}{}>{}</a>".format(filename, i, vars, i))
+
+        if page + 2 < num_of_pages:
+            print("<div class=more>...</div>")
+        if page < num_of_pages:
+            print("<a href={}page={}{}>❯</a>".format(filename, page + 1, vars))
+            print("<a href={}page={}{}>❯❯</a>".format(filename, num_of_pages, vars))
+
+        print("<input type='text' name='page' placeholder='Page No'>")
+        print("<input type='submit' value='Submit'>")
+        print("</div>")
 
+        print("</form>")
diff --git a/user_fileset_functions.py b/user_fileset_functions.py
index de4dfd3..88d4a83 100644
--- a/user_fileset_functions.py
+++ b/user_fileset_functions.py
@@ -1,6 +1,6 @@
 import hashlib
 import time
-from db_functions import db_connect, insert_fileset, insert_file, insert_filechecksum, find_matching_game, merge_filesets, create_log, get_current_user
+from db_functions import db_connect, insert_fileset, insert_file, insert_filechecksum, find_matching_game, merge_filesets, create_log
 import getpass
 import pymysql
 
@@ -150,7 +150,7 @@ def match_and_merge_user_filesets(id):
         history_last = merge_filesets(matched_game["fileset"], fileset[0]['id'])
         with conn.cursor() as cursor:
             cursor.execute(query, (matched_game["id"], status, matched_game["key"], fileset[0]['id']))
-            user = 'cli:' + get_current_user()
+            user = 'cli:' + getpass.getuser()
             create_log("Fileset merge", user, f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0]['id']}")
             log_last = create_log(category_text, user, log_text)
             cursor.execute("UPDATE history SET log = %s WHERE id = %s", (log_last, history_last))


Commit: bffdb5046fee908ecefe40e231184c4be40797f2
    https://github.com/scummvm/scummvm-sites/commit/bffdb5046fee908ecefe40e231184c4be40797f2
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:40:20+08:00

Commit Message:
INTEGRITY: Add schema.py to generate db tables

Changed paths:
  A schema.py
    fileset.py


diff --git a/fileset.py b/fileset.py
index 62b34d0..49d3bc5 100644
--- a/fileset.py
+++ b/fileset.py
@@ -161,13 +161,18 @@ def fileset():
             html += "<th>Category</th>\n"
             html += "<th>Description</th>\n"
             html += "<th>Log ID</th>\n"
-            for log in logs:
-                html += "<tr>\n"
-                html += f"<td>{log['timestamp']}</td>\n"
-                html += f"<td>{log['category']}</td>\n"
-                html += f"<td>{log['text']}</td>\n"
-                html += f"<td><a href='logs.php?id={log['id']}'>{log['id']}</a></td>\n"
-                html += "</tr>\n"
+            cursor.execute(f"SELECT * FROM history")
+            history = cursor.fetchall()
+            for history_row in history:
+                cursor.execute(f"SELECT `timestamp`, category, `text`, id FROM log WHERE `text` LIKE 'Fileset:{history_row['oldfileset']}%' AND `category` NOT LIKE 'merge%' ORDER BY `timestamp` DESC, id DESC")
+                logs = cursor.fetchall()
+                for log in logs:
+                    html += "<tr>\n"
+                    html += f"<td>{log['timestamp']}</td>\n"
+                    html += f"<td>{log['category']}</td>\n"
+                    html += f"<td>{log['text']}</td>\n"
+                    html += f"<td><a href='logs.php?id={log['id']}'>{log['id']}</a></td>\n"
+                    html += "</tr>\n"
             html += "</table>\n"
             return render_template_string(html)
     finally:
diff --git a/schema.py b/schema.py
new file mode 100644
index 0000000..dee08f4
--- /dev/null
+++ b/schema.py
@@ -0,0 +1,218 @@
+import json
+import os
+import pymysql
+import random
+import string
+from datetime import datetime
+
+# Load MySQL credentials (resolve the path relative to this script)
+with open(os.path.join(os.path.dirname(__file__), 'mysql_config.json')) as f:
+    mysql_cred = json.load(f)
+
+servername = mysql_cred["servername"]
+username = mysql_cred["username"]
+password = mysql_cred["password"]
+dbname = mysql_cred["dbname"]
+
+# Create connection
+conn = pymysql.connect(
+    host=servername,
+    user=username,
+    password=password,
+    charset='utf8mb4',
+    cursorclass=pymysql.cursors.DictCursor,
+    autocommit=False
+)
+
+# Check connection (pymysql.connect raises on failure rather than returning None)
+if not conn.open:
+    print("Error connecting to MySQL")
+    exit(1)
+
+cursor = conn.cursor()
+
+# Create database
+sql = f"CREATE DATABASE IF NOT EXISTS {dbname}"
+cursor.execute(sql)
+
+# Use database
+cursor.execute(f"USE {dbname}")
+
+# Create tables
+tables = {
+    "engine": """
+        CREATE TABLE IF NOT EXISTS engine (
+            id INT AUTO_INCREMENT PRIMARY KEY,
+            name VARCHAR(200),
+            engineid VARCHAR(100) NOT NULL
+        )
+    """,
+    "game": """
+        CREATE TABLE IF NOT EXISTS game (
+            id INT AUTO_INCREMENT PRIMARY KEY,
+            name VARCHAR(200),
+            engine INT NOT NULL,
+            gameid VARCHAR(100) NOT NULL,
+            extra VARCHAR(200),
+            platform VARCHAR(30),
+            language VARCHAR(10),
+            FOREIGN KEY (engine) REFERENCES engine(id)
+        )
+    """,
+    "file": """
+        CREATE TABLE IF NOT EXISTS file (
+            id INT AUTO_INCREMENT PRIMARY KEY,
+            name VARCHAR(200) NOT NULL,
+            size BIGINT NOT NULL,
+            checksum VARCHAR(64) NOT NULL,
+            fileset INT NOT NULL,
+            detection BOOLEAN NOT NULL,
+            FOREIGN KEY (fileset) REFERENCES fileset(id) ON DELETE CASCADE
+        )
+    """,
+    "filechecksum": """
+        CREATE TABLE IF NOT EXISTS filechecksum (
+            id INT AUTO_INCREMENT PRIMARY KEY,
+            file INT NOT NULL,
+            checksize VARCHAR(10) NOT NULL,
+            checktype VARCHAR(10) NOT NULL,
+            checksum VARCHAR(64) NOT NULL,
+            FOREIGN KEY (file) REFERENCES file(id) ON DELETE CASCADE
+        )
+    """,
+    "queue": """
+        CREATE TABLE IF NOT EXISTS queue (
+            id INT AUTO_INCREMENT PRIMARY KEY,
+            time TIMESTAMP NOT NULL,
+            notes varchar(300),
+            fileset INT,
+            userid INT NOT NULL,
+            commit VARCHAR(64) NOT NULL,
+            FOREIGN KEY (fileset) REFERENCES fileset(id)
+        )
+    """,
+    "fileset": """
+        CREATE TABLE IF NOT EXISTS fileset (
+            id INT AUTO_INCREMENT PRIMARY KEY,
+            game INT,
+            status VARCHAR(20),
+            src VARCHAR(20),
+            `key` VARCHAR(64),
+            `megakey` VARCHAR(64),
+            `delete` BOOLEAN DEFAULT FALSE NOT NULL,
+            `timestamp` TIMESTAMP NOT NULL,
+            detection_size INT,
+            FOREIGN KEY (game) REFERENCES game(id)
+        )
+    """,
+    "log": """
+        CREATE TABLE IF NOT EXISTS log (
+            id INT AUTO_INCREMENT PRIMARY KEY,
+            `timestamp` TIMESTAMP NOT NULL,
+            category VARCHAR(100) NOT NULL,
+            user VARCHAR(100) NOT NULL,
+            `text` varchar(300)
+        )
+    """,
+    "history": """
+        CREATE TABLE IF NOT EXISTS history (
+            id INT AUTO_INCREMENT PRIMARY KEY,
+            `timestamp` TIMESTAMP NOT NULL,
+            fileset INT NOT NULL,
+            oldfileset INT NOT NULL,
+            log INT
+        )
+    """,
+    "transactions": """
+        CREATE TABLE IF NOT EXISTS transactions (
+            id INT AUTO_INCREMENT PRIMARY KEY,
+            `transaction` INT NOT NULL,
+            fileset INT NOT NULL
+        )
+    """
+}
+
+for table, definition in tables.items():
+    try:
+        cursor.execute(definition)
+        print(f"Table '{table}' created successfully")
+    except pymysql.Error as err:
+        print(f"Error creating '{table}' table: {err}")
+
+# Create indices
+indices = {
+    "detection": "CREATE INDEX detection ON file (detection)",
+    "checksum": "CREATE INDEX checksum ON filechecksum (checksum)",
+    "engineid": "CREATE INDEX engineid ON engine (engineid)",
+    "key": "CREATE INDEX fileset_key ON fileset (`key`)",
+    "status": "CREATE INDEX status ON fileset (status)",
+    "fileset": "CREATE INDEX fileset ON history (fileset)"
+}
+
+for index, definition in indices.items():
+    try:
+        cursor.execute(definition)
+        print(f"Created index for '{index}'")
+    except pymysql.Error as err:
+        print(f"Error creating index for '{index}': {err}")
+
+# Insert random data into tables
+def random_string(length=10):
+    return ''.join(random.choices(string.ascii_letters + string.digits, k=length))
+
+def insert_random_data():
+    # Insert data into engine
+    cursor.execute("INSERT INTO engine (name, engineid) VALUES (%s, %s)", (random_string(), random_string()))
+    cursor.execute("INSERT INTO engine (name, engineid) VALUES (%s, %s)", (random_string(), random_string()))
+    
+    # Insert data into game
+    cursor.execute("INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES (%s, %s, %s, %s, %s, %s)", 
+                   (random_string(), 1, random_string(), random_string(), random_string(), random_string()))
+    cursor.execute("INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES (%s, %s, %s, %s, %s, %s)", 
+                   (random_string(), 2, random_string(), random_string(), random_string(), random_string()))
+    
+    # Insert data into fileset
+    cursor.execute("INSERT INTO fileset (game, status, src, `key`, `megakey`, `timestamp`, detection_size) VALUES (%s, %s, %s, %s, %s, %s, %s)", 
+                   (1, random_string(), random_string(), random_string(), random_string(), datetime.now(), random.randint(1, 100)))
+    cursor.execute("INSERT INTO fileset (game, status, src, `key`, `megakey`, `timestamp`, detection_size) VALUES (%s, %s, %s, %s, %s, %s, %s)", 
+                   (2, random_string(), random_string(), random_string(), random_string(), datetime.now(), random.randint(1, 100)))
+    
+    # Insert data into file
+    cursor.execute("INSERT INTO file (name, size, checksum, fileset, detection) VALUES (%s, %s, %s, %s, %s)", 
+                   (random_string(), random.randint(1000, 10000), random_string(), 1, True))
+    cursor.execute("INSERT INTO file (name, size, checksum, fileset, detection) VALUES (%s, %s, %s, %s, %s)", 
+                   (random_string(), random.randint(1000, 10000), random_string(), 2, False))
+    
+    # Insert data into filechecksum
+    cursor.execute("INSERT INTO filechecksum (file, checksize, checktype, checksum) VALUES (%s, %s, %s, %s)", 
+                   (1, random_string(), random_string(), random_string()))
+    cursor.execute("INSERT INTO filechecksum (file, checksize, checktype, checksum) VALUES (%s, %s, %s, %s)", 
+                   (2, random_string(), random_string(), random_string()))
+    
+    # Insert data into queue
+    cursor.execute("INSERT INTO queue (time, notes, fileset, userid, commit) VALUES (%s, %s, %s, %s, %s)", 
+                   (datetime.now(), random_string(), 1, random.randint(1, 100), random_string()))
+    cursor.execute("INSERT INTO queue (time, notes, fileset, userid, commit) VALUES (%s, %s, %s, %s, %s)", 
+                   (datetime.now(), random_string(), 2, random.randint(1, 100), random_string()))
+    
+    # Insert data into log
+    cursor.execute("INSERT INTO log (`timestamp`, category, user, `text`) VALUES (%s, %s, %s, %s)", 
+                   (datetime.now(), random_string(), random_string(), random_string()))
+    cursor.execute("INSERT INTO log (`timestamp`, category, user, `text`) VALUES (%s, %s, %s, %s)", 
+                   (datetime.now(), random_string(), random_string(), random_string()))
+    
+    # Insert data into history
+    cursor.execute("INSERT INTO history (`timestamp`, fileset, oldfileset, log) VALUES (%s, %s, %s, %s)", 
+                   (datetime.now(), 1, 2, 1))
+    cursor.execute("INSERT INTO history (`timestamp`, fileset, oldfileset, log) VALUES (%s, %s, %s, %s)", 
+                   (datetime.now(), 2, 1, 2))
+    
+    # Insert data into transactions
+    cursor.execute("INSERT INTO transactions (`transaction`, fileset) VALUES (%s, %s)", 
+                   (random.randint(1, 100), 1))
+    cursor.execute("INSERT INTO transactions (`transaction`, fileset) VALUES (%s, %s)", 
+                   (random.randint(1, 100), 2))
+# for testing locally
+# insert_random_data()
+
+conn.commit()
+conn.close()
\ No newline at end of file
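
A quick way to check what schema.py produced is to list the tables it
created; a sketch assuming the same mysql_config.json layout used above:

    import json
    import pymysql

    with open('mysql_config.json') as f:
        cred = json.load(f)

    conn = pymysql.connect(host=cred['servername'], user=cred['username'],
                           password=cred['password'], db=cred['dbname'])
    with conn.cursor() as cursor:
        cursor.execute("SHOW TABLES")
        # Expect all nine tables: engine, game, fileset, file, filechecksum,
        # queue, log, history, transactions
        print([row[0] for row in cursor.fetchall()])
    conn.close()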


Commit: 47edb3085f5c4575ab0f092d0ca3224fb8c5e0ff
    https://github.com/scummvm/scummvm-sites/commit/47edb3085f5c4575ab0f092d0ca3224fb8c5e0ff
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:40:20+08:00

Commit Message:
INTEGRITY: Add validate page in fileset.py

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 49d3bc5..e8a97fd 100644
--- a/fileset.py
+++ b/fileset.py
@@ -1,4 +1,4 @@
-from flask import Flask, request, render_template, redirect, url_for, render_template_string
+from flask import Flask, request, render_template, redirect, url_for, render_template_string, jsonify
 import pymysql.cursors
 import json
 import re
@@ -178,5 +178,45 @@ def fileset():
     finally:
         connection.close()
 
+@app.route('validate', methods=['POST'])
+def validate():
+
+    error_codes = {
+        "unknown": -1,
+        "success": 0,
+        "empty": 2,
+        "no_metadata": 3,
+    }
+
+    json_object = request.get_json()
+
+    ip = request.remote_addr
+    ip = '.'.join(ip.split('.')[:3]) + '.X'
+
+    game_metadata = {k: v for k, v in json_object.items() if k != 'files'}
+
+    json_response = {
+        'error': error_codes['success'],
+        'files': []
+    }
+
+    if not game_metadata:
+        if not json_object.get('files'):
+            json_response['error'] = error_codes['empty']
+            del json_response['files']
+            json_response['status'] = 'empty_fileset'
+            return jsonify(json_response)
+
+        json_response['error'] = error_codes['no_metadata']
+        del json_response['files']
+        json_response['status'] = 'no_metadata'
+
+        fileset_id = user_insert_fileset(json_object['files'], ip, conn)
+        json_response['fileset'] = fileset_id
+        # TODO: handle database operations
+
+        return jsonify(json_response)
+    
+
 if __name__ == '__main__':
     app.run()
\ No newline at end of file
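
For manual testing, the new endpoint can be exercised with a small client. A
sketch using the requests package against a local dev server (URL and
metadata fields are illustrative; note the route string lacks its leading
slash in this commit and is corrected in a later one):

    import requests

    payload = {
        "gameid": "example",                            # hypothetical metadata
        "files": [{"name": "DATA.DAT", "size": 1024}],  # hypothetical file entry
    }
    resp = requests.post("http://127.0.0.1:5000/validate", json=payload)
    print(resp.json())  # e.g. {"error": 0, "files": []} on success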


Commit: 7aa7b1f61cf14be12d286988b3028b2b5109e84f
    https://github.com/scummvm/scummvm-sites/commit/7aa7b1f61cf14be12d286988b3028b2b5109e84f
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:40:22+08:00

Commit Message:
INTEGRITY: Add CSS file in static folder

Changed paths:
  A static/style.css
    fileset.py


diff --git a/fileset.py b/fileset.py
index e8a97fd..e0a0f06 100644
--- a/fileset.py
+++ b/fileset.py
@@ -64,7 +64,17 @@ def fileset():
             history = cursor.fetchall()
 
             # Display fileset details
-            html = f"<h2><u>Fileset: {id}</u></h2>"
+            html = f"""
+        <!DOCTYPE html>
+        <html>
+        <head>
+            <link rel="stylesheet" type="text/css" href="{{{{ url_for('static', filename='style.css') }}}}">
+        </head>
+        <body>
+        <h2><u>Fileset: {id}</u></h2>
+        <h3>Fileset details</h3>
+        <table>
+        """
 
             cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
             result = cursor.fetchone()
@@ -178,7 +188,7 @@ def fileset():
     finally:
         connection.close()
 
-@app.route('validate', methods=['POST'])
+@app.route('/validate', methods=['POST'])
 def validate():
 
     error_codes = {
diff --git a/static/style.css b/static/style.css
new file mode 100644
index 0000000..1c9e599
--- /dev/null
+++ b/static/style.css
@@ -0,0 +1,113 @@
+:root {
+  --primary-color: #27b5e8;
+  font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
+}
+
+td, th {
+  padding-inline: 5px;
+}
+
+tr:nth-child(even) {background-color: #f2f2f2;}
+tr {background-color: white;}
+
+tr:hover {background-color: #ddd;}
+tr.games_list:hover {cursor: pointer;}
+
+tr.filter:hover {background-color:inherit;}
+td.filter {text-align: center;}
+
+th {
+  padding-top: 5px;
+  padding-bottom: 5px;
+  text-align: center;
+  background-color: var(--primary-color);
+  color: white;
+}
+
+th a {
+  color: white;
+  text-decoration: none; /* no underline */
+}
+
+button {
+  color: white;
+  padding: 6px 12px;
+  border-radius: 10px;
+  transition: background-color 0.1s;
+  background-color: var(--primary-color);
+  border: 1px solid var(--primary-color);
+}
+
+button:hover {
+  background-color: #29afe0;
+}
+button:active {
+  background-color: #1a95c2;
+}
+
+input[type=submit] {
+  color: white;
+  padding: 6px 12px;
+  border-radius: 10px;
+  transition: background-color 0.1s;
+  background-color: var(--primary-color);
+  border: 1px solid var(--primary-color);
+}
+
+input[type=submit]:hover {
+  background-color: #29afe0;
+}
+input[type=submit]:active {
+  background-color: #1a95c2;
+}
+
+input[type=text], select {
+  width: 25%;
+  height: 38px;
+  padding: 6px 12px;
+  margin: 0px 8px;
+  display: inline-block;
+  border: 1px solid #ccc;
+  border-radius: 4px;
+  box-sizing: border-box;
+}
+
+input[type=text].filter {
+  width: 80%;
+}
+
+.pagination {
+  display: inline-block;
+  align-self: center;
+}
+
+.pagination .more {
+  color: black;
+  float: left;
+  padding: 15px 10px;
+}
+
+.pagination a {
+  color: black;
+  float: left;
+  padding: 8px 16px;
+  text-decoration: none;
+  transition: background-color 0.3s;
+  border: 1px solid #ddd;
+}
+
+.pagination a.active {
+  color: white;
+  background-color: var(--primary-color);
+  border: 1px solid var(--primary-color);
+}
+
+.pagination a:hover:not(.active) {
+  background-color: #ddd;
+}
+
+form {
+  padding: 0px;
+  margin: 0px;
+  display: inline;
+}
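
Flask serves anything under static/ automatically, so the stylesheet needs no
extra route. A sketch of referencing it from a rendered template, as the next
commit does (the /demo route is illustrative only):

    from flask import Flask, render_template_string

    app = Flask(__name__)

    @app.route('/demo')
    def demo():
        # url_for('static', filename='style.css') resolves to /static/style.css
        return render_template_string(
            "<link rel='stylesheet' href=\"{{ url_for('static', filename='style.css') }}\">"
        )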


Commit: e075d66cc792b8fc5aedd408acc61a8a9fdf1894
    https://github.com/scummvm/scummvm-sites/commit/e075d66cc792b8fc5aedd408acc61a8a9fdf1894
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:41:04+08:00

Commit Message:
INTEGRITY: Fix the rendering of pagination.py

Changed paths:
    fileset.py
    pagination.py


diff --git a/fileset.py b/fileset.py
index e0a0f06..86e25ed 100644
--- a/fileset.py
+++ b/fileset.py
@@ -4,6 +4,7 @@ import json
 import re
 import os
 from user_fileset_functions import user_calc_key, file_json_to_array, user_insert_queue, user_insert_fileset, match_and_merge_user_filesets
+from pagination import create_page
 
 app = Flask(__name__)
 
@@ -227,6 +228,61 @@ def validate():
 
         return jsonify(json_response)
     
+@app.route('/user_games_list')
+def user_games_list():
+    filename = "user_games_list.php"
+    records_table = "fileset"
+    select_query = """
+    SELECT engineid, gameid, extra, platform, language, game.name,
+    status, fileset.id as fileset
+    FROM fileset
+    LEFT JOIN game ON game.id = fileset.game
+    LEFT JOIN engine ON engine.id = game.engine
+    WHERE status = 'user'
+    """
+    order = "ORDER BY gameid"
+    filters = {
+        "engineid": "engine",
+        "gameid": "game",
+        "extra": "game",
+        "platform": "game",
+        "language": "game",
+        "name": "game",
+        "status": "fileset"
+    }
+    mapping = {
+        'engine.id': 'game.engine',
+        'game.id': 'fileset.game',
+    }
+    return render_template_string(create_page(filename, 200, records_table, select_query, order, filters, mapping))
+
+
+@app.route('/games_list')
+def games_list():
+    filename = "games_list"
+    records_table = "game"
+    select_query = """
+    SELECT engineid, gameid, extra, platform, language, game.name,
+    status, fileset.id as fileset
+    FROM game
+    JOIN engine ON engine.id = game.engine
+    JOIN fileset ON game.id = fileset.game
+    """
+    order = "ORDER BY gameid"
+    filters = {
+        "engineid": "engine",
+        "gameid": "game",
+        "extra": "game",
+        "platform": "game",
+        "language": "game",
+        "name": "game",
+        'status': 'fileset'
+    }
+    mapping = {
+        'engine.id': 'game.engine',
+        'game.id': 'fileset.game',
+    }
+    return render_template_string(create_page(filename, 25, records_table, select_query, order, filters, mapping))
 
 if __name__ == '__main__':
     app.run()
\ No newline at end of file
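
The filters/mapping pair added here drives how create_page joins tables: each
filter key names the table that owns the column, and mapping lists the
primary/foreign key pairs. A usage sketch of get_join_columns with the
mapping defined above:

    mapping = {
        'engine.id': 'game.engine',
        'game.id': 'fileset.game',
    }

    # Either argument order resolves to the same join condition
    print(get_join_columns('game', 'engine', mapping))   # engine.id = game.engine
    print(get_join_columns('fileset', 'game', mapping))  # game.id = fileset.game
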
diff --git a/pagination.py b/pagination.py
index f5adcb0..d8e1db9 100644
--- a/pagination.py
+++ b/pagination.py
@@ -1,33 +1,14 @@
-from urllib.parse import urlencode
+from flask import Flask, request, render_template_string
 import pymysql
 import json
-from flask import Flask, request, render_template_string
-import os
 import re
-from math import ceil
-import html
+import os
+
+app = Flask(__name__)
 
 stylesheet = 'style.css'
 jquery_file = 'https://code.jquery.com/jquery-3.7.0.min.js'
 js_file = 'js_functions.js'
-print(f"<link rel='stylesheet' href='{stylesheet}'>\n")
-print(f"<script type='text/javascript' src='{jquery_file}'></script>\n")
-print(f"<script type='text/javascript' src='{js_file}'></script>\n")
-
-
-with open('mysql_config.json') as f:
-    mysql_cred = json.load(f)
-
-conn = pymysql.connect(
-    host=mysql_cred["servername"],
-    user=mysql_cred["username"],
-    password=mysql_cred["password"],
-    db=mysql_cred["dbname"],
-    charset='utf8mb4',
-    cursorclass=pymysql.cursors.DictCursor,
-    autocommit=False
-)
-
 
 def get_join_columns(table1, table2, mapping):
     for primary, foreign in mapping.items():
@@ -35,204 +16,174 @@ def get_join_columns(table1, table2, mapping):
         foreign = foreign.split('.')
         if (primary[0] == table1 and foreign[0] == table2) or (primary[0] == table2 and foreign[0] == table1):
             return f"{primary[0]}.{primary[1]} = {foreign[0]}.{foreign[1]}"
-    raise ValueError("No primary-foreign key mapping provided. Filter is invalid")
+    return "No primary-foreign key mapping provided. Filter is invalid"
 
-def create_page(filename, results_per_page, records_table, select_query, order, filters = {}, mapping = {}):
-    with open(os.path.join(os.path.dirname(__file__), '../mysql_config.json')) as f:
+def create_page(filename, results_per_page, records_table, select_query, order, filters={}, mapping={}):
+    with open(os.path.join(os.path.dirname(__file__), 'mysql_config.json')) as f:
         mysql_cred = json.load(f)
-
+    
     conn = pymysql.connect(
         host=mysql_cred["servername"],
         user=mysql_cred["username"],
         password=mysql_cred["password"],
         db=mysql_cred["dbname"],
         charset='utf8mb4',
-        cursorclass=pymysql.cursors.DictCursor,
-        autocommit=False
+        cursorclass=pymysql.cursors.DictCursor
     )
 
-    # Check connection
-    if not conn.open:
-        print("Connect failed.")
-        return
-
-    # If there exist get variables that are for filtering
-    get = {k: v for k, v in request.args.items() if v != ''}
-    if 'sort' in get:
-        column = get['sort'].split('-')
-        order = "ORDER BY {}".format(column[0])
-
-        if 'desc' in get['sort']:
-            order += " DESC"
-
-    if set(get.keys()) - set(['page', 'sort']):
-        condition = "WHERE "
-        tables = []
-        for key, value in get.items():
-            if key in ['page', 'sort'] or value == '':
-                continue
-
-            tables.append(filters[key])
-            condition += " AND {}.{} REGEXP '{}'".format(filters[key], key, value) if condition != "WHERE " else "{}.{} REGEXP '{}'".format(filters[key], key, value)
-        if condition == "WHERE ":
-            condition = ""
-
-        # If more than one table is to be searched
-        from_query = records_table
-        if len(tables) > 1 or tables[0] != records_table:
-            for i in range(len(tables)):
-                if tables[i] == records_table:
+    with conn.cursor() as cursor:
+        # Handle sorting
+        sort = request.args.get('sort')
+        if sort:
+            column = sort.split('-')
+            order = f"ORDER BY {column[0]}"
+            if 'desc' in sort:
+                order += " DESC"
+        
+        if set(request.args.keys()).difference({'page', 'sort'}):
+            condition = "WHERE "
+            tables = []
+            for key, value in request.args.items():
+                if key in ['page', 'sort'] or value == '':
                     continue
-
-                from_query += " JOIN {} ON {}".format(tables[i], get_join_columns(records_table, tables[i], mapping))
-
-        cursor = conn.cursor()
-        cursor.execute("SELECT COUNT({}.id) FROM {} {}".format(records_table, from_query, condition))
-        num_of_results = cursor.fetchone()[0]
-    # If $records_table has a JOIN (multiple tables)
-    elif re.search("JOIN", records_table):
-        first_table = records_table.split(" ")[0]
-        cursor = conn.cursor()
-        cursor.execute("SELECT COUNT({}.id) FROM {}".format(first_table, records_table))
-        num_of_results = cursor.fetchone()[0]
-    else:
-        cursor = conn.cursor()
-        cursor.execute("SELECT COUNT(id) FROM {}".format(records_table))
-        num_of_results = cursor.fetchone()[0]
-    num_of_pages = ceil(num_of_results / results_per_page)
-    if num_of_results == 0:
-        print("No results for given filters")
-        return
-
-    if 'page' not in get:
-        page = 1
-    else:
-        page = max(1, min(int(get['page']), num_of_pages))
-
-    offset = (page - 1) * results_per_page
-
-    # If there exist get variables that are for filtering
-    if set(get.keys()) - set(['page']):
-        condition = "WHERE "
-        for key, value in get.items():
-            value = conn.converter.escape(value)
-            if key not in filters:
-                continue
-
-            condition += "AND {}.{} REGEXP '{}'".format(filters[key], key, value) if condition != "WHERE " else "{}.{} REGEXP '{}'".format(filters[key], key, value)
-        if condition == "WHERE ":
-            condition = ""
-
-        query = "{} {} {} LIMIT {} OFFSET {}".format(select_query, condition, order, results_per_page, offset)
-    else:
-        query = "{} {} LIMIT {} OFFSET {}".format(select_query, order, results_per_page, offset)
-    cursor = conn.cursor()
-    cursor.execute(query)
-
-    # Table
-    print("<form id='filters-form' method='GET' onsubmit='remove_empty_inputs()'>")
-    print("<table>")
-
-    counter = offset + 1
-    for row in cursor.fetchall():
-        if counter == offset + 1: # If it is the first run of the loop
-            if len(filters) > 0:
-                print("<tr class=filter><td></td>")
-                for key in row.keys():
-                    if key not in filters:
-                        print("<td class=filter />")
+                tables.append(filters[key])
+                if value == '':
+                    value = '.*'
+                condition += f" AND {filters[key]}.{key} REGEXP '{value}'" if condition != "WHERE " else f"{filters[key]}.{key} REGEXP '{value}'"
+
+            if condition == "WHERE ":
+                condition = ""
+
+            # Handle multiple tables
+            from_query = records_table
+            if len(tables) > 1 or (tables and tables[0] != records_table):
+                for table in tables:
+                    if table == records_table:
                         continue
-
-                    # Filter textbox
-                    filter_value = get[key] if key in get else ""
-
-                    print("<td class=filter><input type=text class=filter placeholder='{}' name='{}' value='{}'/></td>".format(key, key, filter_value))
-                print("</tr>")
-                print("<tr class=filter><td></td><td class=filter><input type=submit value='Submit'></td></tr>")
-
-            print("<th/>") # Numbering column
-            for key in row.keys():
-                if key == 'fileset':
+                    from_query += f" JOIN {table} ON {get_join_columns(records_table, table, mapping)}"
+
+            cursor.execute(f"SELECT COUNT({records_table}.id) AS count FROM {records_table}")
+            num_of_results = cursor.fetchone()['count']
+            
+        elif "JOIN" in records_table:
+            first_table = records_table.split(" ")[0]
+            cursor.execute(f"SELECT COUNT({first_table}.id) FROM {records_table}")
+            num_of_results = cursor.fetchone()[f'COUNT({first_table}.id)']
+        else:
+            cursor.execute(f"SELECT COUNT(id) FROM {records_table}")
+            num_of_results = cursor.fetchone()['COUNT(id)']
+            
+        num_of_pages = (num_of_results + results_per_page - 1) // results_per_page
+
+        if num_of_results == 0:
+            return "No results for given filters"
+
+        page = int(request.args.get('page', 1))
+        page = max(1, min(page, num_of_pages))
+        offset = (page - 1) * results_per_page
+
+        # Fetch results
+        if set(request.args.keys()).difference({'page'}):
+            condition = "WHERE "
+            for key, value in request.args.items():
+                if key not in filters:
                     continue
 
-                # Preserve GET variables
-                vars = ""
-                for k, v in get.items():
-                    if k == 'sort' and v == key:
-                        vars += "&{}={}-desc".format(k, v)
-                    elif k != 'sort':
-                        vars += "&{}={}".format(k, v)
+                value = pymysql.converters.escape_string(value)
+                if value == '':
+                    value = '.*'
+                condition += f" AND {filters[key]}.{key} REGEXP '{value}'" if condition != "WHERE " else f"{filters[key]}.{key} REGEXP '{value}'"
 
-                if "&sort={}".format(key) not in vars:
-                    print("<th><a href='{}?{}&sort={}'>{}</th>".format(filename, vars, key, key))
-                else:
-                    print("<th><a href='{}?{}'>{}</th>".format(filename, vars, key))
+            if condition == "WHERE ":
+                condition = ""
 
-        if filename in ['games_list.php', 'user_games_list.php']:
-            print("<tr class=games_list onclick='hyperlink(\"fileset.php?id={}\")'>".format(row['fileset']))
+            query = f"{select_query} {condition} {order} LIMIT {results_per_page} OFFSET {offset}"
         else:
-            print("<tr>")
-        print("<td>{}.</td>".format(counter))
-        for key, value in row.items():
+            query = f"{select_query} {order} LIMIT {results_per_page} OFFSET {offset}"
+        cursor.execute(query)
+        results = cursor.fetchall()
+
+    # Generate HTML
+    html = f"""
+    <!DOCTYPE html>
+        <html>
+        <head>
+            <link rel="stylesheet" type="text/css" href="{{{{ url_for('static', filename='style.css') }}}}">
+        </head>
+        <body>
+    <form id='filters-form' method='GET' onsubmit='remove_empty_inputs()'>
+    <table>
+    """
+    if not results:
+        return "No results for given filters"
+    if results:
+        if filters:
+            html += "<tr class='filter'><td></td>"
+            for key in results[0].keys():
+                if key not in filters:
+                    html += "<td class='filter'></td>"
+                    continue
+                filter_value = request.args.get(key, "")
+                html += f"<td class='filter'><input type='text' class='filter' placeholder='{key}' name='{key}' value='{filter_value}'/></td>"
+            html += "</tr><tr class='filter'><td></td><td class='filter'><input type='submit' value='Submit'></td></tr>"
+
+        html += "<th></th>"
+        for key in results[0].keys():
             if key == 'fileset':
                 continue
+            vars = "&".join([f"{k}={v}" for k, v in request.args.items() if k != 'sort'])
+            if f"&sort={key}" not in vars:
+                html += f"<th><a href='{filename}?{vars}&sort={key}'>{key}</a></th>"
+            else:
+                html += f"<th><a href='{filename}?{vars}'>{key}</a></th>"
+
+        counter = offset + 1
+        for row in results:
+            if filename in ['games_list.php', 'user_games_list.php']:
+                html += f"<tr class='games_list' onclick='hyperlink(\"fileset.php?id={row['fileset']}\")'>"
+            else:
+                html += "<tr>"
+            html += f"<td>{counter}.</td>"
+            for key, value in row.items():
+                if key == 'fileset':
+                    continue
+                matches = re.search(r"Fileset:(\d+)", value)
+                if matches:
+                    value = re.sub(r"Fileset:(\d+)", f"<a href='fileset.php?id={matches.group(1)}'>Fileset:{matches.group(1)}</a>", value)
+                html += f"<td>{value}</td>"
+            html += "</tr>"
+            counter += 1
 
-            # Add links to fileset in logs table
-            matches = re.search("Fileset:(\d+)", value)
-            if matches:
-                value = value[:matches.start()] + "<a href='fileset.php?id={}'>{}</a>".format(matches.group(1), matches.group(0)) + value[matches.end():]
-
-            print("<td>{}</td>".format(value))
-        print("</tr>")
-
-        counter += 1
-
-    print("</table>")
-    print("</form>")
+    html += "</table></form>"
 
-    # Preserve GET variables
-    vars = ""
-    for key, value in get.items():
-        if key == 'page':
-            continue
-        vars += "&{}={}".format(key, value)
+    # Pagination
+    vars = "&".join([f"{k}={v}" for k, v in request.args.items() if k != 'page'])
 
-    # Navigation elements
     if num_of_pages > 1:
-        print("<form method='GET'>")
-
-        # Preserve GET variables on form submit
-        for key, value in get.items():
-            if key == 'page':
-                continue
-
-            key = html.escape(key)
-            value = html.escape(value)
-            if value != "":
-                print("<input type='hidden' name='{}' value='{}'>".format(key, value))
-
-        print("<div class=pagination>")
+        html += "<form method='GET'>"
+        for key, value in request.args.items():
+            if key != 'page':
+                html += f"<input type='hidden' name='{key}' value='{value}'>"
+        html += "<div class='pagination'>"
         if page > 1:
-            print("<a href={}{}>❮❮</a>".format(filename, vars))
-            print("<a href={}page={}{}>❮</a>".format(filename, page - 1, vars))
+            html += f"<a href='{filename}?{vars}'>❮❮</a>"
+            html += f"<a href='{filename}?page={page-1}&{vars}'>❮</a>"
         if page - 2 > 1:
-            print("<div class=more>...</div>")
-
+            html += "<div class='more'>...</div>"
         for i in range(page - 2, page + 3):
-            if i >= 1 and i <= num_of_pages:
+            if 1 <= i <= num_of_pages:
                 if i == page:
-                    print("<a class=active href={}page={}{}>{}</a>".format(filename, i, vars, i))
+                    html += f"<a class='active' href='{filename}?page={i}&{vars}'>{i}</a>"
                 else:
-                    print("<a href={}page={}{}>{}</a>".format(filename, i, vars, i))
-
+                    html += f"<a href='{filename}?page={i}&{vars}'>{i}</a>"
         if page + 2 < num_of_pages:
-            print("<div class=more>...</div>")
+            html += "<div class='more'>...</div>"
         if page < num_of_pages:
-            print("<a href={}page={}{}>❯</a>".format(filename, page + 1, vars))
-            print("<a href={}page={}{}>❯❯</a>".format(filename, num_of_pages, vars))
-
-        print("<input type='text' name='page' placeholder='Page No'>")
-        print("<input type='submit' value='Submit'>")
-        print("</div>")
-
-        print("</form>")
+            html += f"<a href='{filename}?page={page+1}&{vars}'>❯</a>"
+            html += f"<a href='{filename}?page={num_of_pages}&{vars}'>❯❯</a>"
+        html += "<input type='text' name='page' placeholder='Page No'>"
+        html += "<input type='submit' value='Submit'>"
+        html += "</div></form>"
+
+    return html
+    
\ No newline at end of file
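
A note on the arithmetic this rewrite settles on: num_of_pages is a ceiling
division written with integer math, and the requested page is clamped into
range before the SQL OFFSET is derived. A standalone sketch of that logic
(compute_window is a hypothetical helper, not part of the commit):

def compute_window(num_of_results, results_per_page, requested_page):
    # Ceiling division without math.ceil: (a + b - 1) // b
    num_of_pages = (num_of_results + results_per_page - 1) // results_per_page
    # Clamp the requested page into [1, num_of_pages]
    page = max(1, min(requested_page, num_of_pages))
    # Offset for the trailing "LIMIT {results_per_page} OFFSET {offset}"
    offset = (page - 1) * results_per_page
    return num_of_pages, page, offset

assert compute_window(201, 25, 9) == (9, 9, 200)
assert compute_window(201, 25, 99) == (9, 9, 200)  # out-of-range page is clamped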


Commit: 46928b4fb399aa3e6ce4ee9443b0b799f6fd9423
    https://github.com/scummvm/scummvm-sites/commit/46928b4fb399aa3e6ce4ee9443b0b799f6fd9423
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:41:04+08:00

Commit Message:
INTEGRITY: Add page turning in pagination.py

Changed paths:
    fileset.py
    pagination.py
    schema.py


diff --git a/fileset.py b/fileset.py
index 86e25ed..fd8b954 100644
--- a/fileset.py
+++ b/fileset.py
@@ -182,7 +182,7 @@ def fileset():
                     html += f"<td>{log['timestamp']}</td>\n"
                     html += f"<td>{log['category']}</td>\n"
                     html += f"<td>{log['text']}</td>\n"
-                    html += f"<td><a href='logs.php?id={log['id']}'>{log['id']}</a></td>\n"
+                    html += f"<td><a href='logs?id={log['id']}'>{log['id']}</a></td>\n"
                     html += "</tr>\n"
             html += "</table>\n"
             return render_template_string(html)
@@ -230,7 +230,7 @@ def validate():
     
 @app.route('/user_games_list')
 def user_games_list():
-    filename = "user_games_list.php"
+    filename = "user_games_list"
     records_table = "fileset"
     select_query = """
     SELECT engineid, gameid, extra, platform, language, game.name,
@@ -264,7 +264,7 @@ def games_list():
     select_query = """
     SELECT engineid, gameid, extra, platform, language, game.name,
     status, fileset.id as fileset
-    FROM game
+    FROM fileset
     JOIN engine ON engine.id = game.engine
     JOIN fileset ON game.id = fileset.game
     """
diff --git a/pagination.py b/pagination.py
index d8e1db9..20b7004 100644
--- a/pagination.py
+++ b/pagination.py
@@ -139,8 +139,8 @@ def create_page(filename, results_per_page, records_table, select_query, order,
 
         counter = offset + 1
         for row in results:
-            if filename in ['games_list.php', 'user_games_list.php']:
-                html += f"<tr class='games_list' onclick='hyperlink(\"fileset.php?id={row['fileset']}\")'>"
+            if filename in ['games_list', 'user_games_list']:
+                html += f"<tr class='games_list' onclick='hyperlink(\"fileset?id={row['fileset']}\")'>"
             else:
                 html += "<tr>"
             html += f"<td>{counter}.</td>"
@@ -149,7 +149,7 @@ def create_page(filename, results_per_page, records_table, select_query, order,
                     continue
                 matches = re.search(r"Fileset:(\d+)", value)
                 if matches:
-                    value = re.sub(r"Fileset:(\d+)", f"<a href='fileset.php?id={matches.group(1)}'>Fileset:{matches.group(1)}</a>", value)
+                    value = re.sub(r"Fileset:(\d+)", f"<a href='fileset?id={matches.group(1)}'>Fileset:{matches.group(1)}</a>", value)
                 html += f"<td>{value}</td>"
             html += "</tr>"
             counter += 1
diff --git a/schema.py b/schema.py
index dee08f4..5a85d55 100644
--- a/schema.py
+++ b/schema.py
@@ -160,59 +160,43 @@ def random_string(length=10):
     return ''.join(random.choices(string.ascii_letters + string.digits, k=length))
 
 def insert_random_data():
-    # Insert data into engine
-    cursor.execute("INSERT INTO engine (name, engineid) VALUES (%s, %s)", (random_string(), random_string()))
-    cursor.execute("INSERT INTO engine (name, engineid) VALUES (%s, %s)", (random_string(), random_string()))
-    
-    # Insert data into game
-    cursor.execute("INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES (%s, %s, %s, %s, %s, %s)", 
-                   (random_string(), 1, random_string(), random_string(), random_string(), random_string()))
-    cursor.execute("INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES (%s, %s, %s, %s, %s, %s)", 
-                   (random_string(), 2, random_string(), random_string(), random_string(), random_string()))
-    
-    # Insert data into fileset
-    cursor.execute("INSERT INTO fileset (game, status, src, `key`, `megakey`, `timestamp`, detection_size) VALUES (%s, %s, %s, %s, %s, %s, %s)", 
-                   (1, random_string(), random_string(), random_string(), random_string(), datetime.now(), random.randint(1, 100)))
-    cursor.execute("INSERT INTO fileset (game, status, src, `key`, `megakey`, `timestamp`, detection_size) VALUES (%s, %s, %s, %s, %s, %s, %s)", 
-                   (2, random_string(), random_string(), random_string(), random_string(), datetime.now(), random.randint(1, 100)))
-    
-    # Insert data into file
-    cursor.execute("INSERT INTO file (name, size, checksum, fileset, detection) VALUES (%s, %s, %s, %s, %s)", 
-                   (random_string(), random.randint(1000, 10000), random_string(), 1, True))
-    cursor.execute("INSERT INTO file (name, size, checksum, fileset, detection) VALUES (%s, %s, %s, %s, %s)", 
-                   (random_string(), random.randint(1000, 10000), random_string(), 2, False))
-    
-    # Insert data into filechecksum
-    cursor.execute("INSERT INTO filechecksum (file, checksize, checktype, checksum) VALUES (%s, %s, %s, %s)", 
-                   (1, random_string(), random_string(), random_string()))
-    cursor.execute("INSERT INTO filechecksum (file, checksize, checktype, checksum) VALUES (%s, %s, %s, %s)", 
-                   (2, random_string(), random_string(), random_string()))
-    
-    # Insert data into queue
-    cursor.execute("INSERT INTO queue (time, notes, fileset, userid, commit) VALUES (%s, %s, %s, %s, %s)", 
-                   (datetime.now(), random_string(), 1, random.randint(1, 100), random_string()))
-    cursor.execute("INSERT INTO queue (time, notes, fileset, userid, commit) VALUES (%s, %s, %s, %s, %s)", 
-                   (datetime.now(), random_string(), 2, random.randint(1, 100), random_string()))
-    
-    # Insert data into log
-    cursor.execute("INSERT INTO log (`timestamp`, category, user, `text`) VALUES (%s, %s, %s, %s)", 
-                   (datetime.now(), random_string(), random_string(), random_string()))
-    cursor.execute("INSERT INTO log (`timestamp`, category, user, `text`) VALUES (%s, %s, %s, %s)", 
-                   (datetime.now(), random_string(), random_string(), random_string()))
-    
-    # Insert data into history
-    cursor.execute("INSERT INTO history (`timestamp`, fileset, oldfileset, log) VALUES (%s, %s, %s, %s)", 
-                   (datetime.now(), 1, 2, 1))
-    cursor.execute("INSERT INTO history (`timestamp`, fileset, oldfileset, log) VALUES (%s, %s, %s, %s)", 
-                   (datetime.now(), 2, 1, 2))
-    
-    # Insert data into transactions
-    cursor.execute("INSERT INTO transactions (`transaction`, fileset) VALUES (%s, %s)", 
-                   (random.randint(1, 100), 1))
-    cursor.execute("INSERT INTO transactions (`transaction`, fileset) VALUES (%s, %s)", 
-                   (random.randint(1, 100), 2))
+    for _ in range(1000):
+        # Insert data into engine
+        cursor.execute("INSERT INTO engine (name, engineid) VALUES (%s, %s)", (random_string(), random_string()))
+        
+        # Insert data into game
+        cursor.execute("INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES (%s, %s, %s, %s, %s, %s)", 
+                       (random_string(), 1, random_string(), random_string(), random_string(), random_string()))
+        
+        # Insert data into fileset
+        cursor.execute("INSERT INTO fileset (game, status, src, `key`, `megakey`, `timestamp`, detection_size) VALUES (%s, %s, %s, %s, %s, %s, %s)", 
+                       (1, 'user', random_string(), random_string(), random_string(), datetime.now(), random.randint(1, 100)))
+        
+        # Insert data into file
+        cursor.execute("INSERT INTO file (name, size, checksum, fileset, detection) VALUES (%s, %s, %s, %s, %s)", 
+                       (random_string(), random.randint(1000, 10000), random_string(), 1, True))
+        
+        # Insert data into filechecksum
+        cursor.execute("INSERT INTO filechecksum (file, checksize, checktype, checksum) VALUES (%s, %s, %s, %s)", 
+                       (1, random_string(), random_string(), random_string()))
+        
+        # Insert data into queue
+        cursor.execute("INSERT INTO queue (time, notes, fileset, userid, commit) VALUES (%s, %s, %s, %s, %s)", 
+                       (datetime.now(), random_string(), 1, random.randint(1, 100), random_string()))
+        
+        # Insert data into log
+        cursor.execute("INSERT INTO log (`timestamp`, category, user, `text`) VALUES (%s, %s, %s, %s)", 
+                       (datetime.now(), random_string(), random_string(), random_string()))
+        
+        # Insert data into history
+        cursor.execute("INSERT INTO history (`timestamp`, fileset, oldfileset, log) VALUES (%s, %s, %s, %s)", 
+                       (datetime.now(), 1, 2, 1))
+        
+        # Insert data into transactions
+        cursor.execute("INSERT INTO transactions (`transaction`, fileset) VALUES (%s, %s)", 
+                       (random.randint(1, 100), 1))
 # for testing locally
-# insert_random_data()
+insert_random_data()
 
 conn.commit()
 conn.close()
\ No newline at end of file
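
With insert_random_data() now looping 1000 times and the script committing
once at the bottom of the file, the whole seed batch lands in a single
transaction. A reduced sketch of that pattern with pymysql (credentials and
the INSERT are placeholders; the real script reads mysql_config.json):

import pymysql

# Placeholder credentials; schema.py loads these from mysql_config.json
conn = pymysql.connect(host='localhost', user='user', password='pass',
                       db='integrity', cursorclass=pymysql.cursors.DictCursor)
try:
    with conn.cursor() as cursor:
        for _ in range(1000):
            cursor.execute("INSERT INTO engine (name, engineid) VALUES (%s, %s)",
                           ("demo", "demo"))
    conn.commit()    # one commit makes the whole batch visible atomically
except Exception:
    conn.rollback()  # a failure part-way through discards the partial batch
    raise
finally:
    conn.close()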


Commit: 0097a8d34cce7f867c9a66ebe5d1a02b724e93e7
    https://github.com/scummvm/scummvm-sites/commit/0097a8d34cce7f867c9a66ebe5d1a02b724e93e7
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:41:04+08:00

Commit Message:
INTEGRITY: Add the dat loading function in dat_parser.py

Changed paths:
  A dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
new file mode 100644
index 0000000..4327175
--- /dev/null
+++ b/dat_parser.py
@@ -0,0 +1,120 @@
+import re
+import os
+import sys
+from db_functions import db_insert, populate_matching_games
+
+def remove_quotes(string):
+    # Remove quotes from value if they are present
+    if string[0] == "\"":
+        string = string[1:-1]
+
+    return string
+
+def map_checksum_data(content_string):
+    arr = {}
+    temp = re.findall(r'("[^"]*")|\S+', content_string)
+
+    for i in range(1, len(temp), 2):
+        if temp[i] == ')' or temp[i] in ['crc', 'sha1']:
+            continue
+
+        temp[i + 1] = remove_quotes(temp[i + 1])
+        if temp[i + 1] == ')':
+            temp[i + 1] = ""
+        arr[temp[i]] = temp[i + 1].replace("\\", "")
+
+    return arr
+
+def map_key_values(content_string, arr):
+    # Split by newline into different pairs
+    temp = content_string.splitlines()
+
+    # Add pairs to the dictionary if they are not parentheses
+    for pair in temp:
+        pair = pair.strip()
+        if pair == "(" or pair == ")":
+            continue
+        pair = list(map(str.strip, pair.split(None, 1)))
+        pair[1] = remove_quotes(pair[1])
+
+        # Handle duplicate keys (if the key is rom) and add values to an array instead
+        if pair[0] == "rom":
+            if pair[0] in arr:
+                arr[pair[0]].append(map_checksum_data(pair[1]))
+            else:
+                arr[pair[0]] = [map_checksum_data(pair[1])]
+        else:
+            arr[pair[0]] = pair[1].replace("\\", "")
+            
+    return arr
+            
+def match_outermost_brackets(input):
+    """
+    Parse DAT file and separate the contents of each segment into an array
+    Segments are of the form `scummvm ( )`, `game ( )` etc.
+    """
+    matches = []
+    depth = 0
+    inside_quotes = False
+    cur_index = 0
+
+    for i in range(len(input)):
+        char = input[i]
+
+        if char == '(' and not inside_quotes:
+            if depth == 0:
+                cur_index = i
+            depth += 1
+        elif char == ')' and not inside_quotes:
+            depth -= 1
+            if depth == 0:
+                match = input[cur_index:i+1]
+                matches.append((match, cur_index))
+        elif char == '"' and input[i - 1] != '\\':
+            inside_quotes = not inside_quotes
+
+    return matches
+
+def parse_dat(dat_filepath):
+    """
+    Take DAT filepath as input and return parsed data in the form of
+    associated arrays
+    """
+    if not os.path.isfile(dat_filepath):
+        print("File not readable")
+        return
+
+    with open(dat_filepath, "r") as dat_file:
+        content = dat_file.read()
+
+    header = {}
+    game_data = []
+    resources = {}
+
+    matches = match_outermost_brackets(content)
+    if matches:
+        for data_segment in matches:
+            if "clrmamepro" in content[data_segment[1] - 11: data_segment[1]] or \
+                "scummvm" in content[data_segment[1] - 8: data_segment[1]]:
+                map_key_values(data_segment[0], header)
+            elif "game" in content[data_segment[1] - 5: data_segment[1]]:
+                temp = {}
+                map_key_values(data_segment[0], temp)
+                game_data.append(temp)
+            elif "resource" in content[data_segment[1] - 9: data_segment[1]]:
+                temp = {}
+                map_key_values(data_segment[0], temp)
+                resources[temp["name"]] = temp
+
+    return header, game_data, resources, dat_filepath
+
+# Process command line args
+if "--upload" in sys.argv:
+    index = sys.argv.index("--upload")
+    for filepath in sys.argv[index + 1:]:
+        if filepath == "--match":
+            continue
+        db_insert(parse_dat(filepath))
+
+if "--match" in sys.argv:
+    populate_matching_games()
\ No newline at end of file
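
The heart of the port is match_outermost_brackets(), whose quote handling
can be sanity-checked in isolation. An illustrative standalone copy of the
committed function, run on a made-up DAT fragment:

def match_outermost_brackets(text):
    matches, depth, inside_quotes, cur_index = [], 0, False, 0
    for i, char in enumerate(text):
        if char == '(' and not inside_quotes:
            if depth == 0:
                cur_index = i
            depth += 1
        elif char == ')' and not inside_quotes:
            depth -= 1
            if depth == 0:
                matches.append((text[cur_index:i + 1], cur_index))
        elif char == '"' and text[i - 1] != '\\':
            inside_quotes = not inside_quotes
    return matches

sample = 'game (\n\tname "A (quoted) name"\n\trom ( name x size 1 )\n)'
segments = match_outermost_brackets(sample)
assert len(segments) == 1  # quoted and nested parens stay inside one segment
assert segments[0][1] == sample.index('(')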


Commit: 51eb58f36391a9dcd6464eea1e60091df24941a2
    https://github.com/scummvm/scummvm-sites/commit/51eb58f36391a9dcd6464eea1e60091df24941a2
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:41:04+08:00

Commit Message:
INTEGRITY: Add boundary check in dat_parser.py

Changed paths:
    dat_parser.py
    db_functions.py
    fileset.py


diff --git a/dat_parser.py b/dat_parser.py
index 4327175..a92d869 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -5,7 +5,7 @@ from db_functions import db_insert, populate_matching_games
 
 def remove_quotes(string):
     # Remove quotes from value if they are present
-    if string[0] == "\"":
+    if string and string[0] == "\"":
         string = string[1:-1]
 
     return string
@@ -15,13 +15,13 @@ def map_checksum_data(content_string):
     temp = re.findall(r'("[^"]*")|\S+', content_string)
 
     for i in range(1, len(temp), 2):
-        if temp[i] == ')' or temp[i] in ['crc', 'sha1']:
-            continue
-
-        temp[i + 1] = remove_quotes(temp[i + 1])
-        if temp[i + 1] == ')':
-            temp[i + 1] = ""
-        arr[temp[i]] = temp[i + 1].replace("\\", "")
+        if i+1 < len(temp):
+            if temp[i] == ')' or temp[i] in ['crc', 'sha1']:
+                continue
+            temp[i + 1] = remove_quotes(temp[i + 1])
+            if temp[i + 1] == ')':
+                temp[i + 1] = ""
+            arr[temp[i]] = temp[i + 1].replace("\\", "")
 
     return arr
 
diff --git a/db_functions.py b/db_functions.py
index 496d0c6..d1a3393 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -5,6 +5,7 @@ import getpass
 import time
 import hashlib
 import os
+from pymysql.converters import escape_string
 
 def db_connect():
     with open('mysql_config.json') as f:
@@ -56,12 +57,12 @@ def insert_game(engine_name, engineid, title, gameid, extra, platform, lang, con
     # Insert into table if not present
     if not exists:
         with conn.cursor() as cursor:
-            cursor.execute(f"INSERT INTO engine (name, engineid) VALUES ('{pymysql.escape_string(engine_name)}', '{engineid}')")
+            cursor.execute(f"INSERT INTO engine (name, engineid) VALUES ('{escape_string(engine_name)}', '{engineid}')")
             cursor.execute("SET @engine_last = LAST_INSERT_ID()")
 
     # Insert into game
     with conn.cursor() as cursor:
-        cursor.execute(f"INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES ('{pymysql.escape_string(title)}', @engine_last, '{gameid}', '{pymysql.escape_string(extra)}', '{platform}', '{lang}')")
+        cursor.execute(f"INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES ('{escape_string(title)}', @engine_last, '{gameid}', '{escape_string(extra)}', '{platform}', '{lang}')")
         cursor.execute("SET @game_last = LAST_INSERT_ID()")
 
 def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip=''):
@@ -84,7 +85,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         existing_entry = cursor.fetchone()
 
     if existing_entry is not None:
-        existing_entry = existing_entry[0]
+        existing_entry = existing_entry['id']
         with conn.cursor() as cursor:
             cursor.execute(f"SET @fileset_last = {existing_entry}")
 
@@ -94,7 +95,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
             log_text = f"Duplicate of Fileset:{existing_entry}, from user IP {ip}, {log_text}"
 
         user = f'cli:{getpass.getuser()}'
-        create_log(pymysql.escape_string(category_text), user, pymysql.escape_string(log_text))
+        create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
         if not detection:
             return False
@@ -114,14 +115,14 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
     category_text = f"Uploaded from {src}"
     with conn.cursor() as cursor:
         cursor.execute("SELECT @fileset_last")
-        fileset_last = cursor.fetchone()[0]
+        fileset_last = cursor.fetchone()['@fileset_last']
 
     log_text = f"Created Fileset:{fileset_last}, {log_text}"
     if src == 'user':
         log_text = f"Created Fileset:{fileset_last}, from user IP {ip}, {log_text}"
 
     user = f'cli:{getpass.getuser()}'
-    create_log(pymysql.escape_string(category_text), user, pymysql.escape_string(log_text))
+    create_log(escape_string(category_text), user, escape_string(log_text), conn)
     with conn.cursor() as cursor:
         cursor.execute(f"INSERT INTO transactions (`transaction`, fileset) VALUES ({transaction}, {fileset_last})")
 
@@ -139,7 +140,7 @@ def insert_file(file, detection, src, conn):
                 checksize, checktype, checksum = get_checksum_props(key, value)
                 break
 
-    query = f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{pymysql.escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection})"
+    query = f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection})"
     with conn.cursor() as cursor:
         cursor.execute(query)
 
@@ -167,11 +168,11 @@ def delete_filesets(conn):
 
 
 def create_log(category, user, text, conn):
-    query = f"INSERT INTO log (`timestamp`, category, user, `text`) VALUES (FROM_UNIXTIME({int(time.time())}), '{pymysql.escape_string(category)}', '{pymysql.escape_string(user)}', '{pymysql.escape_string(text)}')"
+    query = f"INSERT INTO log (`timestamp`, category, user, `text`) VALUES (FROM_UNIXTIME({int(time.time())}), '{escape_string(category)}', '{escape_string(user)}', '{escape_string(text)}')"
     with conn.cursor() as cursor:
         cursor.execute(query)
         cursor.execute("SELECT LAST_INSERT_ID()")
-        log_last = cursor.fetchone()[0]
+        log_last = cursor.fetchone()['LAST_INSERT_ID()']
 
     try:
         conn.commit()
@@ -224,13 +225,15 @@ def db_insert(data_arr):
 
     conn.cursor().execute(f"SET @fileset_time_last = {int(time.time())}")
 
-    transaction_id = conn.cursor().execute("SELECT MAX(`transaction`) FROM transactions").fetchone()[0] + 1
+    with conn.cursor() as cursor:
+        cursor.execute("SELECT MAX(`transaction`) FROM transactions")
+        transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
 
     category_text = f"Uploaded from {src}"
     log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Transaction: {transaction_id}"
 
     user = f'cli:{getpass.getuser()}'
-    create_log(pymysql.escape_string(conn, category_text), user, pymysql.escape_string(conn, log_text))
+    create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
     for fileset in game_data:
         if detection:
@@ -260,16 +263,17 @@ def db_insert(data_arr):
 
     if detection:
         conn.cursor().execute("UPDATE fileset SET status = 'obsolete' WHERE `timestamp` != FROM_UNIXTIME(@fileset_time_last) AND status = 'detection'")
-
-    fileset_insertion_count = conn.cursor().execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}").fetchone()[0]
+    cur = conn.cursor()
+    cur.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
+    fileset_insertion_count = cur.fetchone()['COUNT(fileset)']
     category_text = f"Uploaded from {src}"
     log_text = f"Completed loading DAT file, filename '{filepath}', size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
-
+    
     if not conn.commit():
         print("Inserting failed")
     else:
         user = f'cli:{getpass.getuser()}'
-        create_log(pymysql.escape_string(conn, category_text), user, pymysql.escape_string(conn, log_text))
+        create_log(escape_string(conn, category_text), user, escape_string(conn, log_text))
 
 def compare_filesets(id1, id2, conn):
     with conn.cursor() as cursor:
@@ -323,7 +327,7 @@ def find_matching_game(game_files):
     for key, value in Counter(matching_filesets).items():
         with conn.cursor() as cursor:
             cursor.execute(f"SELECT COUNT(file.id) FROM file JOIN fileset ON file.fileset = fileset.id WHERE fileset.id = '{key}'")
-            count_files_in_fileset = cursor.fetchone()[0]
+            count_files_in_fileset = cursor.fetchone()['COUNT(file.id)']
 
         # We use < instead of != since one file may have more than one entry in the fileset
         # We see this in Drascula English version, where one entry is duplicated
@@ -373,7 +377,7 @@ def merge_filesets(detection_id, dat_id):
 
         cursor.execute(f"INSERT INTO history (`timestamp`, fileset, oldfileset) VALUES (FROM_UNIXTIME({int(time.time())}), {dat_id}, {detection_id})")
         cursor.execute("SELECT LAST_INSERT_ID()")
-        history_last = cursor.fetchone()[0]
+        history_last = cursor.fetchone()['LAST_INSERT_ID()']
 
         cursor.execute(f"UPDATE history SET fileset = {dat_id} WHERE fileset = {detection_id}")
 
@@ -437,10 +441,10 @@ def populate_matching_games():
             user = f'cli:{getpass.getuser()}'
 
             # Merge log
-            create_log("Fileset merge", user, pymysql.escape_string(conn, f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0][0]}"))
+            create_log("Fileset merge", user, escape_string(conn, f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0][0]}"))
 
             # Matching log
-            log_last = create_log(pymysql.escape_string(conn, category_text), user, pymysql.escape_string(conn, log_text))
+            log_last = create_log(escape_string(conn, category_text), user, escape_string(conn, log_text))
 
             # Add log id to the history table
             cursor.execute(f"UPDATE history SET log = {log_last} WHERE id = {history_last}")
diff --git a/fileset.py b/fileset.py
index fd8b954..78966a6 100644
--- a/fileset.py
+++ b/fileset.py
@@ -264,7 +264,7 @@ def games_list():
     select_query = """
     SELECT engineid, gameid, extra, platform, language, game.name,
     status, fileset.id as fileset
-    FROM fileset
+    FROM game
     JOIN engine ON engine.id = game.engine
     JOIN fileset ON game.id = fileset.game
     """


Commit: 566914a80d506400a2932e3db9eaf0965baf0df9
    https://github.com/scummvm/scummvm-sites/commit/566914a80d506400a2932e3db9eaf0965baf0df9
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:41:04+08:00

Commit Message:
INTEGRITY: Add logs page in fileset.py

Changed paths:
    db_functions.py
    fileset.py
    pagination.py


diff --git a/db_functions.py b/db_functions.py
index d1a3393..586c342 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -268,12 +268,15 @@ def db_insert(data_arr):
     fileset_insertion_count = cur.fetchone()['COUNT(fileset)']
     category_text = f"Uploaded from {src}"
     log_text = f"Completed loading DAT file, filename '{filepath}', size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
-    
-    if not conn.commit():
-        print("Inserting failed")
+
+    try:
+        conn.commit()
+    except Exception as e:
+        conn.rollback()
+        print("Inserting failed:", e)
     else:
         user = f'cli:{getpass.getuser()}'
-        create_log(escape_string(conn, category_text), user, escape_string(conn, log_text))
+        create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
 def compare_filesets(id1, id2, conn):
     with conn.cursor() as cursor:
diff --git a/fileset.py b/fileset.py
index 78966a6..6338587 100644
--- a/fileset.py
+++ b/fileset.py
@@ -284,5 +284,21 @@ def games_list():
     }
     return render_template_string(create_page(filename, 25, records_table, select_query, order, filters, mapping))
 
+@app.route('/logs')
+def logs():
+    filename = "logs"
+    records_table = "log"
+    select_query = "SELECT id, `timestamp`, category, user, `text` FROM log"
+    order = "ORDER BY `timestamp` DESC, id DESC"
+    filters = {
+        'id': 'log',
+        'timestamp': 'log',
+        'category': 'log',
+        'user': 'log',
+        'text': 'log'
+    }
+    return render_template_string(create_page(filename, 25, records_table, select_query, order, filters))
+
+
 if __name__ == '__main__':
     app.run()
\ No newline at end of file
diff --git a/pagination.py b/pagination.py
index 20b7004..d43fefa 100644
--- a/pagination.py
+++ b/pagination.py
@@ -105,15 +105,15 @@ def create_page(filename, results_per_page, records_table, select_query, order,
 
     # Generate HTML
     html = f"""
-    <!DOCTYPE html>
-        <html>
-        <head>
-            <link rel="stylesheet" type="text/css" href="{{{{ url_for('static', filename='style.css') }}}}">
-        </head>
-        <body>
-    <form id='filters-form' method='GET' onsubmit='remove_empty_inputs()'>
-    <table>
-    """
+<!DOCTYPE html>
+    <html>
+    <head>
+        <link rel="stylesheet" type="text/css" href="{{{{ url_for('static', filename='style.css') }}}}">
+    </head>
+    <body>
+<form id='filters-form' method='GET' onsubmit='remove_empty_inputs()'>
+<table>
+"""
     if not results:
         return "No results for given filters"
     if results:
@@ -132,26 +132,67 @@ def create_page(filename, results_per_page, records_table, select_query, order,
             if key == 'fileset':
                 continue
             vars = "&".join([f"{k}={v}" for k, v in request.args.items() if k != 'sort'])
-            if f"&sort={key}" not in vars:
-                html += f"<th><a href='{filename}?{vars}&sort={key}'>{key}</a></th>"
+            sort = request.args.get('sort', '')
+            if sort == key:
+                vars += f"&sort={key}-desc"
             else:
-                html += f"<th><a href='{filename}?{vars}'>{key}</a></th>"
+                vars += f"&sort={key}"
+            html += f"<th><a href='{filename}?{vars}'>{key}</a></th>"
 
         counter = offset + 1
         for row in results:
+            if counter == offset + 1:  # If it is the first run of the loop
+                if filters:
+                    html += "<tr class='filter'><td></td>"
+                    for key in row.keys():
+                        if key not in filters:
+                            html += "<td class='filter'></td>"
+                            continue
+
+                        # Filter textbox
+                        filter_value = request.args.get(key, "")
+
+                        html += f"<td class='filter'><input type='text' class='filter' placeholder='{key}' name='{key}' value='{filter_value}'/></td>\n"
+                    html += "</tr>"
+                    html += "<tr class='filter'><td></td><td class='filter'><input type='submit' value='Submit'></td></tr>"
+
+                html += "<th></th>\n"  # Numbering column
+                for key in row.keys():
+                    if key == 'fileset':
+                        continue
+
+                    # Preserve GET variables
+                    vars = "&".join([f"{k}={v}" for k, v in request.args.items() if k != 'sort'])
+                    if request.args.get('sort', '') == key:
+                        vars += f"&sort={key}-desc"
+                    else:
+                        vars += f"&sort={key}"
+
+                    if f"&sort={key}" not in vars:
+                        html += f"<th><a href='{filename}?{vars}&sort={key}'>{key}</th>\n"
+                    else:
+                        html += f"<th><a href='{filename}?{vars}'>{key}</th>\n"
+
             if filename in ['games_list', 'user_games_list']:
-                html += f"<tr class='games_list' onclick='hyperlink(\"fileset?id={row['fileset']}\")'>"
+                html += f"<tr class='games_list' onclick='hyperlink(\"fileset?id={row['fileset']}\")'>\n"
             else:
-                html += "<tr>"
-            html += f"<td>{counter}.</td>"
+                html += "<tr>\n"
+            html += f"<td>{counter}.</td>\n"
             for key, value in row.items():
                 if key == 'fileset':
                     continue
-                matches = re.search(r"Fileset:(\d+)", value)
-                if matches:
-                    value = re.sub(r"Fileset:(\d+)", f"<a href='fileset?id={matches.group(1)}'>Fileset:{matches.group(1)}</a>", value)
-                html += f"<td>{value}</td>"
-            html += "</tr>"
+
+                # Add links to fileset in logs table
+                if isinstance(value, str):
+                    matches = re.search(r"Fileset:(\d+)", value)
+                    if matches:
+                        fileset_id = matches.group(1)
+                        fileset_text = matches.group(0)
+                        value = value.replace(fileset_text, f"<a href='fileset?id={fileset_id}'>{fileset_text}</a>")
+
+                html += f"<td>{value}</td>\n"
+            html += "</tr>\n"
+
             counter += 1
 
     html += "</table></form>"


Commit: f3ab4dd4022bedd8b6a00e33fd0102d2674015d1
    https://github.com/scummvm/scummvm-sites/commit/f3ab4dd4022bedd8b6a00e33fd0102d2674015d1
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:41:04+08:00

Commit Message:
INTEGRITY: Remove the commit of read operation

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 586c342..6d53ab1 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -170,14 +170,12 @@ def delete_filesets(conn):
 def create_log(category, user, text, conn):
     query = f"INSERT INTO log (`timestamp`, category, user, `text`) VALUES (FROM_UNIXTIME({int(time.time())}), '{escape_string(category)}', '{escape_string(user)}', '{escape_string(text)}')"
     with conn.cursor() as cursor:
-        cursor.execute(query)
-        cursor.execute("SELECT LAST_INSERT_ID()")
-        log_last = cursor.fetchone()['LAST_INSERT_ID()']
-
-    try:
-        conn.commit()
-    except:
-        print("Creating log failed")
+        try:
+            cursor.execute(query)
+            cursor.execute("SELECT LAST_INSERT_ID()")
+            log_last = cursor.fetchone()['LAST_INSERT_ID()']
+        except:
+            print("Creating log failed")
 
     return log_last
 
@@ -264,15 +262,14 @@ def db_insert(data_arr):
     if detection:
         conn.cursor().execute("UPDATE fileset SET status = 'obsolete' WHERE `timestamp` != FROM_UNIXTIME(@fileset_time_last) AND status = 'detection'")
     cur = conn.cursor()
-    cur.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
+    
     fileset_insertion_count = cur.fetchone()['COUNT(fileset)']
     category_text = f"Uploaded from {src}"
     log_text = f"Completed loading DAT file, filename '{filepath}', size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
 
     try:
-        conn.commit()
+        cur.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
     except Exception as e:
-        conn.rollback()
         print("Inserting failed:", e)
     else:
         user = f'cli:{getpass.getuser()}'
@@ -365,31 +362,32 @@ def find_matching_game(game_files):
 def merge_filesets(detection_id, dat_id):
     conn = db_connect()
 
-    with conn.cursor() as cursor:
-        cursor.execute(f"SELECT DISTINCT(filechecksum.checksum), checksize, checktype FROM filechecksum JOIN file on file.id = filechecksum.file WHERE fileset = '{detection_id}'")
-        detection_files = cursor.fetchall()
-
-        for file in detection_files:
-            checksum = file[0]
-            checksize = file[1]
-            checktype = file[2]
-
-            cursor.execute(f"DELETE FROM file WHERE checksum = '{checksum}' AND fileset = {detection_id} LIMIT 1")
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute(f"SELECT DISTINCT(filechecksum.checksum), checksize, checktype FROM filechecksum JOIN file on file.id = filechecksum.file WHERE fileset = '{detection_id}'")
+            detection_files = cursor.fetchall()
 
-            cursor.execute(f"UPDATE file JOIN filechecksum ON filechecksum.file = file.id SET detection = TRUE, checksize = {checksize}, checktype = '{checktype}' WHERE fileset = '{dat_id}' AND filechecksum.checksum = '{checksum}'")
+            for file in detection_files:
+                checksum = file[0]
+                checksize = file[1]
+                checktype = file[2]
 
-        cursor.execute(f"INSERT INTO history (`timestamp`, fileset, oldfileset) VALUES (FROM_UNIXTIME({int(time.time())}), {dat_id}, {detection_id})")
-        cursor.execute("SELECT LAST_INSERT_ID()")
-        history_last = cursor.fetchone()['LAST_INSERT_ID()']
+                cursor.execute(f"DELETE FROM file WHERE checksum = '{checksum}' AND fileset = {detection_id} LIMIT 1")
+                cursor.execute(f"UPDATE file JOIN filechecksum ON filechecksum.file = file.id SET detection = TRUE, checksize = {checksize}, checktype = '{checktype}' WHERE fileset = '{dat_id}' AND filechecksum.checksum = '{checksum}'")
 
-        cursor.execute(f"UPDATE history SET fileset = {dat_id} WHERE fileset = {detection_id}")
+            cursor.execute(f"INSERT INTO history (`timestamp`, fileset, oldfileset) VALUES (FROM_UNIXTIME({int(time.time())}), {dat_id}, {detection_id})")
+            cursor.execute("SELECT LAST_INSERT_ID()")
+            history_last = cursor.fetchone()['LAST_INSERT_ID()']
 
-        cursor.execute(f"DELETE FROM fileset WHERE id = {detection_id}")
+            cursor.execute(f"UPDATE history SET fileset = {dat_id} WHERE fileset = {detection_id}")
+            cursor.execute(f"DELETE FROM fileset WHERE id = {detection_id}")
 
-        try:
-            conn.commit()
-        except:
-            print("Error merging filesets")
+        conn.commit()
+    except Exception as e:
+        conn.rollback()
+        print(f"Error merging filesets: {e}")
+    finally:
+        conn.close()
 
     return history_last
 
@@ -452,5 +450,7 @@ def populate_matching_games():
             # Add log id to the history table
             cursor.execute(f"UPDATE history SET log = {log_last} WHERE id = {history_last}")
 
-        if not conn.commit():
+        try:
+            conn.commit()
+        except:
             print("Updating matched games failed")
\ No newline at end of file


Commit: 7ac0958b105230b22d7d7b5bea2cd954b38e1305
    https://github.com/scummvm/scummvm-sites/commit/7ac0958b105230b22d7d7b5bea2cd954b38e1305
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:41:04+08:00

Commit Message:
INTEGRITY: Fix create_log in db_functions

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 6d53ab1..ca46900 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -172,11 +172,14 @@ def create_log(category, user, text, conn):
     with conn.cursor() as cursor:
         try:
             cursor.execute(query)
+            conn.commit()
+        except Exception as e:
+            conn.rollback()
+            print(f"Creating log failed: {e}")
+            log_last = None
+        else:
             cursor.execute("SELECT LAST_INSERT_ID()")
             log_last = cursor.fetchone()['LAST_INSERT_ID()']
-        except:
-            print("Creating log failed")
-
     return log_last
 
 def calc_key(fileset):
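
For what create_log() does here, pymysql also exposes cursor.lastrowid,
which removes the follow-up SELECT LAST_INSERT_ID(). An illustrative
variant that additionally uses parameterized placeholders instead of
escape_string (a deviation from the committed code, not a drop-in
replacement):

import time

def create_log(category, user, text, conn):
    # Placeholders do the escaping; lastrowid replaces the extra SELECT.
    query = ("INSERT INTO log (`timestamp`, category, user, `text`) "
             "VALUES (FROM_UNIXTIME(%s), %s, %s, %s)")
    with conn.cursor() as cursor:
        try:
            cursor.execute(query, (int(time.time()), category, user, text))
            conn.commit()
        except Exception as e:
            conn.rollback()
            print(f"Creating log failed: {e}")
            return None
        return cursor.lastrowid  # id of the row just inserted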


Commit: 41541b38b495bcf2c8953da297fc158ea4e9e5b7
    https://github.com/scummvm/scummvm-sites/commit/41541b38b495bcf2c8953da297fc158ea4e9e5b7
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:41:04+08:00

Commit Message:
INTEGRITY: Add index page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 6338587..14637fb 100644
--- a/fileset.py
+++ b/fileset.py
@@ -21,6 +21,31 @@ conn = pymysql.connect(
     autocommit=False
 )
 
+@app.route('/')
+def index():
+    html = """
+    <!DOCTYPE html>
+    <html>
+    <head>
+        <link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
+    </head>
+    <body>
+    <h1>Fileset Database</h1>
+    <h2>Fileset Actions</h2>
+    <ul>
+        <li><a href="{{ url_for('fileset') }}">Fileset</a></li>
+        <li><a href="{{ url_for('user_games_list') }}">User Games List</a></li>
+        <li><a href="{{ url_for('games_list') }}">Games List</a></li>
+    </ul>
+    <h2>Logs</h2>
+    <ul>
+        <li><a href="{{ url_for('logs') }}">Logs</a></li>
+    </ul>
+    </body>
+    </html>
+    """
+    return render_template_string(html)
+
 @app.route('/fileset', methods=['GET', 'POST'])
 def fileset():
     id = request.args.get('id', default = 1, type = int)


Commit: a4564b15158b25de89d5b86c3f7774e79b6dba3e
    https://github.com/scummvm/scummvm-sites/commit/a4564b15158b25de89d5b86c3f7774e79b6dba3e
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-11T19:41:04+08:00

Commit Message:
INTEGRITY: Delete original php files

Changed paths:
  R bin/dat_parser.php
  R bin/schema.php
  R bin/seeds.php
  R endpoints/validate.php
  R fileset.php
  R games_list.php
  R include/db_functions.php
  R include/pagination.php
  R include/user_fileset_functions.php
  R index.php
  R logs.php
  R mod_actions.php
  R user_games_list.php


diff --git a/bin/dat_parser.php b/bin/dat_parser.php
deleted file mode 100644
index 92cd848..0000000
--- a/bin/dat_parser.php
+++ /dev/null
@@ -1,167 +0,0 @@
-<?php
-
-require __DIR__ . '/../include/db_functions.php';
-ini_set('memory_limit', '512M');
-
-function remove_quotes($string) {
-  // Remove quotes from value if they are present
-  if ($string[0] == "\"")
-    $string = substr($string, 1, -1);
-
-  return $string;
-}
-
-/**
- * Convert string of checksum data from rom into associated array
- * Returns array instead of updating one like map_key_values
- */
-function map_checksum_data($content_string) {
-  $arr = array();
-  $temp = preg_split('/("[^"]*")|\h+/', $content_string, -1, PREG_SPLIT_NO_EMPTY | PREG_SPLIT_DELIM_CAPTURE);
-
-  for ($i = 1; $i < count($temp); $i += 2) {
-    if ($temp[$i] == ')')
-      continue;
-
-    if ($temp[$i] == 'crc' || $temp[$i] == 'sha1')
-      continue;
-
-    $temp[$i + 1] = remove_quotes($temp[$i + 1]);
-    if ($temp[$i + 1] == ')')
-      $temp[$i + 1] = "";
-    $arr[$temp[$i]] = stripslashes($temp[$i + 1]);
-  }
-
-  return $arr;
-}
-
-/**
- * Convert string as received by regex parsing to associated array
- */
-function map_key_values($content_string, &$arr) {
-
-  // Split by newline into different pairs
-  $temp = preg_split("/\r\n|\n|\r/", $content_string);
-
-  // Add pairs to the associated array if they are not parantheses
-  foreach ($temp as $pair) {
-    if (trim($pair) == "(" or trim($pair) == ")")
-      continue;
-    $pair = array_map("trim", preg_split("/ +/", $pair, 2));
-    $pair[1] = remove_quotes($pair[1]);
-
-    // Handle duplicate keys (if the key is rom) and add values to a arary instead
-    if ($pair[0] == "rom") {
-      if (array_key_exists($pair[0], $arr)) {
-        array_push($arr[$pair[0]], map_checksum_data($pair[1]));
-      }
-      else {
-        $arr[$pair[0]] = array(map_checksum_data($pair[1]));
-      }
-    }
-    else {
-      $arr[$pair[0]] = stripslashes($pair[1]);
-    }
-  }
-}
-
-/**
- * Parse DAT file and separate the contents each segment into an array
- * Segments are of the form `scummvm ( )`, `game ( )` etc.
- */
-function match_outermost_brackets($input) {
-  $matches = array();
-  $depth = 0;
-  $inside_quotes = false;
-  $cur_index = 0;
-
-  for ($i = 0; $i < strlen($input); $i++) {
-    $char = $input[$i];
-
-    if ($char == '(' && !$inside_quotes) {
-      if ($depth === 0) {
-        $cur_index = $i;
-      }
-      $depth++;
-    }
-    elseif ($char == ')' && !$inside_quotes) {
-      $depth--;
-      if ($depth === 0) {
-        $match = substr($input, $cur_index, $i - $cur_index + 1);
-        array_push($matches, array($match, $cur_index));
-      }
-    }
-    elseif ($char == '"' && $input[$i - 1] != '\\') {
-      $inside_quotes = !$inside_quotes;
-    }
-  }
-
-  return $matches;
-}
-
-/**
- * Take DAT filepath as input and return parsed data in the form of
- * associated arrays
- */
-function parse_dat($dat_filepath) {
-  $dat_file = fopen($dat_filepath, "r") or die("Unable to open file!");
-  $content = fread($dat_file, filesize($dat_filepath));
-  fclose($dat_file);
-
-  if (!$content) {
-    error_log("File not readable");
-  }
-
-  $header = array();
-  $game_data = array();
-  $resources = array();
-
-  $matches = match_outermost_brackets($content);
-  if ($matches) {
-    foreach ($matches as $data_segment) {
-      if (strpos(substr($content, $data_segment[1] - 11, 11), "clrmamepro") !== false ||
-        strpos(substr($content, $data_segment[1] - 8, 8), "scummvm") !== false) {
-        map_key_values($data_segment[0], $header);
-      }
-      elseif (strpos(substr($content, $data_segment[1] - 5, 5), "game") !== false) {
-        $temp = array();
-        map_key_values($data_segment[0], $temp);
-        array_push($game_data, $temp);
-      }
-      elseif (strpos(substr($content, $data_segment[1] - 9, 9), "resource") !== false) {
-        $temp = array();
-        map_key_values($data_segment[0], $temp);
-        $resources[$temp["name"]] = $temp;
-      }
-    }
-
-  }
-
-  // Print statements for debugging
-  // Uncomment to see parsed data
-
-  // echo "<pre>";
-  // print_r($header);
-  // print_r($game_data);
-  // print_r($resources);
-  // echo "</pre>";
-
-  return array($header, $game_data, $resources, $dat_filepath);
-}
-
-// Process command line args
-if ($index = array_search("--upload", $argv)) {
-  foreach (array_slice($argv, $index + 1) as $filepath) {
-    if ($filepath == "--match")
-      continue;
-
-    db_insert(parse_dat($filepath));
-  }
-}
-
-if (in_array("--match", $argv)) {
-  populate_matching_games();
-}
-
-?>
-
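
For reference, a minimal sketch of the DAT shape this parser consumed (segment and key names as handled by the code above; the concrete values are illustrative):

    game (
        name "drascula"
        title "Drascula: The Vampire Strikes Back"
        rom ( name "Packet.001" size 32847563 md5-5000 "c6a8697396e213a18472542d5f547cb4" )
    )

parse_dat() would then return this segment inside $game_data, roughly as:

    array(
        "name"  => "drascula",
        "title" => "Drascula: The Vampire Strikes Back",
        "rom"   => array(array("name" => "Packet.001", "size" => "32847563",
                               "md5-5000" => "c6a8697396e213a18472542d5f547cb4"))
    )
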
diff --git a/bin/schema.php b/bin/schema.php
deleted file mode 100644
index 108f16d..0000000
--- a/bin/schema.php
+++ /dev/null
@@ -1,246 +0,0 @@
-<?php
-
-$mysql_cred = json_decode(file_get_contents(__DIR__ . '/../mysql_config.json'), true);
-$servername = $mysql_cred["servername"];
-$username = $mysql_cred["username"];
-$password = $mysql_cred["password"];
-$dbname = $mysql_cred["dbname"];
-
-// Create connection
-$conn = new mysqli($servername, $username, $password);
-
-// Check connection
-if ($conn->connect_errno) {
-  die("Connect failed: " . $conn->connect_error);
-}
-
-// Create database
-$sql = "CREATE DATABASE IF NOT EXISTS " . $dbname;
-if ($conn->query($sql) === TRUE) {
-  echo "Database created successfully\n";
-}
-else {
-  echo "Error creating database: " . $conn->error;
-  exit();
-}
-
-$conn->query("USE " . $dbname);
-
-
-///////////////////////// CREATE TABLES /////////////////////////
-
-// Create engine table
-$table = "CREATE TABLE IF NOT EXISTS engine (
-  id INT AUTO_INCREMENT PRIMARY KEY,
-  name VARCHAR(200),
-  engineid VARCHAR(100) NOT NULL
-)";
-
-if ($conn->query($table) === TRUE) {
-  echo "Table 'engine' created successfully\n";
-}
-else {
-  echo "Error creating 'engine' table: " . $conn->error;
-}
-
-// Create game table
-$table = "CREATE TABLE IF NOT EXISTS game (
-  id INT AUTO_INCREMENT PRIMARY KEY,
-  name VARCHAR(200),
-  engine INT NOT NULL,
-  gameid VARCHAR(100) NOT NULL,
-  extra VARCHAR(200),
-  platform VARCHAR(30),
-  language VARCHAR(10),
-  FOREIGN KEY (engine) REFERENCES engine(id)
-)";
-
-if ($conn->query($table) === TRUE) {
-  echo "Table 'game' created successfully\n";
-}
-else {
-  echo "Error creating 'game' table: " . $conn->error;
-}
-
-// Create fileset table
-$table = "CREATE TABLE IF NOT EXISTS fileset (
-  id INT AUTO_INCREMENT PRIMARY KEY,
-  game INT,
-  status VARCHAR(20),
-  src VARCHAR(20),
-  `key` VARCHAR(64),
-  `megakey` VARCHAR(64),
-  `delete` BOOLEAN DEFAULT FALSE NOT NULL,
-  `timestamp` TIMESTAMP NOT NULL,
-  detection_size INT,
-  FOREIGN KEY (game) REFERENCES game(id)
-)";
-
-if ($conn->query($table) === TRUE) {
-  echo "Table 'fileset' created successfully\n";
-}
-else {
-  echo "Error creating 'fileset' table: " . $conn->error;
-}
-
-// Create file table
-$table = "CREATE TABLE IF NOT EXISTS file (
-  id INT AUTO_INCREMENT PRIMARY KEY,
-  name VARCHAR(200) NOT NULL,
-  size BIGINT NOT NULL,
-  checksum VARCHAR(64) NOT NULL,
-  fileset INT NOT NULL,
-  detection BOOLEAN NOT NULL,
-  FOREIGN KEY (fileset) REFERENCES fileset(id) ON DELETE CASCADE
-)";
-
-if ($conn->query($table) === TRUE) {
-  echo "Table 'file' created successfully\n";
-}
-else {
-  echo "Error creating 'file' table: " . $conn->error;
-}
-
-// Create filechecksum table
-$table = "CREATE TABLE IF NOT EXISTS filechecksum (
-  id INT AUTO_INCREMENT PRIMARY KEY,
-  file INT NOT NULL,
-  checksize VARCHAR(10) NOT NULL,
-  checktype VARCHAR(10) NOT NULL,
-  checksum VARCHAR(64) NOT NULL,
-  FOREIGN KEY (file) REFERENCES file(id) ON DELETE CASCADE
-)";
-
-if ($conn->query($table) === TRUE) {
-  echo "Table 'filechecksum' created successfully\n";
-}
-else {
-  echo "Error creating 'filechecksum' table: " . $conn->error;
-}
-
-// Create queue table
-$table = "CREATE TABLE IF NOT EXISTS queue (
-  id INT AUTO_INCREMENT PRIMARY KEY,
-  time TIMESTAMP NOT NULL,
-  notes varchar(300),
-  fileset INT,
-  userid INT NOT NULL,
-  commit VARCHAR(64) NOT NULL,
-  FOREIGN KEY (fileset) REFERENCES fileset(id)
-)";
-
-if ($conn->query($table) === TRUE) {
-  echo "Table 'queue' created successfully\n";
-}
-else {
-  echo "Error creating 'queue' table: " . $conn->error;
-}
-
-// Create log table
-$table = "CREATE TABLE IF NOT EXISTS log (
-  id INT AUTO_INCREMENT PRIMARY KEY,
-  `timestamp` TIMESTAMP NOT NULL,
-  category VARCHAR(100) NOT NULL,
-  user VARCHAR(100) NOT NULL,
-  `text` varchar(300)
-)";
-
-if ($conn->query($table) === TRUE) {
-  echo "Table 'log' created successfully\n";
-}
-else {
-  echo "Error creating 'log' table: " . $conn->error;
-}
-
-// Create history table
-$table = "CREATE TABLE IF NOT EXISTS history (
-  id INT AUTO_INCREMENT PRIMARY KEY,
-  `timestamp` TIMESTAMP NOT NULL,
-  fileset INT NOT NULL,
-  oldfileset INT NOT NULL,
-  log INT
-)";
-
-if ($conn->query($table) === TRUE) {
-  echo "Table 'history' created successfully\n";
-}
-else {
-  echo "Error creating 'history' table: " . $conn->error;
-}
-
-// Create transactions table
-$table = "CREATE TABLE IF NOT EXISTS transactions (
-  id INT AUTO_INCREMENT PRIMARY KEY,
-  `transaction` INT NOT NULL,
-  fileset INT NOT NULL
-)";
-
-if ($conn->query($table) === TRUE) {
-  echo "Table 'transactions' created successfully\n";
-}
-else {
-  echo "Error creating 'transactions' table: " . $conn->error;
-}
-
-
-///////////////////////// CREATE INDEX /////////////////////////
-
-// Create indices for fast data retrieval
-// PK and FK are automatically indexed in InnoDB, so they are not included
-$index = "CREATE INDEX detection ON file (detection)";
-
-if ($conn->query($index) === TRUE) {
-  echo "Created index for 'file.detection'\n";
-}
-else {
-  echo "Error creating index for 'file.detection': " . $conn->error;
-}
-
-$index = "CREATE INDEX checksum ON filechecksum (checksum)";
-
-if ($conn->query($index) === TRUE) {
-  echo "Created index for 'filechecksum.checksum'\n";
-}
-else {
-  echo "Error creating index for 'filechecksum.checksum': " . $conn->error;
-}
-
-$index = "CREATE INDEX engineid ON engine (engineid)";
-
-if ($conn->query($index) === TRUE) {
-  echo "Created index for 'engine.engineid'\n";
-}
-else {
-  echo "Error creating index for 'engine.engineid': " . $conn->error;
-}
-
-$index = "CREATE INDEX fileset_key ON fileset (`key`)";
-
-if ($conn->query($index) === TRUE) {
-  echo "Created index for 'fileset.key'\n";
-}
-else {
-  echo "Error creating index for 'fileset.key': " . $conn->error;
-}
-
-$index = "CREATE INDEX status ON fileset (status)";
-
-if ($conn->query($index) === TRUE) {
-  echo "Created index for 'fileset.status'\n";
-}
-else {
-  echo "Error creating index for 'fileset.status': " . $conn->error;
-}
-
-$index = "CREATE INDEX fileset ON history (fileset)";
-
-if ($conn->query($index) === TRUE) {
-  echo "Created index for 'history.fileset'\n";
-}
-else {
-  echo "Error creating index for 'history.fileset': " . $conn->error;
-}
-
-$conn->close();
-?>
-
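
Both bin scripts (and the pages further down) read their credentials from a mysql_config.json one directory up; a minimal sketch of the expected shape, with placeholder values:

    {
        "servername": "localhost",
        "username": "scummvm",
        "password": "secret",
        "dbname": "integrity"
    }
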
diff --git a/bin/seeds.php b/bin/seeds.php
deleted file mode 100644
index 2c3d75f..0000000
--- a/bin/seeds.php
+++ /dev/null
@@ -1,75 +0,0 @@
-<?php
-
-$mysql_cred = json_decode(file_get_contents(__DIR__ . '/../mysql_config.json'), true);
-$servername = $mysql_cred["servername"];
-$username = $mysql_cred["username"];
-$password = $mysql_cred["password"];
-$dbname = $mysql_cred["dbname"];
-
-// Create connection
-$conn = new mysqli($servername, $username, $password);
-
-// Check connection
-if ($conn->connect_errno) {
-  die("Connect failed: " . $conn->connect_error);
-}
-
-$conn->query("USE " . $dbname);
-
-
-///////////////////////// INSERT VALUES /////////////////////////
-
-$query = "INSERT INTO engine (name, engineid)
-VALUES ('Drascula', '1')";
-$conn->query($query);
-$conn->query("SET @engine_last = LAST_INSERT_ID()");
-
-$query = "INSERT INTO game (name, engine, gameid)
-VALUES ('Drascula: The Vampire Strikes Back', @engine_last, '1')";
-$conn->query($query);
-$conn->query("SET @game_last = LAST_INSERT_ID()");
-
-$query = "INSERT INTO file (name, size, checksum)
-VALUES ('Packet.001', '32847563', 'fac946707f07d51696a02c00cc182078')";
-$conn->query($query);
-$conn->query("SET @file_last = LAST_INSERT_ID()");
-
-$query = "INSERT INTO fileset (game, file, status, `key`)
-VALUES (@game_last, @file_last, 0, 'fac946707f07d51696a02c00cc182078')";
-$conn->query($query);
-$conn->query("SET @fileset_last = LAST_INSERT_ID()");
-
-// Checksize: 0 (full checksum)
-$query = "INSERT INTO filechecksum (file, checksize, checktype, checksum)
-VALUES (@file_last, '0', 'md5', 'fac946707f07d51696a02c00cc182078')";
-$conn->query($query);
-$conn->query("SET @filechecksum_last = LAST_INSERT_ID()");
-
-$query = "INSERT INTO fileset_detection (fileset, checksum)
-VALUES (@fileset_last, @filechecksum_last)";
-$conn->query($query);
-
-// Checksize: 5000B
-$query = "INSERT INTO filechecksum (file, checksize, checktype, checksum)
-VALUES (@file_last, '5000', 'md5', 'c6a8697396e213a18472542d5f547cb4')";
-$conn->query($query);
-$conn->query("SET @filechecksum_last = LAST_INSERT_ID()");
-
-$query = "INSERT INTO fileset_detection (fileset, checksum)
-VALUES (@fileset_last, @filechecksum_last)";
-$conn->query($query);
-
-// Checksize: 10000B
-$query = "INSERT INTO filechecksum (file, checksize, checktype, checksum)
-VALUES (@file_last, '10000', 'md5', '695f4152f02b8fa4c1374a0ed04cf996')";
-$conn->query($query);
-$conn->query("SET @filechecksum_last = LAST_INSERT_ID()");
-
-$query = "INSERT INTO fileset_detection (fileset, checksum)
-VALUES (@fileset_last, @filechecksum_last)";
-$conn->query($query);
-
-
-$conn->close();
-?>
-
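
The seed data above chains dependent rows together through MySQL session variables rather than fetching ids back into PHP. A stripped-down sketch of the pattern, assuming $conn is an open mysqli connection:

    <?php
    // Insert the parent row and stash its id in a session variable ...
    $conn->query("INSERT INTO engine (name, engineid) VALUES ('Drascula', '1')");
    $conn->query("SET @engine_last = LAST_INSERT_ID()");
    // ... then reference that variable directly from the child insert,
    // without a SELECT round trip in between.
    $conn->query("INSERT INTO game (name, engine, gameid)
                  VALUES ('Drascula: The Vampire Strikes Back', @engine_last, '1')");
    ?>
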
diff --git a/endpoints/validate.php b/endpoints/validate.php
deleted file mode 100644
index 1963a5c..0000000
--- a/endpoints/validate.php
+++ /dev/null
@@ -1,166 +0,0 @@
-<?php
-require __DIR__ . '/../include/user_fileset_functions.php';
-
-header('Access-Control-Allow-Origin: *');
-header('Content-Type: application/json');
-
-$conn = db_connect();
-
-$error_codes = array(
-  "unknown" => -1,
-  "success" => 0,
-  "empty" => 2,
-  "no_metadata" => 3,
-);
-
-$json_string = file_get_contents('php://input');
-$json_object = json_decode($json_string);
-
-$ip = $_SERVER['REMOTE_ADDR'];
-// Keep only the first 3 octets, mask the 4th octet as '.X'
-// FIXME: Assumes IPv4
-$ip = implode('.', array_slice(explode('.', $ip), 0, 3)) . '.X';
-
-$game_metadata = array();
-foreach ($json_object as $key => $value) {
-  if ($key == 'files')
-    continue;
-
-  $game_metadata[$key] = $value;
-}
-
-$json_response = array(
-  'error' => $error_codes['success'],
-  'files' => array()
-);
-
-if (count($game_metadata) == 0) {
-  if (count($json_object->files) == 0) {
-    $json_response['error'] = $error_codes['empty'];
-    unset($json_response['files']);
-    $json_response['status'] = 'empty_fileset';
-
-
-    $json_response = json_encode($json_response);
-    echo $json_response;
-    return;
-  }
-
-  $json_response['error'] = $error_codes['no_metadata'];
-  unset($json_response['files']);
-  $json_response['status'] = 'no_metadata';
-
-  $fileset_id = user_insert_fileset($json_object->files, $ip, $conn);
-  $json_response['fileset'] = $fileset_id;
-
-  $json_response = json_encode($json_response);
-  echo $json_response;
-  return;
-}
-
-// Find game(s) that fit the metadata
-$query = "SELECT game.id FROM game
-JOIN engine ON game.engine = engine.id
-WHERE gameid = '{$game_metadata['gameid']}'
-AND engineid = '{$game_metadata['engineid']}'
-AND platform = '{$game_metadata['platform']}'
-AND language = '{$game_metadata['language']}'";
-$games = $conn->query($query);
-
-if ($games->num_rows == 0) {
-  $json_response['error'] = $error_codes['unknown'];
-  unset($json_response['files']);
-  $json_response['status'] = 'unknown_variant';
-
-  $fileset_id = user_insert_fileset($json_object->files, $ip, $conn);
-  $json_response['fileset'] = $fileset_id;
-}
-
-// Check if all files in the (first) fileset are present with user
-while ($game = $games->fetch_array()) {
-  $fileset = $conn->query("SELECT file.id, name, size FROM file
-  JOIN fileset ON fileset.id = file.fileset
-  WHERE fileset.game = {$game['id']} AND
-  (status = 'fullmatch' OR status = 'partialmatch' OR status = 'detection')");
-
-  if ($fileset->num_rows == 0)
-    continue;
-
-  // Convert checktype, checksize to checkcode
-  $fileset = $fileset->fetch_all(MYSQLI_ASSOC);
-  foreach (array_values($fileset) as $index => $file) {
-    $spec_checksum_res = $conn->query("SELECT checksum, checksize, checktype
-    FROM filechecksum WHERE file = {$file['id']}");
-
-    while ($spec_checksum = $spec_checksum_res->fetch_assoc()) {
-      $fileset[$index][$spec_checksum['checktype'] . '-' . $spec_checksum['checksize']] = $spec_checksum['checksum'];
-    }
-  }
-
-  $file_object = $json_object->files;
-
-  // Sort the filesets by filename
-  usort($file_object, function ($a, $b) {
-    return strcmp($a->name, $b->name);
-  });
-  usort($fileset, function ($a, $b) {
-    return strcmp($a['name'], $b['name']);
-  });
-
-  for ($i = 0, $j = 0; $i < count($fileset) && $j < count($file_object); $i++, $j++) {
-    $status = 'ok';
-    $db_file = $fileset[$i];
-    $user_file = $file_object[$j];
-    $filename = strtolower($user_file->name);
-
-    if (strtolower($db_file['name']) != $filename) {
-      if (strtolower($db_file['name']) > $filename) {
-        $status = 'unknown_file';
-        $i--; // Retain same db_file for next iteration
-      }
-      else {
-        $status = 'missing';
-        $filename = $db_file['name'];
-        $j--; // Retain same user_file for next iteration
-      }
-    }
-    elseif ($db_file['size'] != $user_file->size && $status == 'ok') {
-      $status = 'size_mismatch';
-    }
-
-    if ($status == 'ok') {
-      foreach ($user_file->checksums as $checksum_data) {
-        foreach ($checksum_data as $key => $value) {
-          $user_checkcode = $checksum_data->type;
-          // If it's not the full checksum
-          if (strpos($user_checkcode, '-') !== false)
-            continue;
-
-          $user_checksum = $checksum_data->checksum;
-          $user_checkcode .= '-0';
-
-          if (strcasecmp($db_file[$user_checkcode], $user_checksum) != 0)
-            $status = 'checksum_mismatch';
-
-          break;
-        }
-      }
-    }
-
-    if ($status != 'ok') {
-      $json_response['error'] = 1;
-
-      $fileset_id = user_insert_fileset($json_object->files, $ip, $conn);
-      $json_response['fileset'] = $fileset_id;
-    }
-
-    array_push($json_response['files'], array('status' => $status, 'name' => $filename));
-  }
-
-  break;
-}
-
-$json_response = json_encode($json_response);
-echo $json_response;
-?>
-
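
For context, the request and response shapes this endpoint worked with, reconstructed from the field accesses above (the concrete values are illustrative):

    POST /api/validate
    {
        "gameid": "drascula", "engineid": "drascula",
        "platform": "DOS", "language": "en",
        "files": [
            { "name": "Packet.001", "size": 32847563,
              "checksums": [ { "type": "md5", "checksum": "fac946707f07d51696a02c00cc182078" } ] }
        ]
    }

    { "error": 0, "files": [ { "status": "ok", "name": "packet.001" } ] }

The per-file status is one of ok, missing, unknown_file, size_mismatch or checksum_mismatch; file names are compared case-insensitively.
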
diff --git a/fileset.php b/fileset.php
deleted file mode 100644
index 73a09e5..0000000
--- a/fileset.php
+++ /dev/null
@@ -1,215 +0,0 @@
-<?php
-require __DIR__ . '/include/pagination.php';
-require __DIR__ . '/include/user_fileset_functions.php';
-
-$filename = 'fileset.php';
-$stylesheet = 'style.css';
-$jquery_file = 'https://code.jquery.com/jquery-3.7.0.min.js';
-$js_file = 'js_functions.js';
-echo "<link rel='stylesheet' href='{$stylesheet}'>\n";
-echo "<script type='text/javascript' src='{$jquery_file}'></script>\n";
-echo "<script type='text/javascript' src='{$js_file}'></script>\n";
-
-
-$mysql_cred = json_decode(file_get_contents(__DIR__ . '/mysql_config.json'), true);
-$servername = $mysql_cred["servername"];
-$username = $mysql_cred["username"];
-$password = $mysql_cred["password"];
-$dbname = $mysql_cred["dbname"];
-
-// Create connection
-mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
-$conn = new mysqli($servername, $username, $password);
-$conn->set_charset('utf8mb4');
-$conn->autocommit(FALSE);
-
-// Check connection
-if ($conn->connect_errno) {
-  die("Connect failed: " . $conn->connect_error);
-}
-
-$conn->query("USE " . $dbname);
-
-$min_id = $conn->query("SELECT MIN(id) FROM fileset")->fetch_array()[0];
-if (!isset($_GET['id'])) {
-  $id = $min_id;
-}
-else {
-  $max_id = $conn->query("SELECT MAX(id) FROM fileset")->fetch_array()[0];
-  $id = max($min_id, min($_GET['id'], $max_id));
-  if ($conn->query("SELECT id FROM fileset WHERE id = {$id}")->num_rows == 0)
-    $id = $conn->query("SELECT fileset FROM history WHERE oldfileset = {$id}")->fetch_array()[0];
-}
-
-$history = $conn->query("SELECT `timestamp`, oldfileset, log
-FROM history WHERE fileset = {$id}
-ORDER BY `timestamp`");
-
-
-// Display fileset details
-echo "<h2><u>Fileset: {$id}</u></h2>";
-
-$result = $conn->query("SELECT * FROM fileset WHERE id = {$id}")->fetch_assoc();
-
-echo "<h3>Fileset details</h3>";
-echo "<table>\n";
-if ($result['game']) {
-  $temp = $conn->query("SELECT game.name as 'game name', engineid, gameid, extra, platform, language
-FROM fileset JOIN game ON game.id = fileset.game JOIN engine ON engine.id = game.engine
-WHERE fileset.id = {$id}");
-  $result = array_merge($result, $temp->fetch_assoc());
-}
-else {
-  unset($result['key']);
-  unset($result['status']);
-  unset($result['delete']);
-}
-
-foreach (array_keys($result) as $column) {
-  if ($column == 'id' || $column == 'game')
-    continue;
-
-  echo "<th>{$column}</th>\n";
-}
-
-echo "<tr>\n";
-foreach ($result as $column => $value) {
-  if ($column == 'id' || $column == 'game')
-    continue;
-
-  echo "<td>{$value}</td>";
-}
-echo "</tr>\n";
-echo "</table>\n";
-
-echo "<h3>Files in the fileset</h3>";
-echo "<form>";
-// Preserve GET variables on form submit
-foreach ($_GET as $k => $v) {
-  if ($k == 'widetable')
-    continue;
-
-  $k = htmlspecialchars($k);
-  $v = htmlspecialchars($v);
-  echo "<input type='hidden' name='{$k}' value='{$v}'>";
-}
-
-// TODO: Come up with a better solution to set widetable=true on button click
-// Currently uses hidden text input
-if (isset($_GET['widetable']) && $_GET['widetable'] == 'true') {
-  echo "<input class='hidden' type='text' name='widetable' value='false' />";
-  echo "<input type='submit' value='Hide extra checksums' />";
-}
-else {
-  echo "<input class='hidden' type='text' name='widetable' value='true' />";
-  echo "<input type='submit' value='Expand Table' />";
-}
-
-echo "</form>";
-
-// Table
-echo "<table>\n";
-
-$result = $conn->query("SELECT file.id, name, size, checksum, detection
-  FROM file WHERE fileset = {$id}")->fetch_all(MYSQLI_ASSOC);
-
-if (isset($_GET['widetable']) && $_GET['widetable'] == 'true') {
-  foreach (array_values($result) as $index => $file) {
-    $spec_checksum_res = $conn->query("SELECT checksum, checksize, checktype
-    FROM filechecksum WHERE file = {$file['id']}");
-
-    while ($spec_checksum = $spec_checksum_res->fetch_assoc()) {
-      // md5-0 is skipped since it is already shown as file.checksum
-      if ($spec_checksum['checksize'] == 0)
-        continue;
-
-      $result[$index][$spec_checksum['checktype'] . '-' . $spec_checksum['checksize']] = $spec_checksum['checksum'];
-    }
-  }
-}
-
-$counter = 1;
-foreach ($result as $row) {
-  if ($counter == 1) {
-    echo "<th/>\n"; // Numbering column
-    foreach (array_keys($row) as $index => $key) {
-      if ($key == 'id')
-        continue;
-
-      echo "<th>{$key}</th>\n";
-    }
-  }
-
-  echo "<tr>\n";
-  echo "<td>{$counter}.</td>\n";
-  foreach ($row as $key => $value) {
-    if ($key == 'id')
-      continue;
-
-    echo "<td>{$value}</td>\n";
-  }
-  echo "</tr>\n";
-
-  $counter++;
-}
-echo "</table>\n";
-
-// Dev Actions
-echo "<h3>Developer Actions</h3>";
-echo "<button id='delete-button' type='button' onclick='delete_id({$id})'>Mark Fileset for Deletion</button>";
-echo "<button id='match-button' type='button' onclick='match_id({$id})'>Match and Merge Fileset</button>";
-
-if (isset($_POST['delete'])) {
-  $conn->query("UPDATE fileset SET `delete` = TRUE WHERE id = {$_POST['delete']}");
-  $conn->commit();
-}
-if (isset($_POST['match'])) {
-  match_and_merge_user_filesets($_POST['match']);
-  header("Location: {$filename}?id={$_POST['match']}");
-}
-
-echo "<p id='delete-confirm' class='hidden'>Fileset marked for deletion</p>"; // Hidden
-
-
-// Display history and logs
-echo "<h3>Fileset history</h3>";
-
-echo "<table>\n";
-echo "<th>Timestamp</th>";
-echo "<th>Category</th>";
-echo "<th>Description</th>";
-echo "<th>Log ID</th>";
-
-$logs = $conn->query("SELECT `timestamp`, category, `text`, id FROM log
-WHERE `text` REGEXP 'Fileset:{$id}'
-ORDER BY `timestamp` DESC, id DESC");
-
-while ($row = $logs->fetch_assoc()) {
-  echo "<tr>\n";
-  echo "<td>{$row['timestamp']}</td>\n";
-  echo "<td>{$row['category']}</td>\n";
-  echo "<td>{$row['text']}</td>\n";
-  echo "<td><a href='logs.php?id={$row['id']}'>{$row['id']}</a></td>\n";
-  echo "</tr>\n";
-}
-
-while ($history_row = $history->fetch_assoc()) {
-  $logs = $conn->query("SELECT `timestamp`, category, `text`, id FROM log
-  WHERE `text` REGEXP 'Fileset:{$history_row['oldfileset']}'
-  AND `category` NOT REGEXP 'merge'
-  ORDER BY `timestamp` DESC, id DESC");
-
-  while ($row = $logs->fetch_assoc()) {
-    echo "<tr>\n";
-    echo "<td>{$row['timestamp']}</td>\n";
-    echo "<td>{$row['category']}</td>\n";
-    echo "<td>{$row['text']}</td>\n";
-    echo "<td><a href='logs.php?id={$row['id']}'>{$row['id']}</a></td>\n";
-    echo "</tr>\n";
-  }
-}
-
-echo "</table>\n";
-
-?>
-
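
The page is driven entirely by GET parameters; for example (the id is illustrative):

    fileset.php?id=42&widetable=true

expands each file row with its extra checksums, rendered as checktype-checksize columns such as md5-5000 (checksize 0 is skipped, since that value already appears as file.checksum).
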
diff --git a/games_list.php b/games_list.php
deleted file mode 100644
index c6950aa..0000000
--- a/games_list.php
+++ /dev/null
@@ -1,31 +0,0 @@
-<?php
-require __DIR__ . '/include/pagination.php';
-
-$filename = "games_list.php";
-$records_table = "game";
-$select_query = "SELECT engineid, gameid, extra, platform, language, game.name,
-status, fileset.id as fileset
-FROM game
-JOIN engine ON engine.id = game.engine
-JOIN fileset ON game.id = fileset.game";
-$order = "ORDER BY gameid";
-
-// Filter column => table
-$filters = array(
-  "engineid" => "engine",
-  "gameid" => "game",
-  "extra" => "game",
-  "platform" => "game",
-  "language" => "game",
-  "name" => "game",
-  "status" => "fileset"
-);
-
-$mapping = array(
-  'engine.id' => 'game.engine',
-  'game.id' => 'fileset.game',
-);
-
-create_page($filename, 25, $records_table, $select_query, $order, $filters, $mapping);
-?>
-
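
With the filter map above, create_page (defined in include/pagination.php below) exposes each listed column as a GET parameter matched via REGEXP, and sort takes a column name with an optional -desc suffix; for example (values illustrative):

    games_list.php?platform=DOS&status=detection&sort=gameid-desc
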
diff --git a/include/db_functions.php b/include/db_functions.php
deleted file mode 100644
index 81f21ff..0000000
--- a/include/db_functions.php
+++ /dev/null
@@ -1,595 +0,0 @@
-<?php
-
-/**
- * Create and return a mysqli connection
- */
-function db_connect() {
-  $mysql_cred = json_decode(file_get_contents(__DIR__ . '/../mysql_config.json'), true);
-  $servername = $mysql_cred["servername"];
-  $username = $mysql_cred["username"];
-  $password = $mysql_cred["password"];
-  $dbname = $mysql_cred["dbname"];
-
-  // Create connection
-  mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
-  $conn = new mysqli($servername, $username, $password);
-  $conn->set_charset('utf8mb4');
-  $conn->autocommit(FALSE);
-
-  // Check connection
-  if ($conn->connect_errno) {
-    die("Connect failed: " . $conn->connect_error);
-  }
-
-  $conn->query("USE " . $dbname);
-
-  return $conn;
-}
-
-/**
- * Retrieve the checksize, checktype and checksum from a given checkcode + checksum pair
- * eg: md5-5000 t:12345... -> 5000, md5-t, 12345...
- */
-function get_checksum_props($checkcode, $checksum) {
-  $checksize = 0;
-  $checktype = $checkcode;
-
-  if (strpos($checkcode, '-') !== false) {
-    $exploded_checkcode = explode('-', $checkcode);
-    $last = array_pop($exploded_checkcode);
-    if ($last == '1M' || is_numeric($last))
-      $checksize = $last;
-
-    $checktype = implode('-', $exploded_checkcode);
-  }
-
-  // Detection entries have checktypes as part of the checksum prefix
-  if (strpos($checksum, ':') !== false) {
-    $prefix = explode(':', $checksum)[0];
-    $checktype .= "-" . $prefix;
-
-    $checksum = explode(':', $checksum)[1];
-  }
-
-  return array($checksize, $checktype, $checksum);
-}
-
-/**
- * Routine for inserting a game into the database - inserting into engine and
- * game tables
- */
-function insert_game($engine_name, $engineid, $title, $gameid, $extra, $platform, $lang, $conn) {
-  // Set @engine_last if engine already present in table
-  $exists = false;
-  if ($res = $conn->query(sprintf("SELECT id FROM engine WHERE engineid = '%s'", $engineid))) {
-    if ($res->num_rows > 0) {
-      $exists = true;
-      $conn->query(sprintf("SET @engine_last = '%d'", $res->fetch_array()[0]));
-    }
-  }
-
-  // Insert into table if not present
-  if (!$exists) {
-    $query = sprintf("INSERT INTO engine (name, engineid)
-  VALUES ('%s', '%s')", mysqli_real_escape_string($conn, $engine_name), $engineid);
-    $conn->query($query);
-    $conn->query("SET @engine_last = LAST_INSERT_ID()");
-  }
-
-  // Insert into game
-  $query = sprintf("INSERT INTO game (name, engine, gameid, extra, platform, language)
-  VALUES ('%s', @engine_last, '%s', '%s', '%s', '%s')", mysqli_real_escape_string($conn, $title),
-    $gameid, mysqli_real_escape_string($conn, $extra), $platform, $lang);
-  $conn->query($query);
-  $conn->query("SET @game_last = LAST_INSERT_ID()");
-}
-
-function insert_fileset($src, $detection, $key, $megakey, $transaction, $log_text, $conn, $ip = '') {
-  $status = $detection ? "detection" : $src;
-  $game = "NULL";
-  $key = $key == "" ? "NULL" : "'{$key}'";
-  $megakey = $megakey == "" ? "NULL" : "'{$megakey}'";
-
-  if ($detection) {
-    $status = "detection";
-    $game = "@game_last";
-  }
-
-  // Check if key/megakey already exists, if so, skip insertion (no quotes on purpose)
-  if ($detection)
-    $existing_entry = $conn->query("SELECT id FROM fileset WHERE `key` = {$key}");
-  else
-    $existing_entry = $conn->query("SELECT id FROM fileset WHERE megakey = {$megakey}");
-
-  if ($existing_entry->num_rows > 0) {
-    $existing_entry = $existing_entry->fetch_array()[0];
-    $conn->query("SET @fileset_last = {$existing_entry}");
-
-    $category_text = "Uploaded from {$src}";
-    $log_text = "Duplicate of Fileset:{$existing_entry}, {$log_text}";
-    if ($src == 'user')
-      $log_text = "Duplicate of Fileset:{$existing_entry}, from user IP {$ip}, {$log_text}";
-
-    $user = 'cli:' . get_current_user();
-    create_log(mysqli_real_escape_string($conn, $category_text), $user, mysqli_real_escape_string($conn, $log_text));
-
-    if (!$detection)
-      return false;
-
-    $conn->query("UPDATE fileset SET `timestamp` = FROM_UNIXTIME(@fileset_time_last)
-                      WHERE id = {$existing_entry}");
-    $conn->query("UPDATE fileset SET status = 'detection'
-                    WHERE id = {$existing_entry} AND status = 'obsolete'");
-    $conn->query("DELETE FROM game WHERE id = @game_last");
-    return false;
-  }
-
-  // $game and $key should not be parsed as a mysql string, hence no quotes
-  $query = "INSERT INTO fileset (game, status, src, `key`, megakey, `timestamp`)
-  VALUES ({$game}, '{$status}', '{$src}', {$key}, {$megakey}, FROM_UNIXTIME(@fileset_time_last))";
-  $conn->query($query);
-  $conn->query("SET @fileset_last = LAST_INSERT_ID()");
-
-  $category_text = "Uploaded from {$src}";
-  $fileset_last = $conn->query("SELECT @fileset_last")->fetch_array()[0];
-  $log_text = "Created Fileset:{$fileset_last}, {$log_text}";
-  if ($src == 'user')
-    $log_text = "Created Fileset:{$fileset_last}, from user IP {$ip}, {$log_text}";
-
-  $user = 'cli:' . get_current_user();
-  create_log(mysqli_real_escape_string($conn, $category_text), $user, mysqli_real_escape_string($conn, $log_text));
-  $conn->query("INSERT INTO transactions (`transaction`, fileset) VALUES ({$transaction}, {$fileset_last})");
-
-  return true;
-}
-
-/**
- * Routine for inserting a file into the database, inserting into all
- * required tables
- * $file is an associative array (the contents of 'rom')
- * If a checksum of the given checktype doesn't exist, it fails silently
- */
-function insert_file($file, $detection, $src, $conn) {
-  // Find full md5, or else use first checksum value
-  $checksum = "";
-  $checksize = 5000;
-  if (isset($file["md5"])) {
-    $checksum = $file["md5"];
-  }
-  else {
-    foreach ($file as $key => $value) {
-      if (strpos($key, "md5") !== false) {
-        list($checksize, $checktype, $checksum) = get_checksum_props($key, $value);
-        break;
-      }
-    }
-  }
-
-  $query = sprintf("INSERT INTO file (name, size, checksum, fileset, detection)
-  VALUES ('%s', '%s', '%s', @fileset_last, %d)", mysqli_real_escape_string($conn, $file["name"]),
-    $file["size"], $checksum, $detection);
-  $conn->query($query);
-
-  if ($detection)
-    $conn->query("UPDATE fileset SET detection_size = {$checksize} WHERE id = @fileset_last AND detection_size IS NULL");
-  $conn->query("SET @file_last = LAST_INSERT_ID()");
-}
-
-function insert_filechecksum($file, $checktype, $conn) {
-  if (!array_key_exists($checktype, $file))
-    return;
-
-  $checksum = $file[$checktype];
-  list($checksize, $checktype, $checksum) = get_checksum_props($checktype, $checksum);
-
-  $query = sprintf("INSERT INTO filechecksum (file, checksize, checktype, checksum)
-  VALUES (@file_last, '%s', '%s', '%s')", $checksize, $checktype, $checksum);
-  $conn->query($query);
-}
-
-/**
- * Delete filesets marked for deletion
- */
-function delete_filesets($conn) {
-  $query = "DELETE FROM fileset WHERE `delete` == TRUE";
-  $conn->query($query);
-}
-
-/**
- * Create an entry to the log table on each call of db_insert() or
- * populate_matching_games()
- */
-function create_log($category, $user, $text) {
-  $conn = db_connect();
-  $conn->query(sprintf("INSERT INTO log (`timestamp`, category, user, `text`)
-  VALUES (FROM_UNIXTIME(%d), '%s', '%s', '%s')", time(), $category, $user, $text));
-  $log_last = $conn->query("SELECT LAST_INSERT_ID()")->fetch_array()[0];
-
-  if (!$conn->commit())
-    echo "Creating log failed\n";
-
-  return $log_last;
-}
-
-/**
- * Calculate `key` value as md5("name:title:...:engine:file1:size:md5:file2:...")
- */
-function calc_key($fileset) {
-  $key_string = "";
-
-  foreach ($fileset as $key => $value) {
-    if ($key == 'engineid' || $key == 'gameid' || $key == 'rom')
-      continue;
-
-    $key_string .= ':' . $value;
-  }
-
-  $files = $fileset['rom'];
-  foreach ($files as $file) {
-    foreach ($file as $key => $value) {
-      $key_string .= ':' . $value;
-    }
-  }
-
-  $key_string = trim($key_string, ':');
-  return md5($key_string);
-}
-
-/**
- * Calculate `megakey` value as md5("file1:size:md5:file2:...")
- */
-function calc_megakey($files) {
-  $key_string = "";
-  foreach ($files as $file) {
-    foreach ($file as $key => $value) {
-      $key_string .= ':' . $value;
-    }
-  }
-  $key_string = trim($key_string, ':');
-  return md5($key_string);
-}
-
-/**
- * Insert values from the associative array into the DB
- * They will be inserted with fileset.game as NULL, since the game itself is unconfirmed
- */
-function db_insert($data_arr) {
-  $header = $data_arr[0];
-  $game_data = $data_arr[1];
-  $resources = $data_arr[2];
-  $filepath = $data_arr[3];
-
-  $conn = db_connect();
-
-  /**
-   * Author can be:
-   *  scummvm -> Detection Entries
-   *  scanner -> CLI scanner tool in python
-   *  _anything else_ -> DAT file
-   */
-  $author = $header["author"];
-  $version = $header["version"];
-
-  /**
-   * status can be:
-   *  detection -> Detection entries (source of truth)
-   *  user -> Submitted by users via ScummVM, unmatched (Not used in the parser)
-   *  scan -> Submitted by cli/scanner, unmatched
-   *  dat -> Submitted by DAT, unmatched
-   *  partialmatch -> Submitted by DAT, matched
-   *  fullmatch -> Submitted by cli/scanner, matched
-   *  obsolete -> Detection entries that are no longer part of the detection set
-   */
-  $src = "";
-  if ($author == "scan" || $author == "scummvm")
-    $src = $author;
-  else
-    $src = "dat";
-
-  $detection = ($src == "scummvm");
-  $status = $detection ? "detection" : $src;
-
-  // Set timestamp of fileset insertion
-  $conn->query(sprintf("SET @fileset_time_last = %d", time()));
-
-  // Create start log entry
-  $transaction_id = $conn->query("SELECT MAX(`transaction`) FROM transactions")->fetch_array()[0] + 1;
-
-  $category_text = "Uploaded from {$src}";
-  $log_text = sprintf("Started loading DAT file, size %d, author '%s', version %s.
-  State '%s'. Transaction: %d",
-    $filepath, filesize($filepath), $author, $version, $status, $transaction_id);
-
-  $user = 'cli:' . get_current_user();
-  create_log(mysqli_real_escape_string($conn, $category_text), $user, mysqli_real_escape_string($conn, $log_text));
-
-  foreach ($game_data as $fileset) {
-    if ($detection) {
-      $engine_name = $fileset["engine"];
-      $engineid = $fileset["sourcefile"];
-      $gameid = $fileset["name"];
-      $title = $fileset["title"];
-      $extra = $fileset["extra"];
-      $platform = $fileset["platform"];
-      $lang = $fileset["language"];
-
-      insert_game($engine_name, $engineid, $title, $gameid, $extra, $platform, $lang, $conn);
-    }
-    elseif ($src == "dat")
-      if (isset($fileset['romof']) && isset($resources[$fileset['romof']]))
-        $fileset["rom"] = array_merge($fileset["rom"], $resources[$fileset["romof"]]["rom"]);
-
-    $key = $detection ? calc_key($fileset) : "";
-    $megakey = !$detection ? calc_megakey($fileset['rom']) : "";
-    $log_text = sprintf("size %d, author '%s', version %s.
-    State '%s'.",
-      filesize($filepath), $author, $version, $status);
-
-    if (insert_fileset($src, $detection, $key, $megakey, $transaction_id, $log_text, $conn)) {
-      foreach ($fileset["rom"] as $file) {
-        insert_file($file, $detection, $src, $conn);
-        foreach ($file as $key => $value) {
-          if ($key != "name" && $key != "size")
-            insert_filechecksum($file, $key, $conn);
-        }
-      }
-    }
-  }
-
-  if ($detection)
-    $conn->query("UPDATE fileset SET status = 'obsolete'
-                  WHERE `timestamp` != FROM_UNIXTIME(@fileset_time_last)
-                  AND status = 'detection'");
-
-  $fileset_insertion_count = $conn->query("SELECT COUNT(fileset) from transactions WHERE `transaction` = {$transaction_id}")->fetch_array()[0];
-  $category_text = "Uploaded from {$src}";
-  $log_text = sprintf("Completed loading DAT file, filename '%s', size %d, author '%s', version %s.
-  State '%s'. Number of filesets: %d. Transaction: %d",
-    $filepath, filesize($filepath), $author, $version, $status, $fileset_insertion_count, $transaction_id);
-
-  if (!$conn->commit())
-    echo "Inserting failed\n";
-  else {
-    $user = 'cli:' . get_current_user();
-    create_log(mysqli_real_escape_string($conn, $category_text), $user, mysqli_real_escape_string($conn, $log_text));
-  }
-}
-
-/**
- * Compare two DAT filesets to determine whether they are equivalent
- */
-function compare_filesets($id1, $id2, $conn) {
-  $fileset1 = $conn->query("SELECT name, size, checksum
-                            FROM file WHERE fileset = '{$id1}'")->fetch_all();
-  $fileset2 = $conn->query("SELECT name, size, checksum
-                            FROM file WHERE fileset = '{$id2}'")->fetch_all();
-
-  // Sort filesets on checksum
-  usort($fileset1, function ($a, $b) {
-    return $a[2] <=> $b[2];
-  });
-  usort($fileset2, function ($a, $b) {
-    return $a[2] <=> $b[2];
-  });
-
-  if (count($fileset1) != count($fileset2))
-    return false;
-
-  for ($i = 0; $i < count($fileset1); $i++) {
-    // If checksums do not match
-    if ($fileset1[$i][2] != $fileset2[$i][2])
-      return false;
-  }
-
-  return true;
-}
-
-/**
- * Return fileset statuses that can be merged with set of given status
- * eg: scan and dat -> detection
- *     fullmatch -> partialmatch, detection
- */
-function status_to_match($status) {
-  $order = array("detection", "dat", "scan", "partialmatch", "fullmatch", "user");
-  return array_slice($order, 0, array_search($status, $order));
-}
-
-/**
- * Detects games based on the file descriptions in $dat_arr
- * Compares the files with those in the detection entries table
- * $game_files consists of both the game ( ) and resources ( ) parts
- */
-function find_matching_game($game_files) {
-  $matching_games = array(); // All matching games
-  $matching_filesets = array(); // All filesets containing one file from $game_files
-  $matches_count = 0; // Number of files with a matching detection entry
-
-  $conn = db_connect();
-
-  foreach ($game_files as $file) {
-    $checksum = $file[1];
-
-    $query = "SELECT file.fileset as file_fileset
-    FROM filechecksum
-    JOIN file ON filechecksum.file = file.id
-    WHERE filechecksum.checksum = '{$checksum}' AND file.detection = TRUE";
-    $records = $conn->query($query)->fetch_all();
-
-    // If file is not part of detection entries, skip it
-    if (count($records) == 0)
-      continue;
-
-    $matches_count++;
-    foreach ($records as $record)
-      array_push($matching_filesets, $record[0]);
-  }
-
-  // Check if there is a fileset_id that is present in all results
-  foreach (array_count_values($matching_filesets) as $key => $value) {
-    $count_files_in_fileset = $conn->query(sprintf("SELECT COUNT(file.id) FROM file
-    JOIN fileset ON file.fileset = fileset.id
-    WHERE fileset.id = '%s'", $key))->fetch_array()[0];
-
-    // We use < instead of != since one file may have more than one entry in the fileset
-    // We see this in Drascula English version, where one entry is duplicated
-    if ($value < $matches_count || $value < $count_files_in_fileset)
-      continue;
-
-    $records = $conn->query(sprintf("SELECT engineid, game.id, gameid, platform,
-    language, `key`, src, fileset.id as fileset
-    FROM game
-    JOIN fileset ON fileset.game = game.id
-    JOIN engine ON engine.id = game.engine
-    WHERE fileset.id = '%s'", $key));
-
-    array_push($matching_games, $records->fetch_array());
-  }
-
-  if (count($matching_games) != 1)
-    return $matching_games;
-
-  // Check the current fileset priority with that of the match
-  $records = $conn->query(sprintf("SELECT id FROM fileset, ({$query}) AS res
-      WHERE id = file_fileset AND
-      status IN ('%s')", implode("', '", status_to_match($game_files[0][3]))));
-
-  // If priority order is correct
-  if ($records->num_rows != 0)
-    return $matching_games;
-
-  if (compare_filesets($matching_games[0]['fileset'], $game_files[0][0], $conn)) {
-    $conn->query("UPDATE fileset SET `delete` = TRUE WHERE id = {$game_files[0][0]}");
-    return array();
-  }
-
-  return $matching_games;
-}
-
-/**
- * Merge two filesets without duplicating files
- * Used after matching an unconfirmed fileset with a detection entry
- */
-function merge_filesets($detection_id, $dat_id) {
-  $conn = db_connect();
-
-  $detection_files = $conn->query(sprintf("SELECT DISTINCT(filechecksum.checksum), checksize, checktype
-  FROM filechecksum JOIN file on file.id = filechecksum.file
-  WHERE fileset = '%d'", $detection_id))->fetch_all();
-
-  foreach ($detection_files as $file) {
-    $checksum = $file[0];
-    $checksize = $file[1];
-    $checktype = $file[2];
-
-    // Delete original detection entry so newly matched fileset is the only fileset for game
-    $conn->query(sprintf("DELETE FROM file
-    WHERE checksum = '%s' AND fileset = %d LIMIT 1", $checksum, $detection_id));
-
-    // Mark files present in the detection entries
-    $conn->query(sprintf("UPDATE file
-    JOIN filechecksum ON filechecksum.file = file.id
-    SET detection = TRUE,
-    checksize = %d,
-    checktype = '%s'
-    WHERE fileset = '%d' AND filechecksum.checksum = '%s'",
-      $checksize, $checktype, $dat_id, $checksum));
-  }
-
-  // Add fileset pair to history ($dat_id is the new fileset for $detection_id)
-  $conn->query(sprintf("INSERT INTO history (`timestamp`, fileset, oldfileset)
-  VALUES (FROM_UNIXTIME(%d), %d, %d)", time(), $dat_id, $detection_id));
-  $history_last = $conn->query("SELECT LAST_INSERT_ID()")->fetch_array()[0];
-
-  $conn->query("UPDATE history SET fileset = {$dat_id} WHERE fileset = {$detection_id}");
-
-  // Delete original fileset
-  $conn->query("DELETE FROM fileset WHERE id = {$detection_id}");
-
-  if (!$conn->commit())
-    echo "Error merging filesets\n";
-
-  return $history_last;
-}
-
-/**
- * (Attempt to) match filesets that have fileset.game as NULL
- * This will delete the original detection fileset and replace it with the newly
- * matched fileset
- */
-function populate_matching_games() {
-  $conn = db_connect();
-
-  // Getting unmatched filesets
-  $unmatched_filesets = array();
-
-  $unmatched_files = $conn->query("SELECT fileset.id, filechecksum.checksum, src, status
-  FROM fileset
-  JOIN file ON file.fileset = fileset.id
-  JOIN filechecksum ON file.id = filechecksum.file
-  WHERE fileset.game IS NULL AND status != 'user'");
-  $unmatched_files = $unmatched_files->fetch_all();
-
-  // Splitting them into different filesets
-  for ($i = 0; $i < count($unmatched_files); $i++) {
-    $cur_fileset = $unmatched_files[$i][0];
-    $temp = array();
-    while ($i < count($unmatched_files) && $cur_fileset == $unmatched_files[$i][0]) {
-      array_push($temp, $unmatched_files[$i]);
-      $i++;
-    }
-    // Step back so the outer loop's increment lands on the next fileset
-    // and the last row of each group is not dropped
-    $i--;
-    array_push($unmatched_filesets, $temp);
-  }
-
-  foreach ($unmatched_filesets as $fileset) {
-    $matching_games = find_matching_game($fileset);
-
-    if (count($matching_games) != 1) // If there is no match/non-unique match
-      continue;
-
-    $matched_game = $matching_games[0];
-
-    // Update status depending on $matched_game["src"] (dat -> partialmatch, scan -> fullmatch)
-    $status = $fileset[0][2];
-    if ($fileset[0][2] == "dat")
-      $status = "partialmatch";
-    elseif ($fileset[0][2] == "scan")
-      $status = "fullmatch";
-
-    // Convert NULL values to string with value NULL for printing
-    $matched_game = array_map(function ($val) {
-      return (is_null($val)) ? "NULL" : $val;
-    }, $matched_game);
-
-    $category_text = "Matched from {$fileset[0][2]}";
-    $log_text = "Matched game {$matched_game['engineid']}:
-    {$matched_game['gameid']}-{$matched_game['platform']}-{$matched_game['language']}
-    variant {$matched_game['key']}. State {$status}. Fileset:{$fileset[0][0]}.";
-
-    // Updating the fileset.game value to be $matched_game["id"]
-    $query = sprintf("UPDATE fileset
-    SET game = %d, status = '%s', `key` = '%s'
-    WHERE id = %d", $matched_game["id"], $status, $matched_game["key"], $fileset[0][0]);
-
-    $history_last = merge_filesets($matched_game["fileset"], $fileset[0][0]);
-
-    if ($conn->query($query)) {
-      $user = 'cli:' . get_current_user();
-
-      // Merge log
-      create_log("Fileset merge", $user,
-        mysqli_real_escape_string($conn, "Merged Fileset:{$matched_game['fileset']} and Fileset:{$fileset[0][0]}"));
-
-      // Matching log
-      $log_last = create_log(mysqli_real_escape_string($conn, $category_text), $user,
-        mysqli_real_escape_string($conn, $log_text));
-
-      // Add log id to the history table
-      $conn->query("UPDATE history SET log = {$log_last} WHERE id = {$history_last}");
-    }
-
-    if (!$conn->commit())
-      echo "Updating matched games failed\n";
-  }
-}
-
-
-?>
-
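
A quick sketch of what get_checksum_props above returns, using the inputs from its own doc comment:

    <?php
    // "md5-5000" paired with a "t:"-prefixed checksum, as found in detection entries:
    list($checksize, $checktype, $checksum) =
        get_checksum_props('md5-5000', 't:1234567890abcdef');
    // $checksize = '5000', $checktype = 'md5-t', $checksum = '1234567890abcdef'
    ?>
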
diff --git a/include/pagination.php b/include/pagination.php
deleted file mode 100644
index 7269137..0000000
--- a/include/pagination.php
+++ /dev/null
@@ -1,255 +0,0 @@
-<?php
-$stylesheet = 'style.css';
-$jquery_file = 'https://code.jquery.com/jquery-3.7.0.min.js';
-$js_file = 'js_functions.js';
-echo "<link rel='stylesheet' href='{$stylesheet}'>\n";
-echo "<script type='text/javascript' src='{$jquery_file}'></script>\n";
-echo "<script type='text/javascript' src='{$js_file}'></script>\n";
-
-/**
- * Return a string denoting which two columns link two tables
- */
-function get_join_columns($table1, $table2, $mapping) {
-  foreach ($mapping as $primary => $foreign) {
-    $primary = explode('.', $primary);
-    $foreign = explode('.', $foreign);
-    if (($primary[0] == $table1 && $foreign[0] == $table2) ||
-      ($primary[0] == $table2 && $foreign[0] == $table1))
-      return "{$primary[0]}.{$primary[1]} = {$foreign[0]}.{$foreign[1]}";
-  }
-
-  echo "No primary-foreign key mapping provided. Filter is invalid";
-}
-
-function create_page($filename, $results_per_page, $records_table, $select_query, $order, $filters = array(), $mapping = array()) {
-  $mysql_cred = json_decode(file_get_contents(__DIR__ . '/../mysql_config.json'), true);
-  $servername = $mysql_cred["servername"];
-  $username = $mysql_cred["username"];
-  $password = $mysql_cred["password"];
-  $dbname = $mysql_cred["dbname"];
-
-  // Create connection
-  mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
-  $conn = new mysqli($servername, $username, $password);
-  $conn->set_charset('utf8mb4');
-  $conn->autocommit(FALSE);
-
-  // Check connection
-  if ($conn->connect_errno) {
-    die("Connect failed: " . $conn->connect_error);
-  }
-
-  $conn->query("USE " . $dbname);
-
-  // If there are GET variables used for filtering
-  $_GET = array_filter($_GET);
-  if (isset($_GET['sort'])) {
-    $column = $_GET['sort'];
-    $column = explode('-', $column);
-    $order = "ORDER BY {$column[0]}";
-
-    if (strpos($_GET['sort'], 'desc') !== false)
-      $order .= " DESC";
-  }
-
-  if (array_diff(array_keys($_GET), array('page', 'sort'))) {
-    $condition = "WHERE ";
-    $tables = array();
-    foreach ($_GET as $key => $value) {
-      if ($key == 'page' || $key == 'sort' || $value == '')
-        continue;
-
-      array_push($tables, $filters[$key]);
-      $condition .= $condition != "WHERE " ? " AND {$filters[$key]}.{$key} REGEXP '{$value}'" : "{$filters[$key]}.{$key} REGEXP '{$value}'";
-    }
-    if ($condition == "WHERE ")
-      $condition = "";
-
-    // If more than one table is to be searched
-    $from_query = $records_table;
-    if (count($tables) > 1 || $tables[0] != $records_table)
-      for ($i = 0; $i < count($tables); $i++) {
-        if ($tables[$i] == $records_table)
-          continue;
-
-        $from_query .= sprintf(" JOIN %s ON %s", $tables[$i], get_join_columns($records_table, $tables[$i], $mapping));
-      }
-
-    $num_of_results = $conn->query(
-      "SELECT COUNT({$records_table}.id) FROM {$from_query} {$condition}")->fetch_array()[0];
-  }
-  // If $records_table has a JOIN (multiple tables)
-  elseif (preg_match("/JOIN/", $records_table) === 1) {
-    $first_table = explode(" ", $records_table)[0];
-    $num_of_results = $conn->query("SELECT COUNT({$first_table}.id) FROM {$records_table}")->fetch_array()[0];
-  }
-  else {
-    $num_of_results = $conn->query("SELECT COUNT(id) FROM {$records_table}")->fetch_array()[0];
-  }
-  $num_of_pages = ceil($num_of_results / $results_per_page);
-  if ($num_of_results == 0) {
-    echo "No results for given filters";
-    return;
-  }
-
-  if (!isset($_GET['page'])) {
-    $page = 1;
-  }
-  else {
-    $page = max(1, min($_GET['page'], $num_of_pages));
-  }
-
-  $offset = ($page - 1) * $results_per_page;
-
-  // If there are GET variables used for filtering
-  if (array_diff(array_keys($_GET), array('page'))) {
-    $condition = "WHERE ";
-    foreach ($_GET as $key => $value) {
-      $value = mysqli_real_escape_string($conn, $value);
-      if (!isset($filters[$key]))
-        continue;
-
-      $condition .= $condition != "WHERE " ? " AND {$filters[$key]}.{$key} REGEXP '{$value}'" : "{$filters[$key]}.{$key} REGEXP '{$value}'";
-    }
-    if ($condition == "WHERE ")
-      $condition = "";
-
-    $query = "{$select_query} {$condition} {$order} LIMIT {$results_per_page} OFFSET {$offset}";
-  }
-  else {
-    $query = "{$select_query} {$order} LIMIT {$results_per_page} OFFSET {$offset}";
-  }
-  $result = $conn->query($query);
-
-
-  // Table
-  echo "<form id='filters-form' method='GET' onsubmit='remove_empty_inputs()'>";
-  echo "<table>\n";
-
-  $counter = $offset + 1;
-  while ($row = $result->fetch_assoc()) {
-    if ($counter == $offset + 1) { // If it is the first run of the loop
-      if (count($filters) > 0) {
-        echo "<tr class=filter><td></td>";
-        foreach (array_keys($row) as $key) {
-          if (!isset($filters[$key])) {
-            echo "<td class=filter />";
-            continue;
-          }
-
-          // Filter textbox
-          $filter_value = isset($_GET[$key]) ? $_GET[$key] : "";
-
-          echo "<td class=filter><input type=text class=filter placeholder='{$key}' name='{$key}' value='{$filter_value}'/></td>\n";
-        }
-        echo "</tr>";
-        echo "<tr class=filter><td></td><td class=filter><input type=submit value='Submit'></td></tr>";
-      }
-
-      echo "<th/>\n"; // Numbering column
-      foreach (array_keys($row) as $key) {
-        if ($key == 'fileset')
-          continue;
-
-        // Preserve GET variables
-        $vars = "";
-        foreach ($_GET as $k => $v) {
-          if ($k == 'sort' && $v == $key)
-            $vars .= "&{$k}={$v}-desc";
-          elseif ($k != 'sort')
-            $vars .= "&{$k}={$v}";
-        }
-
-        if (strpos($vars, "&sort={$key}") === false)
-          echo "<th><a href='{$filename}?{$vars}&sort={$key}'>{$key}</th>\n";
-        else
-          echo "<th><a href='{$filename}?{$vars}'>{$key}</th>\n";
-      }
-    }
-
-    if ($filename == 'games_list.php' || $filename == 'user_games_list.php')
-      echo "<tr class=games_list onclick='hyperlink(\"fileset.php?id={$row['fileset']}\")'>\n";
-    else
-      echo "<tr>\n";
-    echo "<td>{$counter}.</td>\n";
-    foreach ($row as $key => $value) {
-      if ($key == 'fileset')
-        continue;
-
-      // Add links to fileset in logs table
-      $matches = array();
-      if (preg_match("/Fileset:(\d+)/", $value, $matches, PREG_OFFSET_CAPTURE)) {
-        $value = substr($value, 0, $matches[0][1]) .
-          "<a href='fileset.php?id={$matches[1][0]}'>{$matches[0][0]}</a>" .
-          substr($value, $matches[0][1] + strlen($matches[0][0]));
-      }
-
-      echo "<td>{$value}</td>\n";
-    }
-    echo "</tr>\n";
-
-    $counter++;
-  }
-
-  echo "</table>\n";
-  echo "</form>\n";
-
-  // Preserve GET variables
-  $vars = "";
-  foreach ($_GET as $key => $value) {
-    if ($key == 'page')
-      continue;
-    $vars .= "&{$key}={$value}";
-  }
-
-  // Navigation elements
-  if ($num_of_pages > 1) {
-    echo "<form method='GET'>\n";
-
-    // Preserve GET variables on form submit
-    foreach ($_GET as $key => $value) {
-      if ($key == 'page')
-        continue;
-
-      $key = htmlspecialchars($key);
-      $value = htmlspecialchars($value);
-      if ($value != "")
-        echo "<input type='hidden' name='{$key}' value='{$value}'>";
-    }
-
-    echo "<div class=pagination>\n";
-    if ($page > 1) {
-      echo "<a href={$filename}?{$vars}>❮❮</a>\n";
-      echo sprintf("<a href=%s?page=%d%s>❮</a>\n", $filename, $page - 1, $vars);
-    }
-    if ($page - 2 > 1)
-      echo "<div class=more>...</div>\n";
-
-
-    for ($i = $page - 2; $i <= $page + 2; $i++) {
-      if ($i >= 1 && $i <= $num_of_pages) {
-
-        if ($i == $page)
-          echo sprintf("<a class=active href=%s?page=%d%s>%d</a>\n", $filename, $i, $vars, $i);
-        else
-          echo sprintf("<a href=%s?page=%d%s>%d</a>\n", $filename, $i, $vars, $i);
-      }
-    }
-
-    if ($page + 2 < $num_of_pages)
-      echo "<div class=more>...</div>\n";
-    if ($page < $num_of_pages) {
-      echo sprintf("<a href=%s?page=%d%s>❯</a>\n", $filename, $page + 1, $vars);
-      echo "<a href={$filename}?page={$num_of_pages}{$vars}>❯❯</a>\n";
-    }
-
-    echo "<input type='text' name='page' placeholder='Page No'>\n";
-    echo "<input type='submit' value='Submit'>\n";
-    echo "</div>\n";
-
-    echo "</form>\n";
-  }
-
-}
-?>
-
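
A short sketch of get_join_columns above, fed with the mapping that games_list.php passes in:

    <?php
    $mapping = array('engine.id' => 'game.engine', 'game.id' => 'fileset.game');
    echo get_join_columns('game', 'fileset', $mapping);
    // Prints: game.id = fileset.game
    ?>
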
diff --git a/include/user_fileset_functions.php b/include/user_fileset_functions.php
deleted file mode 100644
index b160436..0000000
--- a/include/user_fileset_functions.php
+++ /dev/null
@@ -1,153 +0,0 @@
-<?php
-require __DIR__ . '/../include/db_functions.php';
-
-function user_calc_key($user_fileset) {
-  $key_string = "";
-  foreach ($user_fileset as $file) {
-    foreach ($file as $key => $value) {
-      if ($key != 'checksums') {
-        $key_string .= ':' . $value;
-        continue;
-      }
-
-      foreach ($value as $checksum_pair)
-        $key_string .= ':' . $checksum_pair->checksum;
-    }
-  }
-  $key_string = trim($key_string, ':');
-
-  return md5($key_string);
-}
-
-function file_json_to_array($file_json_object) {
-  $res = array();
-
-  foreach ($file_json_object as $key => $value) {
-    if ($key != 'checksums') {
-      $res[$key] = $value;
-      continue;
-    }
-
-    foreach ($value as $checksum_pair)
-      $res[$checksum_pair->type] = $checksum_pair->checksum;
-  }
-
-  return $res;
-}
-
-function user_insert_queue($user_fileset, $conn) {
-  $query = sprintf("INSERT INTO queue (time, notes, fileset, ticketid, userid, commit)
-  VALUES (%d, NULL, @fileset_last, NULL, NULL, NULL)", time());
-
-  $conn->query($query);
-}
-
-function user_insert_fileset($user_fileset, $ip, $conn) {
-  $src = 'user';
-  $detection = false;
-  $key = '';
-  $megakey = user_calc_key($user_fileset);
-  $transaction_id = $conn->query("SELECT MAX(`transaction`) FROM transactions")->fetch_array()[0] + 1;
-  $log_text = "from user submitted files";
-  $conn = db_connect();
-
-  // Set timestamp of fileset insertion
-  $conn->query(sprintf("SET @fileset_time_last = %d", time()));
-
-  if (insert_fileset($src, $detection, $key, $megakey, $transaction_id, $log_text, $conn, $ip)) {
-    foreach ($user_fileset as $file) {
-      $file = file_json_to_array($file);
-
-      insert_file($file, $detection, $src, $conn);
-      foreach ($file as $key => $value) {
-        if ($key != "name" && $key != "size")
-          insert_filechecksum($file, $key, $conn);
-      }
-    }
-  }
-
-  $fileset_id = $conn->query("SELECT @fileset_last")->fetch_array()[0];
-  $conn->commit();
-  return $fileset_id;
-}
-
-
-/**
- * (Attempt to) match the user fileset with the given id against the detection entries
- * This will delete the original detection fileset and replace it with the newly
- * matched fileset
- */
-function match_and_merge_user_filesets($id) {
-  $conn = db_connect();
-
-  // Getting unmatched filesets
-  $unmatched_filesets = array();
-
-  $unmatched_files = $conn->query("SELECT fileset.id, filechecksum.checksum, src, status
-  FROM fileset
-  JOIN file ON file.fileset = fileset.id
-  JOIN filechecksum ON file.id = filechecksum.file
-  WHERE status = 'user' AND fileset.id = {$id}");
-  $unmatched_files = $unmatched_files->fetch_all();
-
-  // Splitting them into different filesets
-  for ($i = 0; $i < count($unmatched_files); $i++) {
-    $cur_fileset = $unmatched_files[$i][0];
-    $temp = array();
-    while ($i < count($unmatched_files) && $cur_fileset == $unmatched_files[$i][0]) {
-      array_push($temp, $unmatched_files[$i]);
-      $i++;
-    }
-    // Step back so the outer loop's increment lands on the next fileset
-    // and the last row of each group is not dropped
-    $i--;
-    array_push($unmatched_filesets, $temp);
-  }
-
-  foreach ($unmatched_filesets as $fileset) {
-    $matching_games = find_matching_game($fileset);
-
-    if (count($matching_games) != 1) // If there is no match/non-unique match
-      continue;
-
-    $matched_game = $matching_games[0];
-
-    $status = 'fullmatch';
-
-    // Convert NULL values to string with value NULL for printing
-    $matched_game = array_map(function ($val) {
-      return (is_null($val)) ? "NULL" : $val;
-    }, $matched_game);
-
-    $category_text = "Matched from {$fileset[0][2]}";
-    $log_text = "Matched game {$matched_game['engineid']}:
-    {$matched_game['gameid']}-{$matched_game['platform']}-{$matched_game['language']}
-    variant {$matched_game['key']}. State {$status}. Fileset:{$fileset[0][0]}.";
-
-    // Updating the fileset.game value to be $matched_game["id"]
-    $query = sprintf("UPDATE fileset
-    SET game = %d, status = '%s', `key` = '%s'
-    WHERE id = %d", $matched_game["id"], $status, $matched_game["key"], $fileset[0][0]);
-
-    $history_last = merge_filesets($matched_game["fileset"], $fileset[0][0]);
-
-    if ($conn->query($query)) {
-      $user = 'cli:' . get_current_user();
-
-      // Merge log
-      create_log("Fileset merge", $user,
-        mysqli_real_escape_string($conn, "Merged Fileset:{$matched_game['fileset']} and Fileset:{$fileset[0][0]}"));
-
-      // Matching log
-      $log_last = create_log(mysqli_real_escape_string($conn, $category_text), $user,
-        mysqli_real_escape_string($conn, $log_text));
-
-      // Add log id to the history table
-      $conn->query("UPDATE history SET log = {$log_last} WHERE id = {$history_last}");
-    }
-
-    if (!$conn->commit())
-      echo "Updating matched games failed\n";
-  }
-}
-
-
-?>
-
diff --git a/index.php b/index.php
deleted file mode 100644
index f76a86a..0000000
--- a/index.php
+++ /dev/null
@@ -1,15 +0,0 @@
-<?php
-
-$request = $_SERVER['REQUEST_URI'];
-$api_root = '/endpoints/';
-
-switch ($request) {
-  case '':
-  case '/':
-    require __DIR__ . '/index.html';
-    break;
-
-  case '/api/validate':
-    require __DIR__ . $api_root . 'validate.php';
-}
-?>
diff --git a/logs.php b/logs.php
deleted file mode 100644
index 53b538a..0000000
--- a/logs.php
+++ /dev/null
@@ -1,20 +0,0 @@
-<?php
-require __DIR__ . '/include/pagination.php';
-
-$filename = "logs.php";
-$records_table = "log";
-$select_query = "SELECT id, `timestamp`, category, user, `text`
-FROM log";
-$order = "ORDER BY `timestamp` DESC, id DESC";
-
-$filters = array(
-  'id' => 'log',
-  'timestamp' => 'log',
-  'category' => 'log',
-  'user' => 'log',
-  'text' => 'log'
-);
-
-create_page($filename, 25, $records_table, $select_query, $order, $filters);
-?>
-
diff --git a/mod_actions.php b/mod_actions.php
deleted file mode 100644
index 845b285..0000000
--- a/mod_actions.php
+++ /dev/null
@@ -1,28 +0,0 @@
-<?php
-// require __DIR__ . '/include/pagination.php';
-
-$stylesheet = 'style.css';
-$jquery_file = 'https://code.jquery.com/jquery-3.7.0.min.js';
-$js_file = 'js_functions.js';
-echo "<link rel='stylesheet' href='{$stylesheet}'>\n";
-echo "<script type='text/javascript' src='{$jquery_file}'></script>\n";
-echo "<script type='text/javascript' src='{$js_file}'></script>\n";
-
-
-// Dev Tools
-echo "<h3>Developer Moderation Tools</h3>";
-echo "<button id='delete-button' type='button' onclick='delete_id(0)'>Delete filesets from last uploaded DAT</button>";
-echo "<br/>";
-echo "<button id='match-button' type='button' onclick='match_id(0)'>Merge Uploaded Fileset</button>";
-echo "<br/>";
-echo "<button id='match-button' type='button' onclick='match_id(0)'>Merge User Fileset</button>";
-
-if (isset($_POST['delete'])) {
-}
-if (isset($_POST['match'])) {
-  // merge_user_filesets();
-}
-
-echo "<p id='delete-confirm' class='hidden'>Fileset marked for deletion</p>"; // Hidden
-?>
-
diff --git a/user_games_list.php b/user_games_list.php
deleted file mode 100644
index e6706b4..0000000
--- a/user_games_list.php
+++ /dev/null
@@ -1,32 +0,0 @@
-<?php
-require __DIR__ . '/include/pagination.php';
-
-$filename = "user_games_list.php";
-$records_table = "game";
-$select_query = "SELECT engineid, gameid, extra, platform, language, game.name,
-status, fileset.id as fileset
-FROM fileset
-LEFT JOIN game ON game.id = fileset.game
-LEFT JOIN engine ON engine.id = game.engine
-WHERE status = 'user'";
-$order = "ORDER BY gameid";
-
-// Filter column => table
-$filters = array(
-  "engineid" => "engine",
-  "gameid" => "game",
-  "extra" => "game",
-  "platform" => "game",
-  "language" => "game",
-  "name" => "game",
-  "status" => "fileset"
-);
-
-$mapping = array(
-  'engine.id' => 'game.engine',
-  'game.id' => 'fileset.game',
-);
-
-create_page($filename, 200, $records_table, $select_query, $order, $filters, $mapping);
-?>
-
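
The deleted match_and_merge_user_filesets() above groups the flat rows from
the checksum query into one list per fileset before matching. For reference,
a minimal Python sketch of that grouping step, assuming rows arrive ordered
by fileset id as in the query (itertools.groupby only groups consecutive
rows):

    from itertools import groupby

    def split_into_filesets(unmatched_files):
        # Each row is (fileset_id, checksum, src, status); consecutive rows
        # sharing a fileset id form one candidate fileset.
        return [list(rows) for _, rows in
                groupby(unmatched_files, key=lambda row: row[0])]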


Commit: 55526bee15cc79b61aa3fff267dc86d3d60accdf
    https://github.com/scummvm/scummvm-sites/commit/55526bee15cc79b61aa3fff267dc86d3d60accdf
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-13T20:27:01+08:00

Commit Message:
INTEGRITY: Fix db_insert func

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index ca46900..6079eaa 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -266,12 +266,11 @@ def db_insert(data_arr):
         conn.cursor().execute("UPDATE fileset SET status = 'obsolete' WHERE `timestamp` != FROM_UNIXTIME(@fileset_time_last) AND status = 'detection'")
     cur = conn.cursor()
     
-    fileset_insertion_count = cur.fetchone()['COUNT(fileset)']
-    category_text = f"Uploaded from {src}"
-    log_text = f"Completed loading DAT file, filename '{filepath}', size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
-
     try:
         cur.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
+        fileset_insertion_count = cur.fetchone()['COUNT(fileset)']
+        category_text = f"Uploaded from {src}"
+        log_text = f"Completed loading DAT file, filename '{filepath}', size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
     except Exception as e:
         print("Inserting failed:", e)
     else:
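
The fix here is ordering: fetchone() was being called before the SELECT that
produces the row, so the cursor had no result set to read from. The corrected
shape in isolation, assuming the PyMySQL DictCursor used elsewhere in this
repo:

    with conn.cursor() as cur:
        cur.execute(
            "SELECT COUNT(fileset) FROM transactions WHERE `transaction` = %s",
            (transaction_id,))
        # Only valid once execute() has run on this same cursor.
        fileset_insertion_count = cur.fetchone()['COUNT(fileset)']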


Commit: 6a37e435acaa53d592c85e94bf4da8f0cc942cc4
    https://github.com/scummvm/scummvm-sites/commit/6a37e435acaa53d592c85e94bf4da8f0cc942cc4
Author: inariindream (inariindream at 163.com)
Date: 2024-06-13T20:28:36+08:00

Commit Message:
INTEGRITY: Add Megadata class

Changed paths:
  A megadata.py


diff --git a/megadata.py b/megadata.py
new file mode 100644
index 0000000..d489391
--- /dev/null
+++ b/megadata.py
@@ -0,0 +1,38 @@
+import os
+import time
+import compute_hash
+
+class Megadata:
+    def __init__(self, file_path):
+        self.file_path = file_path
+        self.hash = self.calculate_hash(file_path)
+        self.size = os.path.getsize(file_path)
+        self.creation_time = os.path.getctime(file_path)
+        self.modification_time = os.path.getmtime(file_path)
+
+    def calculate_hash(self, file_path):
+        pass
+
+    def __eq__(self, other):
+        return (self.hash == other.hash and
+                self.size == other.size and
+                self.creation_time == other.creation_time and
+                self.modification_time == other.modification_time)
+    
+
+def record_megadata(directory):
+    file_megadata = {}
+    for root, _, files in os.walk(directory):
+        for file in files:
+            file_path = os.path.join(root, file)
+            file_megadata[file_path] = Megadata(file_path)
+    return file_megadata
+
+def check_for_updates(old_megadata, current_directory):
+    current_megadata = record_megadata(current_directory)
+    updates = []
+    for old_path, old_data in old_megadata.items():
+        for current_path, current_data in current_megadata.items():
+            if old_data == current_data and old_path != current_path:
+                updates.append((old_path, current_path))
+    return updates
\ No newline at end of file
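
calculate_hash() is committed as a stub (and the compute_hash import is not
yet used), so Megadata.__eq__ currently compares None hashes. If a plain
full-file digest is acceptable, a placeholder along these lines would do;
chunked reads keep memory bounded on large files:

    import hashlib

    def calculate_hash(self, file_path):
        md5 = hashlib.md5()
        with open(file_path, 'rb') as f:
            for chunk in iter(lambda: f.read(65536), b''):
                md5.update(chunk)
        return md5.hexdigest()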


Commit: 6c2277df48926dc735c9dfc2263a628264568688
    https://github.com/scummvm/scummvm-sites/commit/6c2277df48926dc735c9dfc2263a628264568688
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-16T21:04:50+08:00

Commit Message:
INTEGRITY: Fix errors in dat_parser

Changed paths:
    dat_parser.py
    db_functions.py


diff --git a/dat_parser.py b/dat_parser.py
index a92d869..f01640d 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -11,17 +11,14 @@ def remove_quotes(string):
     return string
 
 def map_checksum_data(content_string):
-    arr = {}
-    temp = re.findall(r'("[^"]*")|\S+', content_string)
-
-    for i in range(1, len(temp), 2):
-        if i+1 < len(temp):
-            if temp[i] == ')' or temp[i] in ['crc', 'sha1']:
-                continue
-            temp[i + 1] = remove_quotes(temp[i + 1])
-            if temp[i + 1] == ')':
-                temp[i + 1] = ""
-            arr[temp[i]] = temp[i + 1].replace("\\", "")
+    arr = []
+    
+    rom_props = re.findall(r'(\w+)\s+"([^"]*)"\s+size\s+(\d+)\s+md5-5000\s+([a-f0-9]+)', content_string)
+
+    for prop in rom_props:
+        key, name, size, md5 = prop
+        item = {'name': name, 'size': int(size), 'md5-5000': md5}
+        arr.append(item)
 
     return arr
 
@@ -39,10 +36,9 @@ def map_key_values(content_string, arr):
 
         # Handle duplicate keys (if the key is rom) and add values to a array instead
         if pair[0] == "rom":
-            if pair[0] in arr:
-                arr[pair[0]].append(map_checksum_data(pair[1]))
-            else:
-                arr[pair[0]] = [map_checksum_data(pair[1])]
+            if 'rom' not in arr:
+                arr['rom'] = []
+            arr['rom'].extend(map_checksum_data(pair[1]))
         else:
             arr[pair[0]] = pair[1].replace("\\", "")
             
@@ -96,16 +92,16 @@ def parse_dat(dat_filepath):
         for data_segment in matches:
             if "clrmamepro" in content[data_segment[1] - 11: data_segment[1]] or \
                 "scummvm" in content[data_segment[1] - 8: data_segment[1]]:
-                map_key_values(data_segment[0], header)
+                header = map_key_values(data_segment[0], header)
             elif "game" in content[data_segment[1] - 5: data_segment[1]]:
                 temp = {}
-                map_key_values(data_segment[0], temp)
+                temp = map_key_values(data_segment[0], temp)
                 game_data.append(temp)
             elif "resource" in content[data_segment[1] - 9: data_segment[1]]:
                 temp = {}
-                map_key_values(data_segment[0], temp)
+                temp = map_key_values(data_segment[0], temp)
                 resources[temp["name"]] = temp
-
+    # print(header, game_data, resources)
     return header, game_data, resources, dat_filepath
 
 # Process command line args
diff --git a/db_functions.py b/db_functions.py
index 6079eaa..1c469e4 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -52,7 +52,7 @@ def insert_game(engine_name, engineid, title, gameid, extra, platform, lang, con
         res = cursor.fetchone()
         if res is not None:
             exists = True
-            cursor.execute(f"SET @engine_last = '{res[0]}'")
+            cursor.execute(f"SET @engine_last = '{res['id']}'")
 
     # Insert into table if not present
     if not exists:
@@ -102,7 +102,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
 
         with conn.cursor() as cursor:
             cursor.execute(f"UPDATE fileset SET `timestamp` = FROM_UNIXTIME(@fileset_time_last) WHERE id = {existing_entry}")
-            cursor.execute("UPDATE fileset SET status = 'detection' WHERE id = {existing_entry} AND status = 'obsolete'")
+            cursor.execute(f"UPDATE fileset SET status = 'detection' WHERE id = {existing_entry} AND status = 'obsolete'")
             cursor.execute("DELETE FROM game WHERE id = @game_last")
         return False
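
The rewritten map_checksum_data() replaces token-by-token scanning with one
regex over the rom string. What the pattern captures, shown on an invented
rom line in the clrmamepro style this parser targets:

    import re

    content = 'name "resource.map" size 5118 md5-5000 2b1c2e8e6eeb5fd63d80d5a01a28ae24'
    print(re.findall(
        r'(\w+)\s+"([^"]*)"\s+size\s+(\d+)\s+md5-5000\s+([a-f0-9]+)', content))
    # [('name', 'resource.map', '5118', '2b1c2e8e6eeb5fd63d80d5a01a28ae24')]

The trade-off is strictness: any rom entry whose fields deviate from this
exact name/size/md5-5000 order is silently dropped by findall.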
 


Commit: 5d2ebc4bcf8c7fff550e38b5edc45a6f3e9a3e34
    https://github.com/scummvm/scummvm-sites/commit/5d2ebc4bcf8c7fff550e38b5edc45a6f3e9a3e34
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-16T21:34:41+08:00

Commit Message:
INTEGRITY: Print num of pages

Changed paths:
    pagination.py


diff --git a/pagination.py b/pagination.py
index d43fefa..62cf06a 100644
--- a/pagination.py
+++ b/pagination.py
@@ -74,7 +74,7 @@ def create_page(filename, results_per_page, records_table, select_query, order,
             num_of_results = cursor.fetchone()['COUNT(id)']
             
         num_of_pages = (num_of_results + results_per_page - 1) // results_per_page
-
+        print(f"Num of results: {num_of_results}, Num of pages: {num_of_pages}")
         if num_of_results == 0:
             return "No results for given filters"
 


Commit: a5fd396eec88709b793ac9c954a216ace7a88ca1
    https://github.com/scummvm/scummvm-sites/commit/a5fd396eec88709b793ac9c954a216ace7a88ca1
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-16T21:35:54+08:00

Commit Message:
INTEGRITY: Add overriding via command line
Handle exceptions

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 1c469e4..e9d4e87 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -5,6 +5,7 @@ import getpass
 import time
 import hashlib
 import os
+import argparse
 from pymysql.converters import escape_string
 
 def db_connect():
@@ -83,7 +84,11 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
             cursor.execute(f"SELECT id FROM fileset WHERE megakey = {megakey}")
 
         existing_entry = cursor.fetchone()
-
+    
+    parser = argparse.ArgumentParser()
+    parser.add_argument("--user", help="override user")
+    global args
+    args = parser.parse_args()
     if existing_entry is not None:
         existing_entry = existing_entry['id']
         with conn.cursor() as cursor:
@@ -94,7 +99,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         if src == 'user':
             log_text = f"Duplicate of Fileset:{existing_entry}, from user IP {ip}, {log_text}"
 
-        user = f'cli:{getpass.getuser()}'
+        user = args.user if args.user else f'cli:{getpass.getuser()}'
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
         if not detection:
@@ -121,7 +126,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
     if src == 'user':
         log_text = f"Created Fileset:{fileset_last}, from user IP {ip}, {log_text}"
 
-    user = f'cli:{getpass.getuser()}'
+    user = args.user if args.user else f'cli:{getpass.getuser()}'
     create_log(escape_string(category_text), user, escape_string(log_text), conn)
     with conn.cursor() as cursor:
         cursor.execute(f"INSERT INTO transactions (`transaction`, fileset) VALUES ({transaction}, {fileset_last})")
@@ -214,26 +219,44 @@ def db_insert(data_arr):
     resources = data_arr[2]
     filepath = data_arr[3]
 
-    conn = db_connect()
+    try:
+        conn = db_connect()
+    except Exception as e:
+        print(f"Failed to connect to database: {e}")
+        return
 
-    author = header["author"]
-    version = header["version"]
+    try:
+        author = header["author"]
+        version = header["version"]
+    except KeyError as e:
+        print(f"Missing key in header: {e}")
+        return
 
     src = "dat" if author not in ["scan", "scummvm"] else author
 
     detection = (src == "scummvm")
     status = "detection" if detection else src
 
-    conn.cursor().execute(f"SET @fileset_time_last = {int(time.time())}")
+    try:
+        conn.cursor().execute(f"SET @fileset_time_last = {int(time.time())}")
+    except Exception as e:
+        print(f"Failed to execute query: {e}")
+        return
 
-    with conn.cursor() as cursor:
-        cursor.execute("SELECT MAX(`transaction`) FROM transactions")
-        transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute("SELECT MAX(`transaction`) FROM transactions")
+            transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
+    except Exception as e:
+        print(f"Failed to execute query: {e}")
+    finally:
+        conn.close()
 
     category_text = f"Uploaded from {src}"
     log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Transaction: {transaction_id}"
 
-    user = f'cli:{getpass.getuser()}'
+    
+    user = args.user if args.user else f'cli:{getpass.getuser()}'
     create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
     for fileset in game_data:
@@ -274,7 +297,7 @@ def db_insert(data_arr):
     except Exception as e:
         print("Inserting failed:", e)
     else:
-        user = f'cli:{getpass.getuser()}'
+        user = args.user if args.user else f'cli:{getpass.getuser()}'
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
 def compare_filesets(id1, id2, conn):
@@ -441,7 +464,8 @@ def populate_matching_games():
         history_last = merge_filesets(matched_game["fileset"], fileset[0][0])
 
         if cursor.execute(query):
-            user = f'cli:{getpass.getuser()}'
+
+            user = args.user if args.user else f'cli:{getpass.getuser()}'
 
             # Merge log
             create_log("Fileset merge", user, escape_string(conn, f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0][0]}"))


Commit: 5f56ff397883d97397050108fbe3d713c4fd4ab6
    https://github.com/scummvm/scummvm-sites/commit/5f56ff397883d97397050108fbe3d713c4fd4ab6
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-16T21:36:16+08:00

Commit Message:
INTEGRITY: Print usage info

Changed paths:
    dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
index f01640d..8c8bcba 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -105,6 +105,10 @@ def parse_dat(dat_filepath):
     return header, game_data, resources, dat_filepath
 
 # Process command line args
+if len(sys.argv) == 1:
+    print("Usage: python dat_parser.py [--upload <filepaths>...] [--match]")
+    sys.exit(1)
+
 if "--upload" in sys.argv:
     index = sys.argv.index("--upload")
     for filepath in sys.argv[index + 1:]:
@@ -113,4 +117,4 @@ if "--upload" in sys.argv:
         db_insert(parse_dat(filepath))
 
 if "--match" in sys.argv:
-    populate_matching_games()
\ No newline at end of file
+    populate_matching_games()
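
With the guard in place, invoking the parser with no arguments now prints
the usage string and exits with status 1 instead of silently doing nothing:

    $ python dat_parser.py
    Usage: python dat_parser.py [--upload <filepaths>...] [--match]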


Commit: 462cf99e4f0bf44cce817e46dd69e1224e8361dc
    https://github.com/scummvm/scummvm-sites/commit/462cf99e4f0bf44cce817e46dd69e1224e8361dc
Author: inariindream (inariindream at 163.com)
Date: 2024-06-17T15:24:19+08:00

Commit Message:
INTEGRITY: Refactoring DB operations

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index e9d4e87..2284016 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -7,6 +7,7 @@ import hashlib
 import os
 import argparse
 from pymysql.converters import escape_string
+from collections import defaultdict
 
 def db_connect():
     with open('mysql_config.json') as f:
@@ -325,62 +326,62 @@ def status_to_match(status):
     order = ["detection", "dat", "scan", "partialmatch", "fullmatch", "user"]
     return order[:order.index(status)]
 
-def find_matching_game(game_files):
-    matching_games = []  # All matching games
-    matching_filesets = []  # All filesets containing one file from game_files
-    matches_count = 0  # Number of files with a matching detection entry
-
-    conn = db_connect()
+def get_fileset_ids_with_matching_files(files, conn):
+    if not files:
+        return []
+    
+    checksums = [file['checksum'] for file in files if 'checksum' in file]
+    if not checksums:
+        return []
 
-    for file in game_files:
-        checksum = file[1]
+    placeholders = ', '.join(['%s'] * len(checksums))
 
-        query = f"SELECT file.fileset as file_fileset FROM filechecksum JOIN file ON filechecksum.file = file.id WHERE filechecksum.checksum = '{checksum}' AND file.detection = TRUE"
-        with conn.cursor() as cursor:
-            cursor.execute(query)
-            records = cursor.fetchall()
+    query = f"""
+    SELECT DISTINCT file.fileset
+    FROM file
+    WHERE file.checksum IN ({placeholders})
+    """
+    
+    with conn.cursor() as cursor:
+        cursor.execute(query, checksums)
+        result = cursor.fetchall()
 
-        # If file is not part of detection entries, skip it
-        if len(records) == 0:
-            continue
+    fileset_ids = [row['fileset'] for row in result]
 
-        matches_count += 1
-        for record in records:
-            matching_filesets.append(record[0])
+    return fileset_ids
 
-    # Check if there is a fileset_id that is present in all results
-    for key, value in Counter(matching_filesets).items():
-        with conn.cursor() as cursor:
-            cursor.execute(f"SELECT COUNT(file.id) FROM file JOIN fileset ON file.fileset = fileset.id WHERE fileset.id = '{key}'")
-            count_files_in_fileset = cursor.fetchone()['COUNT(file.id)']
+def find_matching_game(game_files):
+    matching_games = defaultdict(list)
 
-        # We use < instead of != since one file may have more than one entry in the fileset
-        # We see this in Drascula English version, where one entry is duplicated
-        if value < matches_count or value < count_files_in_fileset:
-            continue
+    conn = db_connect()
 
-        with conn.cursor() as cursor:
-            cursor.execute(f"SELECT engineid, game.id, gameid, platform, language, `key`, src, fileset.id as fileset FROM game JOIN fileset ON fileset.game = game.id JOIN engine ON engine.id = game.engine WHERE fileset.id = '{key}'")
-            records = cursor.fetchall()
+    fileset_ids = get_fileset_ids_with_matching_files(game_files, conn)
 
-        matching_games.append(records[0])
+    while len(fileset_ids) > 100:
+        game_files.pop()
+        fileset_ids = get_fileset_ids_with_matching_files(game_files, conn)
 
-    if len(matching_games) != 1:
+    if not fileset_ids:
         return matching_games
+    
+    placeholders = ', '.join(['%s'] * len(fileset_ids))
+
+    query = f"""
+    SELECT fileset.id AS fileset, engineid, game.id AS game_id, gameid, platform, language, `key`, src
+    FROM fileset
+    JOIN file ON file.fileset = fileset.id
+    JOIN game ON game.id = fileset.game
+    JOIN engine ON engine.id = game.engine
+    WHERE fileset.id IN ({placeholders})
+    """
 
-    # Check the current fileset priority with that of the match
     with conn.cursor() as cursor:
-        cursor.execute(f"SELECT id FROM fileset, ({query}) AS res WHERE id = file_fileset AND status IN ({', '.join(['%s']*len(game_files[3]))})", status_to_match(game_files[3]))
+        cursor.execute(query, fileset_ids)
         records = cursor.fetchall()
-
-    # If priority order is correct
-    if len(records) != 0:
-        return matching_games
-
-    if compare_filesets(matching_games[0]['fileset'], game_files[0][0], conn):
-        with conn.cursor() as cursor:
-            cursor.execute(f"UPDATE fileset SET `delete` = TRUE WHERE id = {game_files[0][0]}")
-        return []
+    
+    for record in records:
+        fileset_id = record['fileset']
+        matching_games[fileset_id].append(record)
 
     return matching_games
 
@@ -420,14 +421,19 @@ def merge_filesets(detection_id, dat_id):
 def populate_matching_games():
     conn = db_connect()
 
-    # Getting unmatched filesets
     unmatched_filesets = []
+    unmatched_files = []
 
     with conn.cursor() as cursor:
-        cursor.execute("SELECT fileset.id, filechecksum.checksum, src, status FROM fileset JOIN file ON file.fileset = fileset.id JOIN filechecksum ON file.id = filechecksum.file WHERE fileset.game IS NULL AND status != 'user'")
+        cursor.execute("""
+            SELECT fileset.id, filechecksum.checksum, src, status
+            FROM fileset
+            JOIN file ON file.fileset = fileset.id
+            JOIN filechecksum ON file.id = filechecksum.file
+            WHERE fileset.game IS NULL AND status != 'user'
+        """)
         unmatched_files = cursor.fetchall()
 
-    # Splitting them into different filesets
     i = 0
     while i < len(unmatched_files):
         cur_fileset = unmatched_files[i][0]
@@ -440,40 +446,33 @@ def populate_matching_games():
     for fileset in unmatched_filesets:
         matching_games = find_matching_game(fileset)
 
-        if len(matching_games) != 1: # If there is no match/non-unique match
+        if len(matching_games) != 1:  
             continue
 
-        matched_game = matching_games[0]
+        matched_game = matching_games[list(matching_games.keys())[0]][0]
 
-        # Update status depending on $matched_game["src"] (dat -> partialmatch, scan -> fullmatch)
         status = fileset[0][2]
         if fileset[0][2] == "dat":
             status = "partialmatch"
         elif fileset[0][2] == "scan":
             status = "fullmatch"
 
-        # Convert NULL values to string with value NULL for printing
         matched_game = {k: 'NULL' if v is None else v for k, v in matched_game.items()}
 
         category_text = f"Matched from {fileset[0][2]}"
         log_text = f"Matched game {matched_game['engineid']}:\n{matched_game['gameid']}-{matched_game['platform']}-{matched_game['language']}\nvariant {matched_game['key']}. State {status}. Fileset:{fileset[0][0]}."
 
-        # Updating the fileset.game value to be $matched_game["id"]
-        query = f"UPDATE fileset SET game = {matched_game['id']}, status = '{status}', `key` = '{matched_game['key']}' WHERE id = {fileset[0][0]}"
+        query = f"UPDATE fileset SET game = {matched_game['game_id']}, status = '{status}', `key` = '{matched_game['key']}' WHERE id = {fileset[0][0]}"
 
         history_last = merge_filesets(matched_game["fileset"], fileset[0][0])
 
         if cursor.execute(query):
-
             user = args.user if args.user else f'cli:{getpass.getuser()}'
 
-            # Merge log
-            create_log("Fileset merge", user, escape_string(conn, f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0][0]}"))
+            create_log("Fileset merge", user, escape_string(f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0][0]}"), conn)
 
-            # Matching log
-            log_last = create_log(escape_string(conn, category_text), user, escape_string(conn, log_text))
+            log_last = create_log(escape_string(f"{category_text}"), user, escape_string(f"{log_text}"), conn)
 
-            # Add log id to the history table
             cursor.execute(f"UPDATE history SET log = {log_last} WHERE id = {history_last}")
 
         try:
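
The refactored get_fileset_ids_with_matching_files() builds a parameterized
IN clause rather than splicing checksums into the SQL text, letting the
driver handle quoting. The pattern in isolation, with invented checksum
values and the repo's DictCursor assumed:

    checksums = ['abc123', 'def456']
    placeholders = ', '.join(['%s'] * len(checksums))
    query = f"SELECT DISTINCT file.fileset FROM file WHERE file.checksum IN ({placeholders})"
    cursor.execute(query, checksums)
    fileset_ids = [row['fileset'] for row in cursor.fetchall()]

One caveat visible in the diff: the `while len(fileset_ids) > 100` loop
narrows the candidate set by popping entries off game_files, mutating the
caller's list in the process.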


Commit: 0c008131793343d6ba71cd9fcde15e39710ac6a4
    https://github.com/scummvm/scummvm-sites/commit/0c008131793343d6ba71cd9fcde15e39710ac6a4
Author: inariindream (inariindream at 163.com)
Date: 2024-06-17T15:33:38+08:00

Commit Message:
INTEGRITY: Use calc_megakey

Changed paths:
    user_fileset_functions.py


diff --git a/user_fileset_functions.py b/user_fileset_functions.py
index 88d4a83..871c6f5 100644
--- a/user_fileset_functions.py
+++ b/user_fileset_functions.py
@@ -1,6 +1,6 @@
 import hashlib
 import time
-from db_functions import db_connect, insert_fileset, insert_file, insert_filechecksum, find_matching_game, merge_filesets, create_log
+from db_functions import db_connect, insert_fileset, insert_file, insert_filechecksum, find_matching_game, merge_filesets, create_log, calc_megakey
 import getpass
 import pymysql
 
@@ -37,7 +37,7 @@ def user_insert_fileset(user_fileset, ip, conn):
     src = 'user'
     detection = False
     key = ''
-    megakey = user_calc_key(user_fileset)
+    megakey = calc_megakey(user_fileset)
     with conn.cursor() as cursor:
         cursor.execute("SELECT MAX(`transaction`) FROM transactions")
         transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1


Commit: 369fa216a05e146eef7110544e0a9262f4153e89
    https://github.com/scummvm/scummvm-sites/commit/369fa216a05e146eef7110544e0a9262f4153e89
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-17T21:16:31+08:00

Commit Message:
INTEGRITY: Create merge button and merge page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 14637fb..3b00020 100644
--- a/fileset.py
+++ b/fileset.py
@@ -169,6 +169,9 @@ def fileset():
                 for key, value in row.items():
                     if key != 'id':
                         html += f"<td>{value}</td>\n"
+                    if key == 'detection':
+                        if value == 1:
+                            html += f"<td><button onclick=\"location.href='/fileset/{row['id']}/merge'\">Merge</button></td>"
                 html += "</tr>\n"
                 counter += 1
             html += "</table>\n"
@@ -213,6 +216,184 @@ def fileset():
             return render_template_string(html)
     finally:
         connection.close()
+        
+ at app.route('/fileset/<int:id>/merge', methods=['GET', 'POST'])
+def merge_fileset(id):
+    if request.method == 'POST':
+        search_query = request.form['search']
+        
+        with open('mysql_config.json') as f:
+            mysql_cred = json.load(f)
+
+        connection = pymysql.connect(
+            host=mysql_cred["servername"],
+            user=mysql_cred["username"],
+            password=mysql_cred["password"],
+            db=mysql_cred["dbname"],
+            charset='utf8mb4',
+            cursorclass=pymysql.cursors.DictCursor
+        )
+
+        try:
+            with connection.cursor() as cursor:
+                query = f"""
+                SELECT id, gameid, platform, language
+                FROM fileset
+                WHERE gameid LIKE '%{search_query}%' OR platform LIKE '%{search_query}%' OR language LIKE '%{search_query}%'
+                """
+                cursor.execute(query)
+                results = cursor.fetchall()
+
+                html = f"""
+                <!DOCTYPE html>
+                <html>
+                <head>
+                    <link rel="stylesheet" type="text/css" href="{{{{ url_for('static', filename='style.css') }}}}">
+                </head>
+                <body>
+                <h2>Search Results for '{search_query}'</h2>
+                <form method="POST">
+                    <input type="text" name="search" placeholder="Search fileset">
+                    <input type="submit" value="Search">
+                </form>
+                <table>
+                <tr><th>ID</th><th>Game ID</th><th>Platform</th><th>Language</th><th>Action</th></tr>
+                """
+                for result in results:
+                    html += f"""
+                    <tr>
+                        <td>{result['id']}</td>
+                        <td>{result['gameid']}</td>
+                        <td>{result['platform']}</td>
+                        <td>{result['language']}</td>
+                        <td><a href="/fileset/{id}/merge/confirm?target_id={result['id']}">Select</a></td>
+                    </tr>
+                    """
+                html += "</table>\n"
+                html += "</body>\n</html>"
+
+                return render_template_string(html)
+
+        finally:
+            connection.close()
+
+    return '''
+    <!DOCTYPE html>
+    <html>
+    <head>
+        <link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
+    </head>
+    <body>
+    <h2>Search Fileset to Merge</h2>
+    <form method="POST">
+        <input type="text" name="search" placeholder="Search fileset">
+        <input type="submit" value="Search">
+    </form>
+    </body>
+    </html>
+    '''
+    
+ at app.route('/fileset/<int:id>/merge/confirm', methods=['GET', 'POST'])
+def confirm_merge(id):
+    target_id = request.args.get('target_id', type=int)
+
+    with open('mysql_config.json') as f:
+        mysql_cred = json.load(f)
+
+    connection = pymysql.connect(
+        host=mysql_cred["servername"],
+        user=mysql_cred["username"],
+        password=mysql_cred["password"],
+        db=mysql_cred["dbname"],
+        charset='utf8mb4',
+        cursorclass=pymysql.cursors.DictCursor
+    )
+
+    try:
+        with connection.cursor() as cursor:
+            cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
+            source_fileset = cursor.fetchone()
+
+            cursor.execute(f"SELECT * FROM fileset WHERE id = {target_id}")
+            target_fileset = cursor.fetchone()
+
+            html = """
+            <!DOCTYPE html>
+            <html>
+            <head>
+                <link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
+            </head>
+            <body>
+            <h2>Confirm Merge</h2>
+            <table>
+            <tr><th>Field</th><th>Source Fileset</th><th>Target Fileset</th></tr>
+            """
+
+            for column in source_fileset.keys():
+                if column != 'id':
+                    html += f"<tr><td>{column}</td><td>{source_fileset[column]}</td><td>{target_fileset[column]}</td></tr>"
+
+            html += """
+            </table>
+            <form method="POST" action="{{ url_for('execute_merge', id=id) }}">
+                <input type="hidden" name="source_id" value="{source_fileset['id']}">
+                <input type="hidden" name="target_id" value="{target_fileset['id']}">
+                <input type="submit" value="Confirm Merge">
+            </form>
+            <form action="/fileset/{id}">
+                <input type="submit" value="Cancel">
+            </form>
+            </body>
+            </html>
+            """
+            return render_template_string(html)
+
+    finally:
+        connection.close()
+
+ at app.route('/fileset/<int:id>/merge/execute', methods=['POST'])
+def execute_merge(id):
+    source_id = request.form['source_id']
+    target_id = request.form['target_id']
+
+    with open('mysql_config.json') as f:
+        mysql_cred = json.load(f)
+
+    connection = pymysql.connect(
+        host=mysql_cred["servername"],
+        user=mysql_cred["username"],
+        password=mysql_cred["password"],
+        db=mysql_cred["dbname"],
+        charset='utf8mb4',
+        cursorclass=pymysql.cursors.DictCursor
+    )
+
+    try:
+        with connection.cursor() as cursor:
+            cursor.execute(f"SELECT * FROM fileset WHERE id = {source_id}")
+            source_fileset = cursor.fetchone()
+
+            cursor.execute(f"""
+            UPDATE fileset SET
+                game = '{source_fileset['game']}',
+                status = '{source_fileset['status']}',
+                `key` = '{source_fileset['key']}',
+                megakey = '{source_fileset['megakey']}',
+                `timestamp` = '{source_fileset['timestamp']}'
+            WHERE id = {target_id}
+            """)
+
+            cursor.execute(f"""
+            INSERT INTO history (`timestamp`, fileset, oldfileset, log)
+            VALUES (NOW(), {target_id}, {source_id}, 'Merged fileset {source_id} into {target_id}')
+            """)
+
+            connection.commit()
+
+            return redirect(url_for('fileset', id=target_id))
+
+    finally:
+        connection.close()
 
 @app.route('/validate', methods=['POST'])
 def validate():
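
The merge search above interpolates the raw form input into three LIKE
clauses. A parameterized equivalent with the same shape (the column choice
itself is corrected in a later commit) lets the driver do the escaping:

    like = f"%{search_query}%"
    cursor.execute(
        """SELECT id, gameid, platform, language
           FROM fileset
           WHERE gameid LIKE %s OR platform LIKE %s OR language LIKE %s""",
        (like, like, like))
    results = cursor.fetchall()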


Commit: c3c5c3c57983661992cc177272c31188f06aa110
    https://github.com/scummvm/scummvm-sites/commit/c3c5c3c57983661992cc177272c31188f06aa110
Author: inariindream (inariindream at 163.com)
Date: 2024-06-18T14:33:07+08:00

Commit Message:
INTEGRITY: Fix merge page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 3b00020..ecb5aee 100644
--- a/fileset.py
+++ b/fileset.py
@@ -237,9 +237,10 @@ def merge_fileset(id):
         try:
             with connection.cursor() as cursor:
                 query = f"""
-                SELECT id, gameid, platform, language
+                SELECT fileset.id, game.id AS game_id, platform, language
                 FROM fileset
-                WHERE gameid LIKE '%{search_query}%' OR platform LIKE '%{search_query}%' OR language LIKE '%{search_query}%'
+                JOIN game ON game.id = fileset.id
+                WHERE game.id LIKE '%{search_query}%' OR platform LIKE '%{search_query}%' OR language LIKE '%{search_query}%'
                 """
                 cursor.execute(query)
                 results = cursor.fetchall()
@@ -263,7 +264,7 @@ def merge_fileset(id):
                     html += f"""
                     <tr>
                         <td>{result['id']}</td>
-                        <td>{result['gameid']}</td>
+                        <td>{result['game_id']}</td>
                         <td>{result['platform']}</td>
                         <td>{result['language']}</td>
                         <td><a href="/fileset/{id}/merge/confirm?target_id={result['id']}">Select</a></td>
@@ -328,7 +329,6 @@ def confirm_merge(id):
             <table>
             <tr><th>Field</th><th>Source Fileset</th><th>Target Fileset</th></tr>
             """
-
             for column in source_fileset.keys():
                 if column != 'id':
                     html += f"<tr><td>{column}</td><td>{source_fileset[column]}</td><td>{target_fileset[column]}</td></tr>"


Commit: 0233dbc831a2ba1fdd6496126f56f21fcb4a668c
    https://github.com/scummvm/scummvm-sites/commit/0233dbc831a2ba1fdd6496126f56f21fcb4a668c
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-18T20:03:21+08:00

Commit Message:
INTEGRITY: Fix SQL of the merge page

Changed paths:
    db_functions.py
    fileset.py


diff --git a/db_functions.py b/db_functions.py
index 2284016..3f6ca89 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -254,7 +254,7 @@ def db_insert(data_arr):
         conn.close()
 
     category_text = f"Uploaded from {src}"
-    log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Transaction: {transaction_id}"
+    log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}. Transaction: {transaction_id}"
 
     
     user = args.user if args.user else f'cli:{getpass.getuser()}'
diff --git a/fileset.py b/fileset.py
index ecb5aee..3468a4d 100644
--- a/fileset.py
+++ b/fileset.py
@@ -171,7 +171,7 @@ def fileset():
                         html += f"<td>{value}</td>\n"
                     if key == 'detection':
                         if value == 1:
-                            html += f"<td><button onclick=\"location.href='/fileset/{row['id']}/merge'\">Merge</button></td>"
+                            html += f"<td><button onclick=\"location.href='/fileset/{id}/merge'\">Merge</button></td>"
                 html += "</tr>\n"
                 counter += 1
             html += "</table>\n"
@@ -237,10 +237,10 @@ def merge_fileset(id):
         try:
             with connection.cursor() as cursor:
                 query = f"""
-                SELECT fileset.id, game.id AS game_id, platform, language
+                SELECT fileset.id, game.id AS game_id, platform, language, game.name
                 FROM fileset
                 JOIN game ON game.id = fileset.id
-                WHERE game.id LIKE '%{search_query}%' OR platform LIKE '%{search_query}%' OR language LIKE '%{search_query}%'
+                WHERE game.name LIKE '%{search_query}%' OR platform LIKE '%{search_query}%' OR language LIKE '%{search_query}%'
                 """
                 cursor.execute(query)
                 results = cursor.fetchall()
@@ -258,13 +258,13 @@ def merge_fileset(id):
                     <input type="submit" value="Search">
                 </form>
                 <table>
-                <tr><th>ID</th><th>Game ID</th><th>Platform</th><th>Language</th><th>Action</th></tr>
+                <tr><th>ID</th><th>Game Name</th><th>Platform</th><th>Language</th><th>Action</th></tr>
                 """
                 for result in results:
                     html += f"""
                     <tr>
                         <td>{result['id']}</td>
-                        <td>{result['game_id']}</td>
+                        <td>{result['name']}</td>
                         <td>{result['platform']}</td>
                         <td>{result['language']}</td>
                         <td><a href="/fileset/{id}/merge/confirm?target_id={result['id']}">Select</a></td>
@@ -312,10 +312,10 @@ def confirm_merge(id):
 
     try:
         with connection.cursor() as cursor:
-            cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
+            cursor.execute(f"SELECT * FROM game WHERE id = {id}")
             source_fileset = cursor.fetchone()
 
-            cursor.execute(f"SELECT * FROM fileset WHERE id = {target_id}")
+            cursor.execute(f"SELECT * FROM game WHERE id = {target_id}")
             target_fileset = cursor.fetchone()
 
             html = """
@@ -333,7 +333,7 @@ def confirm_merge(id):
                 if column != 'id':
                     html += f"<tr><td>{column}</td><td>{source_fileset[column]}</td><td>{target_fileset[column]}</td></tr>"
 
-            html += """
+            html += f"""
             </table>
             <form method="POST" action="{{ url_for('execute_merge', id=id) }}">
                 <input type="hidden" name="source_id" value="{source_fileset['id']}">


Commit: a855acefe3b3f05305fa33e9871a6b0b0188aff0
    https://github.com/scummvm/scummvm-sites/commit/a855acefe3b3f05305fa33e9871a6b0b0188aff0
Author: inariindream (inariindream at 163.com)
Date: 2024-06-19T15:35:15+08:00

Commit Message:
INTEGRITY: Revert changes of user overriding

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 3f6ca89..291bf43 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -5,7 +5,6 @@ import getpass
 import time
 import hashlib
 import os
-import argparse
 from pymysql.converters import escape_string
 from collections import defaultdict
 
@@ -85,11 +84,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
             cursor.execute(f"SELECT id FROM fileset WHERE megakey = {megakey}")
 
         existing_entry = cursor.fetchone()
-    
-    parser = argparse.ArgumentParser()
-    parser.add_argument("--user", help="override user")
-    global args
-    args = parser.parse_args()
+
     if existing_entry is not None:
         existing_entry = existing_entry['id']
         with conn.cursor() as cursor:
@@ -100,7 +95,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         if src == 'user':
             log_text = f"Duplicate of Fileset:{existing_entry}, from user IP {ip}, {log_text}"
 
-        user = args.user if args.user else f'cli:{getpass.getuser()}'
+        user = f'cli:{getpass.getuser()}'
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
         if not detection:
@@ -127,7 +122,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
     if src == 'user':
         log_text = f"Created Fileset:{fileset_last}, from user IP {ip}, {log_text}"
 
-    user = args.user if args.user else f'cli:{getpass.getuser()}'
+    user = f'cli:{getpass.getuser()}'
     create_log(escape_string(category_text), user, escape_string(log_text), conn)
     with conn.cursor() as cursor:
         cursor.execute(f"INSERT INTO transactions (`transaction`, fileset) VALUES ({transaction}, {fileset_last})")
@@ -256,8 +251,7 @@ def db_insert(data_arr):
     category_text = f"Uploaded from {src}"
     log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}. Transaction: {transaction_id}"
 
-    
-    user = args.user if args.user else f'cli:{getpass.getuser()}'
+    user = f'cli:{getpass.getuser()}'
     create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
     for fileset in game_data:
@@ -298,7 +292,7 @@ def db_insert(data_arr):
     except Exception as e:
         print("Inserting failed:", e)
     else:
-        user = args.user if args.user else f'cli:{getpass.getuser()}'
+        user = f'cli:{getpass.getuser()}'
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
 def compare_filesets(id1, id2, conn):
@@ -467,7 +461,7 @@ def populate_matching_games():
         history_last = merge_filesets(matched_game["fileset"], fileset[0][0])
 
         if cursor.execute(query):
-            user = args.user if args.user else f'cli:{getpass.getuser()}'
+            user = f'cli:{getpass.getuser()}'
 
             create_log("Fileset merge", user, escape_string(f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0][0]}"), conn)
 


Commit: 899f21b6e69831d148d976b00b79cebe8e9f19d6
    https://github.com/scummvm/scummvm-sites/commit/899f21b6e69831d148d976b00b79cebe8e9f19d6
Author: inariindream (inariindream at 163.com)
Date: 2024-06-19T16:13:31+08:00

Commit Message:
INTEGRITY: Fix Encoding of dat parser

Changed paths:
    dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
index 8c8bcba..b424334 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -80,7 +80,7 @@ def parse_dat(dat_filepath):
         print("File not readable")
         return
 
-    with open(dat_filepath, "r") as dat_file:
+    with open(dat_filepath, "r", encoding="utf-8") as dat_file:
         content = dat_file.read()
 
     header = {}
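
Without an explicit encoding, open() falls back to the platform default
(locale-dependent on Linux, commonly cp1252 on Windows), so a DAT file with
non-ASCII game names could parse differently per machine or raise
UnicodeDecodeError. Pinning utf-8 makes the parse deterministic:

    with open(dat_filepath, "r", encoding="utf-8") as dat_file:
        content = dat_file.read()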


Commit: 0689208cdb2e00d541f1e16f4af5a4f3b331f75c
    https://github.com/scummvm/scummvm-sites/commit/0689208cdb2e00d541f1e16f4af5a4f3b331f75c
Author: inariindream (inariindream at 163.com)
Date: 2024-06-19T16:14:10+08:00

Commit Message:
INTEGRITY: Revert incorrect DB operations

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 291bf43..b02a5f2 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -6,7 +6,6 @@ import time
 import hashlib
 import os
 from pymysql.converters import escape_string
-from collections import defaultdict
 
 def db_connect():
     with open('mysql_config.json') as f:
@@ -233,20 +232,11 @@ def db_insert(data_arr):
     detection = (src == "scummvm")
     status = "detection" if detection else src
 
-    try:
-        conn.cursor().execute(f"SET @fileset_time_last = {int(time.time())}")
-    except Exception as e:
-        print(f"Failed to execute query: {e}")
-        return
+    conn.cursor().execute(f"SET @fileset_time_last = {int(time.time())}")
 
-    try:
-        with conn.cursor() as cursor:
-            cursor.execute("SELECT MAX(`transaction`) FROM transactions")
-            transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
-    except Exception as e:
-        print(f"Failed to execute query: {e}")
-    finally:
-        conn.close()
+    with conn.cursor() as cursor:
+        cursor.execute("SELECT MAX(`transaction`) FROM transactions")
+        transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
 
     category_text = f"Uploaded from {src}"
     log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}. Transaction: {transaction_id}"
@@ -320,62 +310,62 @@ def status_to_match(status):
     order = ["detection", "dat", "scan", "partialmatch", "fullmatch", "user"]
     return order[:order.index(status)]
 
-def get_fileset_ids_with_matching_files(files, conn):
-    if not files:
-        return []
-    
-    checksums = [file['checksum'] for file in files if 'checksum' in file]
-    if not checksums:
-        return []
+def find_matching_game(game_files):
+    matching_games = []  # All matching games
+    matching_filesets = []  # All filesets containing one file from game_files
+    matches_count = 0  # Number of files with a matching detection entry
 
-    placeholders = ', '.join(['%s'] * len(checksums))
+    conn = db_connect()
 
-    query = f"""
-    SELECT DISTINCT file.fileset
-    FROM file
-    WHERE file.checksum IN ({placeholders})
-    """
-    
-    with conn.cursor() as cursor:
-        cursor.execute(query, checksums)
-        result = cursor.fetchall()
+    for file in game_files:
+        checksum = file[1]
 
-    fileset_ids = [row['fileset'] for row in result]
+        query = f"SELECT file.fileset as file_fileset FROM filechecksum JOIN file ON filechecksum.file = file.id WHERE filechecksum.checksum = '{checksum}' AND file.detection = TRUE"
+        with conn.cursor() as cursor:
+            cursor.execute(query)
+            records = cursor.fetchall()
 
-    return fileset_ids
+        # If file is not part of detection entries, skip it
+        if len(records) == 0:
+            continue
 
-def find_matching_game(game_files):
-    matching_games = defaultdict(list)
+        matches_count += 1
+        for record in records:
+            matching_filesets.append(record[0])
 
-    conn = db_connect()
+    # Check if there is a fileset_id that is present in all results
+    for key, value in Counter(matching_filesets).items():
+        with conn.cursor() as cursor:
+            cursor.execute(f"SELECT COUNT(file.id) FROM file JOIN fileset ON file.fileset = fileset.id WHERE fileset.id = '{key}'")
+            count_files_in_fileset = cursor.fetchone()['COUNT(file.id)']
 
-    fileset_ids = get_fileset_ids_with_matching_files(game_files, conn)
+        # We use < instead of != since one file may have more than one entry in the fileset
+        # We see this in Drascula English version, where one entry is duplicated
+        if value < matches_count or value < count_files_in_fileset:
+            continue
 
-    while len(fileset_ids) > 100:
-        game_files.pop()
-        fileset_ids = get_fileset_ids_with_matching_files(game_files, conn)
+        with conn.cursor() as cursor:
+            cursor.execute(f"SELECT engineid, game.id, gameid, platform, language, `key`, src, fileset.id as fileset FROM game JOIN fileset ON fileset.game = game.id JOIN engine ON engine.id = game.engine WHERE fileset.id = '{key}'")
+            records = cursor.fetchall()
 
-    if not fileset_ids:
-        return matching_games
-    
-    placeholders = ', '.join(['%s'] * len(fileset_ids))
+        matching_games.append(records[0])
 
-    query = f"""
-    SELECT fileset.id AS fileset, engineid, game.id AS game_id, gameid, platform, language, `key`, src
-    FROM fileset
-    JOIN file ON file.fileset = fileset.id
-    JOIN game ON game.id = fileset.game
-    JOIN engine ON engine.id = game.engine
-    WHERE fileset.id IN ({placeholders})
-    """
+    if len(matching_games) != 1:
+        return matching_games
 
+    # Check the current fileset priority with that of the match
     with conn.cursor() as cursor:
-        cursor.execute(query, fileset_ids)
+        cursor.execute(f"SELECT id FROM fileset, ({query}) AS res WHERE id = file_fileset AND status IN ({', '.join(['%s']*len(game_files[3]))})", status_to_match(game_files[3]))
         records = cursor.fetchall()
-    
-    for record in records:
-        fileset_id = record['fileset']
-        matching_games[fileset_id].append(record)
+
+    # If priority order is correct
+    if len(records) != 0:
+        return matching_games
+
+    if compare_filesets(matching_games[0]['fileset'], game_files[0][0], conn):
+        with conn.cursor() as cursor:
+            cursor.execute(f"UPDATE fileset SET `delete` = TRUE WHERE id = {game_files[0][0]}")
+        return []
 
     return matching_games
 
@@ -415,19 +405,14 @@ def merge_filesets(detection_id, dat_id):
 def populate_matching_games():
     conn = db_connect()
 
+    # Getting unmatched filesets
     unmatched_filesets = []
-    unmatched_files = []
 
     with conn.cursor() as cursor:
-        cursor.execute("""
-            SELECT fileset.id, filechecksum.checksum, src, status
-            FROM fileset
-            JOIN file ON file.fileset = fileset.id
-            JOIN filechecksum ON file.id = filechecksum.file
-            WHERE fileset.game IS NULL AND status != 'user'
-        """)
+        cursor.execute("SELECT fileset.id, filechecksum.checksum, src, status FROM fileset JOIN file ON file.fileset = fileset.id JOIN filechecksum ON file.id = filechecksum.file WHERE fileset.game IS NULL AND status != 'user'")
         unmatched_files = cursor.fetchall()
 
+    # Splitting them into different filesets
     i = 0
     while i < len(unmatched_files):
         cur_fileset = unmatched_files[i][0]
@@ -440,23 +425,26 @@ def populate_matching_games():
     for fileset in unmatched_filesets:
         matching_games = find_matching_game(fileset)
 
-        if len(matching_games) != 1:  
+        if len(matching_games) != 1: # If there is no match/non-unique match
             continue
 
-        matched_game = matching_games[list(matching_games.keys())[0]][0]
+        matched_game = matching_games[0]
 
+        # Update status depending on $matched_game["src"] (dat -> partialmatch, scan -> fullmatch)
         status = fileset[0][2]
         if fileset[0][2] == "dat":
             status = "partialmatch"
         elif fileset[0][2] == "scan":
             status = "fullmatch"
 
+        # Convert NULL values to string with value NULL for printing
         matched_game = {k: 'NULL' if v is None else v for k, v in matched_game.items()}
 
         category_text = f"Matched from {fileset[0][2]}"
         log_text = f"Matched game {matched_game['engineid']}:\n{matched_game['gameid']}-{matched_game['platform']}-{matched_game['language']}\nvariant {matched_game['key']}. State {status}. Fileset:{fileset[0][0]}."
 
-        query = f"UPDATE fileset SET game = {matched_game['game_id']}, status = '{status}', `key` = '{matched_game['key']}' WHERE id = {fileset[0][0]}"
+        # Updating the fileset.game value to be $matched_game["id"]
+        query = f"UPDATE fileset SET game = {matched_game['id']}, status = '{status}', `key` = '{matched_game['key']}' WHERE id = {fileset[0][0]}"
 
         history_last = merge_filesets(matched_game["fileset"], fileset[0][0])
 
@@ -465,8 +453,10 @@ def populate_matching_games():
 
             create_log("Fileset merge", user, escape_string(f"Merged Fileset:{matched_game['fileset']} and Fileset:{fileset[0][0]}"), conn)
 
-            log_last = create_log(escape_string(f"{category_text}"), user, escape_string(f"{log_text}"), conn)
+            # Matching log
+            log_last = create_log(escape_string(conn, category_text), user, escape_string(conn, log_text))
 
+            # Add log id to the history table
             cursor.execute(f"UPDATE history SET log = {log_last} WHERE id = {history_last}")
 
         try:


Commit: 71835b0ef6efc4064e250b9cdcba9213b0f1a42e
    https://github.com/scummvm/scummvm-sites/commit/71835b0ef6efc4064e250b9cdcba9213b0f1a42e
Author: inariindream (inariindream at 163.com)
Date: 2024-06-19T16:15:05+08:00

Commit Message:
INTEGRITY: Fix merge confirm and merge execute page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 3468a4d..0176b2d 100644
--- a/fileset.py
+++ b/fileset.py
@@ -312,10 +312,13 @@ def confirm_merge(id):
 
     try:
         with connection.cursor() as cursor:
-            cursor.execute(f"SELECT * FROM game WHERE id = {id}")
+            cursor.execute(f"""SELECT * FROM fileset JOIN game ON game.id = fileset.id WHERE fileset.id = {id}
+            """)
             source_fileset = cursor.fetchone()
 
-            cursor.execute(f"SELECT * FROM game WHERE id = {target_id}")
+            cursor.execute(f"""SELECT * FROM fileset JOIN game ON game.id = fileset.id WHERE fileset.id = {target_id}
+                            
+            """)
             target_fileset = cursor.fetchone()
 
             html = """
@@ -330,23 +333,24 @@ def confirm_merge(id):
             <tr><th>Field</th><th>Source Fileset</th><th>Target Fileset</th></tr>
             """
             for column in source_fileset.keys():
+                print(column)
                 if column != 'id':
                     html += f"<tr><td>{column}</td><td>{source_fileset[column]}</td><td>{target_fileset[column]}</td></tr>"
 
-            html += f"""
+            html += """
             </table>
             <form method="POST" action="{{ url_for('execute_merge', id=id) }}">
-                <input type="hidden" name="source_id" value="{source_fileset['id']}">
-                <input type="hidden" name="target_id" value="{target_fileset['id']}">
+                <input type="hidden" name="source_id" value="{{ source_fileset['id'] }}">
+                <input type="hidden" name="target_id" value="{{ target_fileset['id'] }}">
                 <input type="submit" value="Confirm Merge">
             </form>
-            <form action="/fileset/{id}">
+            <form action="{{ url_for('fileset', id=id) }}">
                 <input type="submit" value="Cancel">
             </form>
             </body>
             </html>
             """
-            return render_template_string(html)
+            return render_template_string(html, source_fileset=source_fileset, target_fileset=target_fileset, id=id)
 
     finally:
         connection.close()
@@ -382,10 +386,10 @@ def execute_merge(id):
                 `timestamp` = '{source_fileset['timestamp']}'
             WHERE id = {target_id}
             """)
-
+                
             cursor.execute(f"""
             INSERT INTO history (`timestamp`, fileset, oldfileset, log)
-            VALUES (NOW(), {target_id}, {source_id}, 'Merged fileset {source_id} into {target_id}')
+            VALUES (NOW(), {target_id}, {source_id}, {1})
             """)
 
             connection.commit()
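
A minimal runnable sketch of the templating pattern this commit switches to: fileset values are handed to Jinja as template context instead of being interpolated into the HTML with f-strings, so the {{ ... }} placeholders are resolved (and escaped) by the engine. The demo route and the literal fileset dict below are illustrative, not part of the site.

from flask import Flask, render_template_string

app = Flask(__name__)

@app.route('/demo')
def demo():
    html = """
    <form method="POST" action="{{ url_for('demo') }}">
        <input type="hidden" name="source_id" value="{{ source_fileset['id'] }}">
        <input type="submit" value="Confirm Merge">
    </form>
    """
    # Values travel through the render context, not string interpolation:
    return render_template_string(html, source_fileset={'id': 42})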


Commit: ef8785aca2d6661e63e7ee5cbf0a70c3316e2b98
    https://github.com/scummvm/scummvm-sites/commit/ef8785aca2d6661e63e7ee5cbf0a70c3316e2b98
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-21T20:43:02+08:00

Commit Message:
INTEGRITY: Add 'platform' and 'language' to megakey's calc

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index b02a5f2..a59a8d0 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -77,7 +77,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
 
     # Check if key/megakey already exists, if so, skip insertion (no quotes on purpose)
     with conn.cursor() as cursor:
-        if detection:
+        if not detection:
             cursor.execute(f"SELECT id FROM fileset WHERE `key` = {key}")
         else:
             cursor.execute(f"SELECT id FROM fileset WHERE megakey = {megakey}")
@@ -198,10 +198,9 @@ def calc_key(fileset):
     key_string = key_string.strip(':')
     return hashlib.md5(key_string.encode()).hexdigest()
 
-def calc_megakey(files):
-    key_string = ""
-
-    for file in files:
+def calc_megakey(fileset):
+    key_string = f":{fileset['platform']}:{fileset['language']}"
+    for file in fileset['rom']:
         for key, value in file.items():
             key_string += ':' + str(value)
 
@@ -260,7 +259,8 @@ def db_insert(data_arr):
                 fileset["rom"] = fileset["rom"] + resources[fileset["romof"]]["rom"]
 
         key = calc_key(fileset) if detection else ""
-        megakey = calc_megakey(fileset['rom']) if not detection else ""
+        megakey = calc_megakey(fileset) if detection else ""
+        print(key, megakey)
         log_text = f"size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'."
 
         if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn):
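
For reference, a self-contained version of the updated calc_megakey with an illustrative input; two filesets with identical file data but a different platform or language now produce different megakeys.

import hashlib

def calc_megakey(fileset):
    # Platform and language seed the key before the per-file values.
    key_string = f":{fileset['platform']}:{fileset['language']}"
    for file in fileset['rom']:
        for key, value in file.items():
            key_string += ':' + str(value)
    return hashlib.md5(key_string.strip(':').encode()).hexdigest()

# Illustrative fileset shaped like the parsed DAT data:
fileset = {'platform': 'pc', 'language': 'en',
           'rom': [{'name': 'game.exe', 'size': 1024, 'md5-5000': 'deadbeef'}]}
print(calc_megakey(fileset))  # stable 32-character hex digest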


Commit: 8b4c27045656470d860dfd095af93611f575820e
    https://github.com/scummvm/scummvm-sites/commit/8b4c27045656470d860dfd095af93611f575820e
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-21T20:43:34+08:00

Commit Message:
INTEGRITY: Improve the query of fileset page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 0176b2d..c0b343a 100644
--- a/fileset.py
+++ b/fileset.py
@@ -101,10 +101,11 @@ def fileset():
         <h3>Fileset details</h3>
         <table>
         """
+            html += f"<td><button onclick=\"location.href='/fileset/{id}/merge'\">Merge</button></td>"
 
             cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
             result = cursor.fetchone()
-
+            print(result)
             html += "<h3>Fileset details</h3>"
             html += "<table>\n"
             if result['game']:
@@ -133,10 +134,10 @@ def fileset():
                 if k != 'widetable':
                     html += f"<input type='hidden' name='{k}' value='{v}'>"
             if widetable == 'true':
-                html += "<input class='hidden' type='text' name='widetable' value='false' />"
+                html += "<input class='hidden' type='text' name='widetable' value='true' />"
                 html += "<input type='submit' value='Hide extra checksums' />"
             else:
-                html += "<input class='hidden' type='text' name='widetable' value='true' />"
+                html += "<input class='hidden' type='text' name='widetable' value='false' />"
                 html += "<input type='submit' value='Expand Table' />"
             html += "</form>"
 
@@ -147,15 +148,21 @@ def fileset():
             result = cursor.fetchall()
 
             if widetable == 'true':
+                file_ids = [file['id'] for file in result]
+                cursor.execute(f"SELECT file, checksum, checksize, checktype FROM filechecksum WHERE file IN ({','.join(map(str, file_ids))})")
+                checksums = cursor.fetchall()
+
+                checksum_dict = {}
+                for checksum in checksums:
+                    if checksum['checksize'] != 0:
+                        key = f"{checksum['checktype']}-{checksum['checksize']}"
+                        if checksum['file'] not in checksum_dict:
+                            checksum_dict[checksum['file']] = {}
+                        checksum_dict[checksum['file']][key] = checksum['checksum']
+
                 for index, file in enumerate(result):
-                    cursor.execute(f"SELECT checksum, checksize, checktype FROM filechecksum WHERE file = {file['id']}")
-                    while True:
-                        spec_checksum = cursor.fetchone()
-                        if spec_checksum is None:
-                            break
-                        if spec_checksum['checksize'] == 0:
-                            continue
-                        result[index][f"{spec_checksum['checktype']}-{spec_checksum['checksize']}"] = spec_checksum['checksum']
+                    if file['id'] in checksum_dict:
+                        result[index].update(checksum_dict[file['id']])
 
             counter = 1
             for row in result:
@@ -169,9 +176,6 @@ def fileset():
                 for key, value in row.items():
                     if key != 'id':
                         html += f"<td>{value}</td>\n"
-                    if key == 'detection':
-                        if value == 1:
-                            html += f"<td><button onclick=\"location.href='/fileset/{id}/merge'\">Merge</button></td>"
                 html += "</tr>\n"
                 counter += 1
             html += "</table>\n"
@@ -200,18 +204,20 @@ def fileset():
             html += "<th>Category</th>\n"
             html += "<th>Description</th>\n"
             html += "<th>Log ID</th>\n"
-            cursor.execute(f"SELECT * FROM history")
+            cursor.execute("SELECT * FROM history")
             history = cursor.fetchall()
-            for history_row in history:
-                cursor.execute(f"SELECT `timestamp`, category, `text`, id FROM log WHERE `text` LIKE 'Fileset:{history_row['oldfileset']}%' AND `category` NOT LIKE 'merge%' ORDER BY `timestamp` DESC, id DESC")
-                logs = cursor.fetchall()
-                for log in logs:
-                    html += "<tr>\n"
-                    html += f"<td>{log['timestamp']}</td>\n"
-                    html += f"<td>{log['category']}</td>\n"
-                    html += f"<td>{log['text']}</td>\n"
-                    html += f"<td><a href='logs?id={log['id']}'>{log['id']}</a></td>\n"
-                    html += "</tr>\n"
+
+            oldfilesets = [history_row['oldfileset'] for history_row in history]
+            cursor.execute(f"""SELECT `timestamp`, category, `text`, id FROM log WHERE `text` LIKE 'Fileset:%' AND `category` NOT LIKE 'merge%' AND `text` REGEXP 'Fileset:({"|".join(map(str, oldfilesets))})' ORDER BY `timestamp` DESC, id DESC""")
+            logs = cursor.fetchall()
+
+            for log in logs:
+                html += "<tr>\n"
+                html += f"<td>{log['timestamp']}</td>\n"
+                html += f"<td>{log['category']}</td>\n"
+                html += f"<td>{log['text']}</td>\n"
+                html += f"<td><a href='logs?id={log['id']}'>{log['id']}</a></td>\n"
+                html += "</tr>\n"
             html += "</table>\n"
             return render_template_string(html)
     finally:
@@ -312,11 +318,11 @@ def confirm_merge(id):
 
     try:
         with connection.cursor() as cursor:
-            cursor.execute(f"""SELECT * FROM fileset JOIN game ON game.id = fileset.id WHERE fileset.id = {id}
+            cursor.execute(f"""SELECT * FROM fileset WHERE fileset.id = {id}
             """)
             source_fileset = cursor.fetchone()
-
-            cursor.execute(f"""SELECT * FROM fileset JOIN game ON game.id = fileset.id WHERE fileset.id = {target_id}
+            print(source_fileset)
+            cursor.execute(f"""SELECT * FROM fileset WHERE fileset.id = {target_id}
                             
             """)
             target_fileset = cursor.fetchone()
@@ -333,9 +339,7 @@ def confirm_merge(id):
             <tr><th>Field</th><th>Source Fileset</th><th>Target Fileset</th></tr>
             """
             for column in source_fileset.keys():
-                print(column)
-                if column != 'id':
-                    html += f"<tr><td>{column}</td><td>{source_fileset[column]}</td><td>{target_fileset[column]}</td></tr>"
+                html += f"<tr><td>{column}</td><td>{source_fileset[column]}</td><td>{target_fileset[column]}</td></tr>"
 
             html += """
             </table>
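
The checksum change above replaces a per-file SELECT with one batched IN (...) query; a runnable sketch of the regrouping step, using illustrative rows in place of the cursor results:

# Rows as returned by the single batched filechecksum query:
rows = [
    {'file': 1, 'checksum': 'aaa', 'checksize': 5000, 'checktype': 'md5'},
    {'file': 1, 'checksum': 'bbb', 'checksize': 0,    'checktype': 'md5'},
    {'file': 2, 'checksum': 'ccc', 'checksize': 5000, 'checktype': 'md5'},
]
checksum_dict = {}
for checksum in rows:
    if checksum['checksize'] != 0:  # checksize-0 rows are skipped, as in the diff
        key = f"{checksum['checktype']}-{checksum['checksize']}"
        checksum_dict.setdefault(checksum['file'], {})[key] = checksum['checksum']

result = [{'id': 1, 'name': 'game.exe'}, {'id': 2, 'name': 'data.dat'}]
for index, file in enumerate(result):
    if file['id'] in checksum_dict:
        result[index].update(checksum_dict[file['id']])
print(result)  # each file row gains its md5-<size> style columns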


Commit: def33f824e41bf290e0604cf752e2aa57b0b53ee
    https://github.com/scummvm/scummvm-sites/commit/def33f824e41bf290e0604cf752e2aa57b0b53ee
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-21T22:31:52+08:00

Commit Message:
INTEGRITY: Add more info when comparing at confirm page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index c0b343a..ec89515 100644
--- a/fileset.py
+++ b/fileset.py
@@ -318,12 +318,45 @@ def confirm_merge(id):
 
     try:
         with connection.cursor() as cursor:
-            cursor.execute(f"""SELECT * FROM fileset WHERE fileset.id = {id}
+            cursor.execute(f"""
+                SELECT 
+                    fs.*, 
+                    g.name AS game_name, 
+                    g.engine AS game_engine, 
+                    g.platform AS game_platform,
+                    g.language AS game_language,
+                    f.name AS file_name, 
+                    f.size AS file_size, 
+                    f.checksum AS file_checksum 
+                FROM 
+                    fileset fs
+                LEFT JOIN 
+                    game g ON fs.game = g.id
+                LEFT JOIN 
+                    file f ON fs.id = f.fileset
+                WHERE 
+                    fs.id = {id}
             """)
             source_fileset = cursor.fetchone()
             print(source_fileset)
-            cursor.execute(f"""SELECT * FROM fileset WHERE fileset.id = {target_id}
-                            
+            cursor.execute(f"""
+                SELECT 
+                    fs.*, 
+                    g.name AS game_name, 
+                    g.engine AS game_engine, 
+                    g.platform AS game_platform,
+                    g.language AS game_language,
+                    f.name AS file_name, 
+                    f.size AS file_size, 
+                    f.checksum AS file_checksum 
+                FROM 
+                    fileset fs
+                LEFT JOIN 
+                    game g ON fs.game = g.id
+                LEFT JOIN 
+                    file f ON fs.id = f.fileset
+                WHERE 
+                    fs.id = {target_id}
             """)
             target_fileset = cursor.fetchone()
 


Commit: 0ddaa1bd9825dcc68e8dd960377ee1f4a893eec6
    https://github.com/scummvm/scummvm-sites/commit/0ddaa1bd9825dcc68e8dd960377ee1f4a893eec6
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-21T22:33:09+08:00

Commit Message:
INTEGRITY: Highlight difference in the data

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index ec89515..c618873 100644
--- a/fileset.py
+++ b/fileset.py
@@ -5,6 +5,7 @@ import re
 import os
 from user_fileset_functions import user_calc_key, file_json_to_array, user_insert_queue, user_insert_fileset, match_and_merge_user_filesets
 from pagination import create_page
+import difflib
 
 app = Flask(__name__)
 
@@ -360,6 +361,20 @@ def confirm_merge(id):
             """)
             target_fileset = cursor.fetchone()
 
+            def highlight_differences(source, target):
+                diff = difflib.ndiff(source, target)
+                source_highlighted = ""
+                target_highlighted = ""
+                for d in diff:
+                    if d.startswith('-'):
+                        source_highlighted += f"<span style='color: green;'>{d[2:]}</span>"
+                    elif d.startswith('+'):
+                        target_highlighted += f"<span style='color: red;'>{d[2:]}</span>"
+                    elif d.startswith(' '):
+                        source_highlighted += d[2:]
+                        target_highlighted += d[2:]
+                return source_highlighted, target_highlighted
+
             html = """
             <!DOCTYPE html>
             <html>
@@ -368,11 +383,17 @@ def confirm_merge(id):
             </head>
             <body>
             <h2>Confirm Merge</h2>
-            <table>
+            <table border="1">
             <tr><th>Field</th><th>Source Fileset</th><th>Target Fileset</th></tr>
             """
             for column in source_fileset.keys():
-                html += f"<tr><td>{column}</td><td>{source_fileset[column]}</td><td>{target_fileset[column]}</td></tr>"
+                source_value = str(source_fileset[column])
+                target_value = str(target_fileset[column])
+                if source_value != target_value:
+                    source_highlighted, target_highlighted = highlight_differences(source_value, target_value)
+                    html += f"<tr><td>{column}</td><td>{source_highlighted}</td><td>{target_highlighted}</td></tr>"
+                else:
+                    html += f"<tr><td>{column}</td><td>{source_value}</td><td>{target_value}</td></tr>"
 
             html += """
             </table>
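
For intuition: character-level ndiff emits one entry per character, and the function above wraps '-' entries (source only) in green spans and '+' entries (target only) in red spans. A tiny illustration:

import difflib

print(list(difflib.ndiff("pc", "mac")))
# ['- p', '+ m', '+ a', '  c']
# => the source cell renders as <span>p</span>c (green), the target cell as
#    <span>m</span><span>a</span>c (red), with the shared 'c' left plain.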


Commit: f897ee2e334029e383690f5599be9acd81b86384
    https://github.com/scummvm/scummvm-sites/commit/f897ee2e334029e383690f5599be9acd81b86384
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-22T01:16:58+08:00

Commit Message:
INTEGRITY: Update metadata when megakey matching

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index a59a8d0..74690e2 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -77,10 +77,10 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
 
     # Check if key/megakey already exists, if so, skip insertion (no quotes on purpose)
     with conn.cursor() as cursor:
-        if not detection:
-            cursor.execute(f"SELECT id FROM fileset WHERE `key` = {key}")
-        else:
+        if detection:
             cursor.execute(f"SELECT id FROM fileset WHERE megakey = {megakey}")
+        else:
+            cursor.execute(f"SELECT id FROM fileset WHERE `key` = {key}")
 
         existing_entry = cursor.fetchone()
 
@@ -88,22 +88,14 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         existing_entry = existing_entry['id']
         with conn.cursor() as cursor:
             cursor.execute(f"SET @fileset_last = {existing_entry}")
+            cursor.execute(f"UPDATE fileset SET `timestamp` = FROM_UNIXTIME(@fileset_time_last) WHERE id = {existing_entry}")
+            cursor.execute(f"UPDATE fileset SET status = 'detection' WHERE id = {existing_entry} AND status = 'obsolete'")
 
-        category_text = f"Uploaded from {src}"
-        log_text = f"Duplicate of Fileset:{existing_entry}, {log_text}"
-        if src == 'user':
-            log_text = f"Duplicate of Fileset:{existing_entry}, from user IP {ip}, {log_text}"
-
+        category_text = f"Updated Fileset:{existing_entry}"
+        log_text = f"Updated Fileset:{existing_entry}, {log_text}"
         user = f'cli:{getpass.getuser()}'
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
-        if not detection:
-            return False
-
-        with conn.cursor() as cursor:
-            cursor.execute(f"UPDATE fileset SET `timestamp` = FROM_UNIXTIME(@fileset_time_last) WHERE id = {existing_entry}")
-            cursor.execute(f"UPDATE fileset SET status = 'detection' WHERE id = {existing_entry} AND status = 'obsolete'")
-            cursor.execute("DELETE FROM game WHERE id = @game_last")
         return False
 
     # $game and $key should not be parsed as a mysql string, hence no quotes
@@ -261,7 +253,7 @@ def db_insert(data_arr):
         key = calc_key(fileset) if detection else ""
         megakey = calc_megakey(fileset) if detection else ""
         print(key, megakey)
-        log_text = f"size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'."
+        log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}."
 
         if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn):
             for file in fileset["rom"]:
@@ -278,7 +270,7 @@ def db_insert(data_arr):
         cur.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
         fileset_insertion_count = cur.fetchone()['COUNT(fileset)']
         category_text = f"Uploaded from {src}"
-        log_text = f"Completed loading DAT file, filename '{filepath}', size {os.path.getsize(filepath)}, author '{author}', version {version}. State '{status}'. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
+        log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
     except Exception as e:
         print("Inserting failed:", e)
     else:
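
A runnable miniature of the refresh-on-duplicate behaviour introduced here, using sqlite3 in place of MySQL purely for illustration: a megakey hit now refreshes the timestamp and flips an 'obsolete' fileset back to 'detection' instead of deleting the game row.

import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE fileset (id INTEGER PRIMARY KEY, status TEXT, ts TEXT)")
conn.execute("INSERT INTO fileset VALUES (1, 'obsolete', 'old')")

existing_entry = 1
conn.execute("UPDATE fileset SET ts = 'now' WHERE id = ?", (existing_entry,))
conn.execute("UPDATE fileset SET status = 'detection' WHERE id = ? AND status = 'obsolete'",
             (existing_entry,))
print(conn.execute("SELECT * FROM fileset").fetchone())  # (1, 'detection', 'now')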


Commit: 962b6b900f96218b6857d068c0ff7eb6c1e490d9
    https://github.com/scummvm/scummvm-sites/commit/962b6b900f96218b6857d068c0ff7eb6c1e490d9
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-22T22:17:23+08:00

Commit Message:
INTEGRITY: Add argparse to dat parser

Changed paths:
    dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
index b424334..9490c32 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -2,6 +2,7 @@ import re
 import os
 import sys
 from db_functions import db_insert, populate_matching_games
+import argparse
 
 def remove_quotes(string):
     # Remove quotes from value if they are present
@@ -104,17 +105,21 @@ def parse_dat(dat_filepath):
     # print(header, game_data, resources)
     return header, game_data, resources, dat_filepath
 
-# Process command line args
-if len(sys.argv) == 1:
-    print("Usage: python dat_parser.py [--upload <filepaths>...] [--match]")
-    sys.exit(1)
+def main():
+    parser = argparse.ArgumentParser(description="Process DAT files and interact with the database.")
+    parser.add_argument('--upload', nargs='+', help='Upload DAT file(s) to the database')
+    parser.add_argument('--match', action='store_true', help='Populate matching games in the database')
+    parser.add_argument('--user', help='Username for database')
+    parser.add_argument('-r', help="Recurse through directories", action='store_true')
 
-if "--upload" in sys.argv:
-    index = sys.argv.index("--upload")
-    for filepath in sys.argv[index + 1:]:
-        if filepath == "--match":
-            continue
-        db_insert(parse_dat(filepath))
+    args = parser.parse_args()
+
+    if args.upload:
+        for filepath in args.upload:
+            db_insert(parse_dat(filepath))
+
+    if args.match:
+        populate_matching_games()
 
-if "--match" in sys.argv:
-    populate_matching_games()
+if __name__ == "__main__":
+    main()
\ No newline at end of file
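
The argparse rewrite makes the CLI self-documenting; a quick check of how it parses, with explicit (illustrative) argv lists:

import argparse

parser = argparse.ArgumentParser(description="Process DAT files and interact with the database.")
parser.add_argument('--upload', nargs='+', help='Upload DAT file(s) to the database')
parser.add_argument('--match', action='store_true', help='Populate matching games in the database')
parser.add_argument('--user', help='Username for database')
parser.add_argument('-r', action='store_true', help='Recurse through directories')

args = parser.parse_args(['--upload', 'scummvm.dat', 'scan.dat', '--match'])
print(args.upload)  # ['scummvm.dat', 'scan.dat']
print(args.match)   # True
print(args.user)    # None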


Commit: 412301dc2f19589af91101247b42133244ab46c4
    https://github.com/scummvm/scummvm-sites/commit/412301dc2f19589af91101247b42133244ab46c4
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-22T22:18:22+08:00

Commit Message:
INTEGRITY: Fix auto merging

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 74690e2..a3ca4d6 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -77,10 +77,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
 
     # Check if key/megakey already exists, if so, skip insertion (no quotes on purpose)
     with conn.cursor() as cursor:
-        if detection:
-            cursor.execute(f"SELECT id FROM fileset WHERE megakey = {megakey}")
-        else:
-            cursor.execute(f"SELECT id FROM fileset WHERE `key` = {key}")
+        cursor.execute(f"SELECT id FROM fileset WHERE megakey = {megakey}")
 
         existing_entry = cursor.fetchone()
 
@@ -88,6 +85,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         existing_entry = existing_entry['id']
         with conn.cursor() as cursor:
             cursor.execute(f"SET @fileset_last = {existing_entry}")
+            cursor.execute(f"DELETE FROM file WHERE fileset = {existing_entry}")
             cursor.execute(f"UPDATE fileset SET `timestamp` = FROM_UNIXTIME(@fileset_time_last) WHERE id = {existing_entry}")
             cursor.execute(f"UPDATE fileset SET status = 'detection' WHERE id = {existing_entry} AND status = 'obsolete'")
 
@@ -96,7 +94,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         user = f'cli:{getpass.getuser()}'
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
-        return False
+        return True
 
     # $game and $key should not be parsed as a mysql string, hence no quotes
     query = f"INSERT INTO fileset (game, status, src, `key`, megakey, `timestamp`) VALUES ({game}, '{status}', '{src}', {key}, {megakey}, FROM_UNIXTIME(@fileset_time_last))"
@@ -194,7 +192,8 @@ def calc_megakey(fileset):
     key_string = f":{fileset['platform']}:{fileset['language']}"
     for file in fileset['rom']:
         for key, value in file.items():
-            key_string += ':' + str(value)
+            if key != "name":
+                key_string += ':' + str(value)
 
     key_string = key_string.strip(':')
     return hashlib.md5(key_string.encode()).hexdigest()
@@ -252,7 +251,6 @@ def db_insert(data_arr):
 
         key = calc_key(fileset) if detection else ""
         megakey = calc_megakey(fileset) if detection else ""
-        print(key, megakey)
         log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}."
 
         if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn):


Commit: 12ae44b1e85f6cbe972310de29afed1a2f49e5cf
    https://github.com/scummvm/scummvm-sites/commit/12ae44b1e85f6cbe972310de29afed1a2f49e5cf
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-22T22:19:05+08:00

Commit Message:
INTEGRITY: Fix manual merging

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index c618873..d5ed21e 100644
--- a/fileset.py
+++ b/fileset.py
@@ -6,6 +6,7 @@ import os
 from user_fileset_functions import user_calc_key, file_json_to_array, user_insert_queue, user_insert_fileset, match_and_merge_user_filesets
 from pagination import create_page
 import difflib
+from pymysql.converters import escape_string
 
 app = Flask(__name__)
 
@@ -244,10 +245,19 @@ def merge_fileset(id):
         try:
             with connection.cursor() as cursor:
                 query = f"""
-                SELECT fileset.id, game.id AS game_id, platform, language, game.name
-                FROM fileset
-                JOIN game ON game.id = fileset.id
-                WHERE game.name LIKE '%{search_query}%' OR platform LIKE '%{search_query}%' OR language LIKE '%{search_query}%'
+                SELECT 
+                    fs.*, 
+                    g.name AS game_name, 
+                    g.engine AS game_engine, 
+                    g.platform AS game_platform,
+                    g.language AS game_language
+                FROM 
+                    fileset fs
+                LEFT JOIN 
+                    game g ON fs.game = g.id
+                LEFT JOIN 
+                    file f ON fs.id = f.fileset
+                WHERE g.name LIKE '%{search_query}%' OR g.platform LIKE '%{search_query}%' OR g.language LIKE '%{search_query}%'
                 """
                 cursor.execute(query)
                 results = cursor.fetchall()
@@ -271,9 +281,9 @@ def merge_fileset(id):
                     html += f"""
                     <tr>
                         <td>{result['id']}</td>
-                        <td>{result['name']}</td>
-                        <td>{result['platform']}</td>
-                        <td>{result['language']}</td>
+                        <td>{result['game_name']}</td>
+                        <td>{result['game_platform']}</td>
+                        <td>{result['game_language']}</td>
                         <td><a href="/fileset/{id}/merge/confirm?target_id={result['id']}">Select</a></td>
                     </tr>
                     """
@@ -325,16 +335,12 @@ def confirm_merge(id):
                     g.name AS game_name, 
                     g.engine AS game_engine, 
                     g.platform AS game_platform,
-                    g.language AS game_language,
-                    f.name AS file_name, 
-                    f.size AS file_size, 
-                    f.checksum AS file_checksum 
+                    g.language AS game_language
+                    (SELECT COUNT(*) FROM file WHERE fileset = fs.id) AS file_count
                 FROM 
                     fileset fs
                 LEFT JOIN 
                     game g ON fs.game = g.id
-                LEFT JOIN 
-                    file f ON fs.id = f.fileset
                 WHERE 
                     fs.id = {id}
             """)
@@ -346,16 +352,12 @@ def confirm_merge(id):
                     g.name AS game_name, 
                     g.engine AS game_engine, 
                     g.platform AS game_platform,
-                    g.language AS game_language,
-                    f.name AS file_name, 
-                    f.size AS file_size, 
-                    f.checksum AS file_checksum 
+                    g.language AS game_language
+                    (SELECT COUNT(*) FROM file WHERE fileset = fs.id) AS file_count
                 FROM 
                     fileset fs
                 LEFT JOIN 
                     game g ON fs.game = g.id
-                LEFT JOIN 
-                    file f ON fs.id = f.fileset
                 WHERE 
                     fs.id = {target_id}
             """)
@@ -389,6 +391,9 @@ def confirm_merge(id):
             for column in source_fileset.keys():
                 source_value = str(source_fileset[column])
                 target_value = str(target_fileset[column])
+                if column == 'id':
+                    html += f"<tr><td>{column}</td><td><a href='/fileset?id={source_value}'>{source_value}</a></td><td><a href='/fileset?id={source_value}'>{target_value}</a></td></tr>"
+                    continue
                 if source_value != target_value:
                     source_highlighted, target_highlighted = highlight_differences(source_value, target_value)
                     html += f"<tr><td>{column}</td><td>{source_highlighted}</td><td>{target_highlighted}</td></tr>"
@@ -445,11 +450,35 @@ def execute_merge(id):
             WHERE id = {target_id}
             """)
                 
+            cursor.execute(f"DELETE FROM file WHERE fileset = {target_id}")
+
+            cursor.execute(f"SELECT * FROM file WHERE fileset = {source_id}")
+            source_files = cursor.fetchall()
+
+            for file in source_files:
+                cursor.execute(f"""
+                INSERT INTO file (name, size, checksum, fileset, detection)
+                VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['checksum']}', {target_id}, {file['detection']})
+                """)
+
+                cursor.execute("SELECT LAST_INSERT_ID() as file_id")
+                new_file_id = cursor.fetchone()['file_id']
+                
+                cursor.execute(f"SELECT * FROM filechecksum WHERE file = {file['id']}")
+                file_checksums = cursor.fetchall()
+
+                for checksum in file_checksums:
+                    cursor.execute(f"""
+                    INSERT INTO filechecksum (file, checksize, checktype, checksum)
+                    VALUES ({new_file_id}, '{checksum['checksize']}', '{checksum['checktype']}', '{checksum['checksum']}')
+                    """)
+
             cursor.execute(f"""
             INSERT INTO history (`timestamp`, fileset, oldfileset, log)
-            VALUES (NOW(), {target_id}, {source_id}, {1})
+            VALUES (NOW(), {target_id}, {source_id}, 1)
             """)
 
+
             connection.commit()
 
             return redirect(url_for('fileset', id=target_id))
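
The new pymysql.converters import is what lets the INSERTs above survive file names containing quotes; a minimal demonstration (note the values are still spliced into the SQL string, escape_string only escapes quote and backslash characters):

from pymysql.converters import escape_string

name = "it's a file with 'quotes'"
print(escape_string(name))  # it\'s a file with \'quotes\'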


Commit: a629f0f90e3ab0912922b43bb55fbf4a0566e16a
    https://github.com/scummvm/scummvm-sites/commit/a629f0f90e3ab0912922b43bb55fbf4a0566e16a
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-23T20:31:31+08:00

Commit Message:
INTEGRITY: Overriding user via command line

Changed paths:
    dat_parser.py
    db_functions.py


diff --git a/dat_parser.py b/dat_parser.py
index 9490c32..bda6cec 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -116,7 +116,7 @@ def main():
 
     if args.upload:
         for filepath in args.upload:
-            db_insert(parse_dat(filepath))
+            db_insert(parse_dat(filepath), args.user)
 
     if args.match:
         populate_matching_games()
diff --git a/db_functions.py b/db_functions.py
index a3ca4d6..5f921a5 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -65,7 +65,7 @@ def insert_game(engine_name, engineid, title, gameid, extra, platform, lang, con
         cursor.execute(f"INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES ('{escape_string(title)}', @engine_last, '{gameid}', '{escape_string(extra)}', '{platform}', '{lang}')")
         cursor.execute("SET @game_last = LAST_INSERT_ID()")
 
-def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip=''):
+def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip='', username=None):
     status = "detection" if detection else src
     game = "NULL"
     key = "NULL" if key == "" else f"'{key}'"
@@ -91,7 +91,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
 
         category_text = f"Updated Fileset:{existing_entry}"
         log_text = f"Updated Fileset:{existing_entry}, {log_text}"
-        user = f'cli:{getpass.getuser()}'
+        user = f'cli:{getpass.getuser()}' if username is None else username
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
         return True
@@ -111,7 +111,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
     if src == 'user':
         log_text = f"Created Fileset:{fileset_last}, from user IP {ip}, {log_text}"
 
-    user = f'cli:{getpass.getuser()}'
+    user = f'cli:{getpass.getuser()}' if username is None else username
     create_log(escape_string(category_text), user, escape_string(log_text), conn)
     with conn.cursor() as cursor:
         cursor.execute(f"INSERT INTO transactions (`transaction`, fileset) VALUES ({transaction}, {fileset_last})")
@@ -198,7 +198,7 @@ def calc_megakey(fileset):
     key_string = key_string.strip(':')
     return hashlib.md5(key_string.encode()).hexdigest()
 
-def db_insert(data_arr):
+def db_insert(data_arr, username=None):
     header = data_arr[0]
     game_data = data_arr[1]
     resources = data_arr[2]
@@ -231,7 +231,7 @@ def db_insert(data_arr):
     category_text = f"Uploaded from {src}"
     log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}. Transaction: {transaction_id}"
 
-    user = f'cli:{getpass.getuser()}'
+    user = f'cli:{getpass.getuser()}' if username is None else username
     create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
     for fileset in game_data:
@@ -253,7 +253,7 @@ def db_insert(data_arr):
         megakey = calc_megakey(fileset) if detection else ""
         log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}."
 
-        if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn):
+        if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=username):
             for file in fileset["rom"]:
                 insert_file(file, detection, src, conn)
                 for key, value in file.items():
@@ -272,7 +272,7 @@ def db_insert(data_arr):
     except Exception as e:
         print("Inserting failed:", e)
     else:
-        user = f'cli:{getpass.getuser()}'
+        user = f'cli:{getpass.getuser()}' if username is None else username
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
 def compare_filesets(id1, id2, conn):


Commit: e3ba290aa3b5ba8b676db0cfc5df23a4496166ce
    https://github.com/scummvm/scummvm-sites/commit/e3ba290aa3b5ba8b676db0cfc5df23a4496166ce
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-24T20:48:31+08:00

Commit Message:
INTEGRITY: Add more info at select page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index d5ed21e..84a1d71 100644
--- a/fileset.py
+++ b/fileset.py
@@ -136,10 +136,10 @@ def fileset():
                 if k != 'widetable':
                     html += f"<input type='hidden' name='{k}' value='{v}'>"
             if widetable == 'true':
-                html += "<input class='hidden' type='text' name='widetable' value='true' />"
+                html += "<input class='hidden' type='text' name='widetable' value='false' />"
                 html += "<input type='submit' value='Hide extra checksums' />"
             else:
-                html += "<input class='hidden' type='text' name='widetable' value='false' />"
+                html += "<input class='hidden' type='text' name='widetable' value='true' />"
                 html += "<input type='submit' value='Expand Table' />"
             html += "</form>"
 
@@ -250,13 +250,12 @@ def merge_fileset(id):
                     g.name AS game_name, 
                     g.engine AS game_engine, 
                     g.platform AS game_platform,
-                    g.language AS game_language
+                    g.language AS game_language,
+                    g.extra AS extra
                 FROM 
                     fileset fs
                 LEFT JOIN 
                     game g ON fs.game = g.id
-                LEFT JOIN 
-                    file f ON fs.id = f.fileset
                 WHERE g.name LIKE '%{search_query}%' OR g.platform LIKE '%{search_query}%' OR g.language LIKE '%{search_query}%'
                 """
                 cursor.execute(query)
@@ -275,7 +274,7 @@ def merge_fileset(id):
                     <input type="submit" value="Search">
                 </form>
                 <table>
-                <tr><th>ID</th><th>Game Name</th><th>Platform</th><th>Language</th><th>Action</th></tr>
+                <tr><th>ID</th><th>Game Name</th><th>Platform</th><th>Language</th><th>Extra</th><th>Action</th></tr>
                 """
                 for result in results:
                     html += f"""
@@ -284,6 +283,7 @@ def merge_fileset(id):
                         <td>{result['game_name']}</td>
                         <td>{result['game_platform']}</td>
                         <td>{result['game_language']}</td>
+                        <td>{result['extra']}</td>
                         <td><a href="/fileset/{id}/merge/confirm?target_id={result['id']}">Select</a></td>
                     </tr>
                     """
@@ -335,7 +335,7 @@ def confirm_merge(id):
                     g.name AS game_name, 
                     g.engine AS game_engine, 
                     g.platform AS game_platform,
-                    g.language AS game_language
+                    g.language AS game_language,
                     (SELECT COUNT(*) FROM file WHERE fileset = fs.id) AS file_count
                 FROM 
                     fileset fs
@@ -352,7 +352,7 @@ def confirm_merge(id):
                     g.name AS game_name, 
                     g.engine AS game_engine, 
                     g.platform AS game_platform,
-                    g.language AS game_language
+                    g.language AS game_language,
                     (SELECT COUNT(*) FROM file WHERE fileset = fs.id) AS file_count
                 FROM 
                     fileset fs


Commit: 8d93163fb7fc29eca1f272c954a5c54814b30e42
    https://github.com/scummvm/scummvm-sites/commit/8d93163fb7fc29eca1f272c954a5c54814b30e42
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-24T20:49:24+08:00

Commit Message:
INTEGRITY: Remove redundant bar

Changed paths:
    pagination.py


diff --git a/pagination.py b/pagination.py
index 62cf06a..c6b38d2 100644
--- a/pagination.py
+++ b/pagination.py
@@ -152,27 +152,6 @@ def create_page(filename, results_per_page, records_table, select_query, order,
                         # Filter textbox
                         filter_value = request.args.get(key, "")
 
-                        html += f"<td class='filter'><input type='text' class='filter' placeholder='{key}' name='{key}' value='{filter_value}'/></td>\n"
-                    html += "</tr>"
-                    html += "<tr class='filter'><td></td><td class='filter'><input type='submit' value='Submit'></td></tr>"
-
-                html += "<th></th>\n"  # Numbering column
-                for key in row.keys():
-                    if key == 'fileset':
-                        continue
-
-                    # Preserve GET variables
-                    vars = "&".join([f"{k}={v}" for k, v in request.args.items() if k != 'sort'])
-                    if request.args.get('sort', '') == key:
-                        vars += f"&sort={key}-desc"
-                    else:
-                        vars += f"&sort={key}"
-
-                    if f"&sort={key}" not in vars:
-                        html += f"<th><a href='{filename}?{vars}&sort={key}'>{key}</th>\n"
-                    else:
-                        html += f"<th><a href='{filename}?{vars}'>{key}</th>\n"
-
             if filename in ['games_list', 'user_games_list']:
                 html += f"<tr class='games_list' onclick='hyperlink(\"fileset?id={row['fileset']}\")'>\n"
             else:


Commit: b36e57aaea88c1005f4fd7c9d0384849554c737a
    https://github.com/scummvm/scummvm-sites/commit/b36e57aaea88c1005f4fd7c9d0384849554c737a
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-25T20:06:16+08:00

Commit Message:
INTEGRITY: Fix the parser of scan dat

Changed paths:
    dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
index bda6cec..f7add93 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -14,7 +14,7 @@ def remove_quotes(string):
 def map_checksum_data(content_string):
     arr = []
     
-    rom_props = re.findall(r'(\w+)\s+"([^"]*)"\s+size\s+(\d+)\s+md5-5000\s+([a-f0-9]+)', content_string)
+    rom_props = re.findall(r'(\w+)\s+"([^"]*)"\s+size\s+(\d+)((?:\s+md5(?:-\w+)?(?:-\w+)?\s+[a-f0-9]+)*)', content_string)
 
     for prop in rom_props:
         key, name, size, md5 = prop


Commit: 92d3dfff1cec3633e3ccd877f15c158b8c0f09d7
    https://github.com/scummvm/scummvm-sites/commit/92d3dfff1cec3633e3ccd877f15c158b8c0f09d7
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-25T20:07:44+08:00

Commit Message:
INTEGRITY: Fix the parser of scan dat

Changed paths:
    dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
index f7add93..c518f94 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -17,8 +17,13 @@ def map_checksum_data(content_string):
     rom_props = re.findall(r'(\w+)\s+"([^"]*)"\s+size\s+(\d+)((?:\s+md5(?:-\w+)?(?:-\w+)?\s+[a-f0-9]+)*)', content_string)
 
     for prop in rom_props:
-        key, name, size, md5 = prop
-        item = {'name': name, 'size': int(size), 'md5-5000': md5}
+        key, name, size, md5s_str = prop
+        item = {'name': name, 'size': int(size)}
+
+        md5s = re.findall(r'(md5(?:-\w+)?(?:-\w+)?)\s+([a-f0-9]+)', md5s_str)
+        for md5_key, md5_value in md5s:
+            item[md5_key] = md5_value
+        
         arr.append(item)
 
     return arr
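
A runnable check of the two regexes from this commit against an illustrative scan line; the outer pattern captures the whole run of md5 variants as one group, and the inner findall splits that run into key/value pairs:

import re

content_string = 'rom "data/game.dat" size 2048 md5 aa11 md5-5000 bb22 md5-t-5000 cc33'
rom_props = re.findall(
    r'(\w+)\s+"([^"]*)"\s+size\s+(\d+)((?:\s+md5(?:-\w+)?(?:-\w+)?\s+[a-f0-9]+)*)',
    content_string)

for key, name, size, md5s_str in rom_props:
    item = {'name': name, 'size': int(size)}
    for md5_key, md5_value in re.findall(r'(md5(?:-\w+)?(?:-\w+)?)\s+([a-f0-9]+)', md5s_str):
        item[md5_key] = md5_value
    print(item)
# {'name': 'data/game.dat', 'size': 2048, 'md5': 'aa11',
#  'md5-5000': 'bb22', 'md5-t-5000': 'cc33'}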


Commit: e3a7044a3fe3532386e1e0a43ed9d16e15e72750
    https://github.com/scummvm/scummvm-sites/commit/e3a7044a3fe3532386e1e0a43ed9d16e15e72750
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-25T20:09:01+08:00

Commit Message:
INTEGRITY: Handle the dups of scan

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 5f921a5..a0c9ba1 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -76,10 +76,16 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         game = "@game_last"
 
     # Check if key/megakey already exists, if so, skip insertion (no quotes on purpose)
-    with conn.cursor() as cursor:
-        cursor.execute(f"SELECT id FROM fileset WHERE megakey = {megakey}")
+    if detection:
+        with conn.cursor() as cursor:
+            cursor.execute(f"SELECT id FROM fileset WHERE megakey = {megakey}")
+
+            existing_entry = cursor.fetchone()
+    else:
+        with conn.cursor() as cursor:
+            cursor.execute(f"SELECT id FROM fileset WHERE `key` = {key}")
 
-        existing_entry = cursor.fetchone()
+            existing_entry = cursor.fetchone()
 
     if existing_entry is not None:
         existing_entry = existing_entry['id']
@@ -249,7 +255,7 @@ def db_insert(data_arr, username=None):
             if 'romof' in fileset and fileset['romof'] in resources:
                 fileset["rom"] = fileset["rom"] + resources[fileset["romof"]]["rom"]
 
-        key = calc_key(fileset) if detection else ""
+        key = calc_key(fileset) if not detection else ""
         megakey = calc_megakey(fileset) if detection else ""
         log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}."
 


Commit: afdbbe867ac30f66dcb8f87a7cfb1af576b9218c
    https://github.com/scummvm/scummvm-sites/commit/afdbbe867ac30f66dcb8f87a7cfb1af576b9218c
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-25T20:10:31+08:00

Commit Message:
INTEGRITY: Fix the missing bar of widetable

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 84a1d71..d37d4d6 100644
--- a/fileset.py
+++ b/fileset.py
@@ -149,6 +149,9 @@ def fileset():
             cursor.execute(f"SELECT file.id, name, size, checksum, detection FROM file WHERE fileset = {id}")
             result = cursor.fetchall()
 
+            all_columns = list(result[0].keys())
+            temp_set = set()
+
             if widetable == 'true':
                 file_ids = [file['id'] for file in result]
                 cursor.execute(f"SELECT file, checksum, checksize, checktype FROM filechecksum WHERE file IN ({','.join(map(str, file_ids))})")
@@ -161,22 +164,29 @@ def fileset():
                         if checksum['file'] not in checksum_dict:
                             checksum_dict[checksum['file']] = {}
                         checksum_dict[checksum['file']][key] = checksum['checksum']
+                        temp_set.add(key)
 
                 for index, file in enumerate(result):
                     if file['id'] in checksum_dict:
                         result[index].update(checksum_dict[file['id']])
 
+            all_columns.extend(list(temp_set))
             counter = 1
+            # Generate table header
+            html += "<tr>\n"
+            html += "<th/>"  # Numbering column
+            for column in all_columns:
+                if column != 'id':
+                    html += f"<th>{column}</th>\n"
+            html += "</tr>\n"
+
+            # Generate table rows
             for row in result:
-                if counter == 1:
-                    html += "<th/>\n" # Numbering column
-                    for key in row.keys():
-                        if key != 'id':
-                            html += f"<th>{key}</th>\n"
                 html += "<tr>\n"
                 html += f"<td>{counter}.</td>\n"
-                for key, value in row.items():
-                    if key != 'id':
+                for column in all_columns:
+                    if column != 'id':
+                        value = row.get(column, '')
                         html += f"<td>{value}</td>\n"
                 html += "</tr>\n"
                 counter += 1
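
Condensed sketch of the header fix (generalized slightly: here the extra columns are unioned from the rows themselves): the table header is now built from the union of all per-row keys, and every row is rendered against that fixed column list, with missing cells defaulting to empty.

rows = [
    {'id': 1, 'name': 'game.exe'},
    {'id': 2, 'name': 'data.dat', 'md5-5000': 'aa'},  # extra checksum column
]
all_columns = list(rows[0].keys()) if rows else []
temp_set = set()
for row in rows:
    temp_set.update(k for k in row if k not in all_columns)
all_columns.extend(temp_set)  # ['id', 'name', 'md5-5000']

for row in rows:
    print([row.get(column, '') for column in all_columns if column != 'id'])
# ['game.exe', ''] then ['data.dat', 'aa'] -- every row aligns to the full header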


Commit: 9c8fced59146522f0a332973587daa718e0194c6
    https://github.com/scummvm/scummvm-sites/commit/9c8fced59146522f0a332973587daa718e0194c6
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-25T20:11:42+08:00

Commit Message:
INTEGRITY: Manual merge into full fileset

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index d37d4d6..cc39e30 100644
--- a/fileset.py
+++ b/fileset.py
@@ -402,7 +402,7 @@ def confirm_merge(id):
                 source_value = str(source_fileset[column])
                 target_value = str(target_fileset[column])
                 if column == 'id':
-                    html += f"<tr><td>{column}</td><td><a href='/fileset?id={source_value}'>{source_value}</a></td><td><a href='/fileset?id={source_value}'>{target_value}</a></td></tr>"
+                    html += f"<tr><td>{column}</td><td><a href='/fileset?id={source_value}'>{source_value}</a></td><td><a href='/fileset?id={target_value}'>{target_value}</a></td></tr>"
                     continue
                 if source_value != target_value:
                     source_highlighted, target_highlighted = highlight_differences(source_value, target_value)
@@ -450,45 +450,98 @@ def execute_merge(id):
             cursor.execute(f"SELECT * FROM fileset WHERE id = {source_id}")
             source_fileset = cursor.fetchone()
 
-            cursor.execute(f"""
-            UPDATE fileset SET
-                game = '{source_fileset['game']}',
-                status = '{source_fileset['status']}',
-                `key` = '{source_fileset['key']}',
-                megakey = '{source_fileset['megakey']}',
-                `timestamp` = '{source_fileset['timestamp']}'
-            WHERE id = {target_id}
-            """)
+            if source_fileset['src'] == 'detection':
+                cursor.execute(f"""
+                UPDATE fileset SET
+                    game = '{source_fileset['game']}',
+                    status = '{source_fileset['status']}',
+                    `key` = '{source_fileset['key']}',
+                    megakey = '{source_fileset['megakey']}',
+                    `timestamp` = '{source_fileset['timestamp']}'
+                WHERE id = {target_id}
+                """)
                 
-            cursor.execute(f"DELETE FROM file WHERE fileset = {target_id}")
+                cursor.execute(f"DELETE FROM file WHERE fileset = {target_id}")
 
-            cursor.execute(f"SELECT * FROM file WHERE fileset = {source_id}")
-            source_files = cursor.fetchall()
+                cursor.execute(f"SELECT * FROM file WHERE fileset = {source_id}")
+                source_files = cursor.fetchall()
 
-            for file in source_files:
-                cursor.execute(f"""
-                INSERT INTO file (name, size, checksum, fileset, detection)
-                VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['checksum']}', {target_id}, {file['detection']})
-                """)
+                for file in source_files:
+                    cursor.execute(f"""
+                    INSERT INTO file (name, size, checksum, fileset, detection)
+                    VALUES ('{escape_string(file['name']).lower()}', '{file['size']}', '{file['checksum']}', {target_id}, {file['detection']})
+                    """)
 
-                cursor.execute("SELECT LAST_INSERT_ID() as file_id")
-                new_file_id = cursor.fetchone()['file_id']
-                
-                cursor.execute(f"SELECT * FROM filechecksum WHERE file = {file['id']}")
-                file_checksums = cursor.fetchall()
+                    cursor.execute("SELECT LAST_INSERT_ID() as file_id")
+                    new_file_id = cursor.fetchone()['file_id']
+                    
+                    cursor.execute(f"SELECT * FROM filechecksum WHERE file = {file['id']}")
+                    file_checksums = cursor.fetchall()
 
-                for checksum in file_checksums:
+                    for checksum in file_checksums:
+                        cursor.execute(f"""
+                        INSERT INTO filechecksum (file, checksize, checktype, checksum)
+                        VALUES ({new_file_id}, '{checksum['checksize']}', '{checksum['checktype']}', '{checksum['checksum']}')
+                        """)
+
+            elif source_fileset['src'] == 'scan':
+                cursor.execute(f"SELECT * FROM file WHERE fileset = {source_id}")
+                source_files = cursor.fetchall()
+
+                for file in source_files:
+                    filename = escape_string(file['name']).lower()
                     cursor.execute(f"""
-                    INSERT INTO filechecksum (file, checksize, checktype, checksum)
-                    VALUES ({new_file_id}, '{checksum['checksize']}', '{checksum['checktype']}', '{checksum['checksum']}')
+                    SELECT file.id, file.detection
+                    FROM file
+                    JOIN filechecksum ON file.id = filechecksum.file
+                    WHERE filechecksum.checksum = '{file['checksum']}' AND file.fileset = {target_id}
                     """)
+                    existing_file = cursor.fetchone()
+
+                    if existing_file:
+                        cursor.execute(f"""
+                        UPDATE file SET
+                            name = '{filename}',
+                            size = '{file['size']}'
+                        WHERE id = {existing_file['id']}
+                        """)
+
+                        cursor.execute(f"SELECT * FROM filechecksum WHERE file = {file['id']}")
+                        file_checksums = cursor.fetchall()
+
+                        for checksum in file_checksums:
+                            cursor.execute(f"""
+                            INSERT INTO filechecksum (file, checksize, checktype, checksum)
+                            VALUES ({existing_file['id']}, '{checksum['checksize']}', '{checksum['checktype']}', '{checksum['checksum']}')
+                            ON DUPLICATE KEY UPDATE
+                                checksize = VALUES(checksize),
+                                checktype = VALUES(checktype),
+                                checksum = VALUES(checksum)
+                            """)
+
+                    else:
+                        cursor.execute(f"""
+                        INSERT INTO file (name, size, checksum, fileset, detection)
+                        VALUES ('{filename}', '{file['size']}', '{file['checksum']}', {target_id}, {file['detection']})
+                        """)
+
+                        cursor.execute("SELECT LAST_INSERT_ID() as file_id")
+                        new_file_id = cursor.fetchone()['file_id']
+                        
+                        cursor.execute(f"SELECT * FROM filechecksum WHERE file = {file['id']}")
+                        file_checksums = cursor.fetchall()
+
+                        for checksum in file_checksums:
+                            cursor.execute(f"""
+                            INSERT INTO filechecksum (file, checksize, checktype, checksum)
+                            VALUES ({new_file_id}, '{checksum['checksize']}', '{checksum['checktype']}', '{checksum['checksum']}')
+                            """)
 
             cursor.execute(f"""
-            INSERT INTO history (`timestamp`, fileset, oldfileset, log)
-            VALUES (NOW(), {target_id}, {source_id}, 1)
+            INSERT INTO history (`timestamp`, fileset, oldfileset)
+            VALUES (NOW(), {target_id}, {source_id})
             """)
 
-
             connection.commit()
 
             return redirect(url_for('fileset', id=target_id))
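
Condensed, runnable sketch of the scan branch's matching rule above: a source file updates an existing target file if any of its checksums matches one already in the target fileset, otherwise it is inserted as a new file. The data literals are illustrative.

# checksum -> existing target file row
target_index = {'aa': {'id': 7, 'name': 'game.exe'},
                'bb': {'id': 7, 'name': 'game.exe'}}

source_file = {'name': 'GAME.EXE', 'checksums': ['bb', 'cc']}
match = next((target_index[c] for c in source_file['checksums'] if c in target_index), None)
if match:
    print(f"update file {match['id']} in place")  # name/size refreshed, checksums upserted
else:
    print("insert as a new file in the target fileset")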


Commit: 5a1fb63dc9bcb9b3b4cda16654c9595541e6a2f7
    https://github.com/scummvm/scummvm-sites/commit/5a1fb63dc9bcb9b3b4cda16654c9595541e6a2f7
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-26T19:27:13+08:00

Commit Message:
INTEGRITY: Add check to the topbar of "Files in the fileset"

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index cc39e30..0349865 100644
--- a/fileset.py
+++ b/fileset.py
@@ -149,7 +149,7 @@ def fileset():
             cursor.execute(f"SELECT file.id, name, size, checksum, detection FROM file WHERE fileset = {id}")
             result = cursor.fetchall()
 
-            all_columns = list(result[0].keys())
+            all_columns = list(result[0].keys()) if result else []
             temp_set = set()
 
             if widetable == 'true':


Commit: d599894b86411ea48cbdc43bd282a917df389004
    https://github.com/scummvm/scummvm-sites/commit/d599894b86411ea48cbdc43bd282a917df389004
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-26T19:40:59+08:00

Commit Message:
INTEGRITY: Update more info while manual merging

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 0349865..fdf13b9 100644
--- a/fileset.py
+++ b/fileset.py
@@ -485,6 +485,13 @@ def execute_merge(id):
                         """)
 
             elif source_fileset['src'] == 'scan':
+                cursor.execute(f"""
+                UPDATE fileset SET
+                    status = '{source_fileset['status']}',
+                    `key` = '{source_fileset['key']}',
+                    `timestamp` = '{source_fileset['timestamp']}'
+                WHERE id = {target_id}
+                """)
                 cursor.execute(f"SELECT * FROM file WHERE fileset = {source_id}")
                 source_files = cursor.fetchall()
 


Commit: 5db72f3e21d824607eb7ab27e1940bbbcefe3d03
    https://github.com/scummvm/scummvm-sites/commit/5db72f3e21d824607eb7ab27e1940bbbcefe3d03
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-26T20:12:27+08:00

Commit Message:
INTEGRITY: Remove redundant caption

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index fdf13b9..a7df054 100644
--- a/fileset.py
+++ b/fileset.py
@@ -100,7 +100,6 @@ def fileset():
         </head>
         <body>
         <h2><u>Fileset: {id}</u></h2>
-        <h3>Fileset details</h3>
         <table>
         """
             html += f"<td><button onclick=\"location.href='/fileset/{id}/merge'\">Merge</button></td>"


Commit: 832914e809a7b9bd3cf508703c80fdbd6c3e561d
    https://github.com/scummvm/scummvm-sites/commit/832914e809a7b9bd3cf508703c80fdbd6c3e561d
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-28T18:20:36+08:00

Commit Message:
INTEGRITY: Fix bugs when merging scan into detection

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index a7df054..33a2f02 100644
--- a/fileset.py
+++ b/fileset.py
@@ -428,9 +428,9 @@ def confirm_merge(id):
         connection.close()
 
 @app.route('/fileset/<int:id>/merge/execute', methods=['POST'])
-def execute_merge(id):
-    source_id = request.form['source_id']
-    target_id = request.form['target_id']
+def execute_merge(id, source=None, target=None):
+    source_id = request.form['source_id'] if not source else source
+    target_id = request.form['target_id'] if not target else target
 
     with open('mysql_config.json') as f:
         mysql_cred = json.load(f)
@@ -448,6 +448,8 @@ def execute_merge(id):
         with connection.cursor() as cursor:
             cursor.execute(f"SELECT * FROM fileset WHERE id = {source_id}")
             source_fileset = cursor.fetchone()
+            cursor.execute(f"SELECT * FROM fileset WHERE id = {target_id}")
+            target_fileset = cursor.fetchone()
 
             if source_fileset['src'] == 'detection':
                 cursor.execute(f"""
@@ -493,55 +495,39 @@ def execute_merge(id):
                 """)
                 cursor.execute(f"SELECT * FROM file WHERE fileset = {source_id}")
                 source_files = cursor.fetchall()
-
-                for file in source_files:
-                    filename = escape_string(file['name']).lower()
-                    cursor.execute(f"""
-                    SELECT file.id, file.detection
-                    FROM file
-                    JOIN filechecksum ON file.id = filechecksum.file
-                    WHERE filechecksum.checksum = '{file['checksum']}' AND file.fileset = {target_id}
-                    """)
-                    existing_file = cursor.fetchone()
-
-                    if existing_file:
-                        cursor.execute(f"""
-                        UPDATE file SET
-                            name = '{filename}',
-                            size = '{file['size']}'
-                        WHERE id = {existing_file['id']}
-                        """)
-
-                        cursor.execute(f"SELECT * FROM filechecksum WHERE file = {file['id']}")
-                        file_checksums = cursor.fetchall()
-
-                        for checksum in file_checksums:
-                            cursor.execute(f"""
-                            INSERT INTO filechecksum (file, checksize, checktype, checksum)
-                            VALUES ({existing_file['id']}, '{checksum['checksize']}', '{checksum['checktype']}', '{checksum['checksum']}')
-                            ON DUPLICATE KEY UPDATE
-                                checksize = VALUES(checksize),
-                                checktype = VALUES(checktype),
-                                checksum = VALUES(checksum)
-                            """)
-
-                    else:
-                        cursor.execute(f"""
-                        INSERT INTO file (name, size, checksum, fileset, detection)
-                        VALUES ('{filename}', '{file['size']}', '{file['checksum']}', {target_id}, {file['detection']})
-                        """)
-
-                        cursor.execute("SELECT LAST_INSERT_ID() as file_id")
-                        new_file_id = cursor.fetchone()['file_id']
-                        
-                        cursor.execute(f"SELECT * FROM filechecksum WHERE file = {file['id']}")
-                        file_checksums = cursor.fetchall()
-
-                        for checksum in file_checksums:
-                            cursor.execute(f"""
-                            INSERT INTO filechecksum (file, checksize, checktype, checksum)
-                            VALUES ({new_file_id}, '{checksum['checksize']}', '{checksum['checktype']}', '{checksum['checksum']}')
-                            """)
+                
+                cursor.execute(f"SELECT * FROM file WHERE fileset = {target_id}")
+                target_files = cursor.fetchall()
+
+                target_files_dict = {}
+                for target_file in target_files:
+                    cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
+                    target_checksums = cursor.fetchall()
+                    for checksum in target_checksums:
+                        target_files_dict[checksum['checksum']] = target_file
+                
+                for source_file in source_files:
+                    cursor.execute(f"SELECT * FROM filechecksum WHERE file = {source_file['id']}")
+                    source_checksums = cursor.fetchall()
+                    file_exists = False
+                    for checksum in source_checksums:
+                        print(checksum['checksum'])
+                        if checksum['checksum'] in target_files_dict.keys():
+                            target_file = target_files_dict[checksum['checksum']]
+                            source_file['detection'] = target_file['detection']
+
+                            cursor.execute(f"DELETE FROM file WHERE id = {target_file['id']}")
+                            file_exists = True
+                            break
+                    print(file_exists)
+                    cursor.execute("INSERT INTO file (name, size, checksum, fileset, detection) VALUES (%s, %s, %s, %s, %s)",
+                                   (source_file['name'], source_file['size'], source_file['checksum'], target_id, source_file['detection']))
+                    new_file_id = cursor.lastrowid
+                    for checksum in source_checksums:
+                        # TODO: Handle the string
+
+                        cursor.execute("INSERT INTO filechecksum (file, checksize, checktype, checksum) VALUES (%s, %s, %s, %s)",
+                                    (new_file_id, checksum['checksize'], f"{checksum['checktype']}-{checksum['checksize']}", checksum['checksum']))
 
             cursor.execute(f"""
             INSERT INTO history (`timestamp`, fileset, oldfileset)
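
The rewrite above swaps the per-file JOIN lookup for an in-memory index: every checksum of every target file goes into a dict, so each source file is matched by a plain dictionary lookup instead of a query. A standalone sketch of that idea (row shapes assumed from the cursor results; helper names hypothetical):

    def build_checksum_index(target_files, checksums_for):
        # Map every known checksum to the target file that owns it
        index = {}
        for tf in target_files:
            for c in checksums_for(tf['id']):
                index[c['checksum']] = tf
        return index

    def find_duplicate(source_checksums, index):
        # Return the target file sharing any checksum, or None
        for c in source_checksums:
            if c['checksum'] in index:
                return index[c['checksum']]
        return None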


Commit: 3f7fb0e688802815ce45dd4a9e143eef8e5447f0
    https://github.com/scummvm/scummvm-sites/commit/3f7fb0e688802815ce45dd4a9e143eef8e5447f0
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-28T19:49:58+08:00

Commit Message:
INTEGRITY: Start the automatic merge for the scan (unfinished)

Changed paths:
    dat_parser.py
    db_functions.py


diff --git a/dat_parser.py b/dat_parser.py
index c518f94..9eef11a 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -1,7 +1,7 @@
 import re
 import os
 import sys
-from db_functions import db_insert, populate_matching_games
+from db_functions import db_insert, populate_matching_games, match_fileset
 import argparse
 
 def remove_quotes(string):
@@ -113,7 +113,7 @@ def parse_dat(dat_filepath):
 def main():
     parser = argparse.ArgumentParser(description="Process DAT files and interact with the database.")
     parser.add_argument('--upload', nargs='+', help='Upload DAT file(s) to the database')
-    parser.add_argument('--match', action='store_true', help='Populate matching games in the database')
+    parser.add_argument('--match', nargs='+', help='Populate matching games in the database')
     parser.add_argument('--user', help='Username for database')
     parser.add_argument('-r', help="Recurse through directories", action='store_true')
 
@@ -124,7 +124,8 @@ def main():
             db_insert(parse_dat(filepath), args.user)
 
     if args.match:
-        populate_matching_games()
+        for filepath in args.match:
+            match_fileset(parse_dat(filepath), args.user)
 
 if __name__ == "__main__":
     main()
\ No newline at end of file
diff --git a/db_functions.py b/db_functions.py
index a0c9ba1..17cff5c 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -458,4 +458,65 @@ def populate_matching_games():
         try:
             conn.commit()
         except:
-            print("Updating matched games failed")
\ No newline at end of file
+            print("Updating matched games failed")
+            
+def match_fileset(data_arr, username=None):
+    header = data_arr[0]
+    game_data = data_arr[1]
+    resources = data_arr[2]
+    filepath = data_arr[3]
+
+    try:
+        conn = db_connect()
+    except Exception as e:
+        print(f"Failed to connect to database: {e}")
+        return
+
+    try:
+        author = header["author"]
+        version = header["version"]
+    except KeyError as e:
+        print(f"Missing key in header: {e}")
+        return
+    
+    src = "dat" if author not in ["scan", "scummvm"] else author
+    detection = (src == "scummvm")
+    status = "detection" if detection else src
+    user = f'cli:{getpass.getuser()}' if username is None else username
+    
+    for fileset in game_data:
+        if detection:
+            engine_name = fileset["engine"]
+            engineid = fileset["sourcefile"]
+            gameid = fileset["name"]
+            title = fileset["title"]
+            extra = fileset["extra"]
+            platform = fileset["platform"]
+            lang = fileset["language"]
+
+            insert_game(engine_name, engineid, title, gameid, extra, platform, lang, conn)
+        elif src == "dat":
+            if 'romof' in fileset and fileset['romof'] in resources:
+                fileset["rom"] = fileset["rom"] + resources[fileset["romof"]]["rom"]
+
+        key = calc_key(fileset) if not detection else ""
+        megakey = calc_megakey(fileset) if detection else ""
+        log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}."
+
+        for file in fileset["rom"]:
+            for key, value in file.items():
+                if key not in ["name", "size"]:
+                    md5type = key
+                    checksum = value
+                    query = f"""SELECT DISTINCT fs.id AS fileset_id
+                                FROM fileset fs
+                                JOIN file f ON fs.id = f.fileset
+                                JOIN filechecksum fc ON f.id = fc.file
+                                WHERE fc.checksum = '{checksum}' AND fc.checktype = '{md5type}'
+                                AND fs.status IN ('detection', 'dat', 'scan', 'partialmatch', 'fullmatch')"""
+
+                    with conn.cursor() as cursor:
+                        cursor.execute(query)
+                        records = cursor.fetchall()
+                    # TODO: Implement the rest of the function
+                        
\ No newline at end of file
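
With --match now taking file paths rather than acting as a bare flag, the two entry points are driven like this (paths and username illustrative):

    python dat_parser.py --upload scummvm.dat --user admin
    python dat_parser.py --match scan.dat --user admin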


Commit: 2a15a6d03dfd0f47b205dfa7f3ad4ce07c4c0fb0
    https://github.com/scummvm/scummvm-sites/commit/2a15a6d03dfd0f47b205dfa7f3ad4ce07c4c0fb0
Author: InariInDream (inariindream at 163.com)
Date: 2024-06-29T20:53:18+08:00

Commit Message:
INTEGRITY: Handle file dups during automatic merging

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 17cff5c..849616e 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -6,6 +6,7 @@ import time
 import hashlib
 import os
 from pymysql.converters import escape_string
+from collections import defaultdict
 
 def db_connect():
     with open('mysql_config.json') as f:
@@ -503,20 +504,106 @@ def match_fileset(data_arr, username=None):
         megakey = calc_megakey(fileset) if detection else ""
         log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}."
 
-        for file in fileset["rom"]:
-            for key, value in file.items():
-                if key not in ["name", "size"]:
-                    md5type = key
-                    checksum = value
-                    query = f"""SELECT DISTINCT fs.id AS fileset_id
-                                FROM fileset fs
-                                JOIN file f ON fs.id = f.fileset
-                                JOIN filechecksum fc ON f.id = fc.file
-                                WHERE fc.checksum = '{checksum}' AND fc.checktype = '{md5type}'
-                                AND fs.status IN ('detection', 'dat', 'scan', 'partialmatch', 'fullmatch')"""
-
-                    with conn.cursor() as cursor:
+        matched_map = defaultdict(int)
+        try:
+            with conn.cursor() as cursor:   
+                for file in fileset["rom"]:
+                    for key, value in file.items():
+                        if key not in ["name", "size"]:
+                            md5type = key
+                            checksum = value
+                            query = f"""SELECT DISTINCT fs.id AS fileset_id
+                                        FROM fileset fs
+                                        JOIN file f ON fs.id = f.fileset
+                                        JOIN filechecksum fc ON f.id = fc.file
+                                        WHERE fc.checksum = '{checksum}' AND fc.checktype = '{md5type}'
+                                        AND fs.status IN ('detection', 'dat', 'scan', 'partialmatch', 'fullmatch')"""
+
+                            
+                            cursor.execute(query)
+                            records = cursor.fetchall()
+                            # print(records)
+                            if records:
+                                for record in records:
+                                    matched_map[record['fileset_id']] += 1
+                
+                matched_list = sorted(matched_map.items(), key=lambda x: x[1], reverse=True)
+                if matched_list:
+                    for matched_fileset_id, matched_count in matched_list:
+                        query = f"SELECT status FROM fileset WHERE id = {matched_fileset_id}"
+
                         cursor.execute(query)
-                        records = cursor.fetchall()
-                    # TODO: Implement the rest of the function
+                        status = cursor.fetchone()['status']
+
+                    if status == 'scan':
+                        query = f"SELECT COUNT(file.id) FROM file WHERE fileset = {matched_fileset_id}" 
+                        cursor.execute(query)
+                        count = cursor.fetchone()['COUNT(file.id)']
+                            
+                        if count == matched_count:
+                            # full match
+                            cursor.execute(f"""
+                                           UPDATE fileset SET 
+                                                status = 'fullmatch', 
+                                                `timestamp` = FROM_UNIXTIME({int(time.time())})
+                                            WHERE id = {matched_fileset_id}""")
+                            cursor.execute(f"SELECT * FROM file WHERE fileset = {matched_fileset_id}")
+                            target_files = cursor.fetchall()
+                            
+                            target_files_dict = {}
+                            for target_file in target_files:
+                                cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
+                                target_checksums = cursor.fetchall()
+                                for checksum in target_checksums:
+                                    target_files_dict[checksum['checksum']] = target_file
+                            for file in fileset['rom']:
+                                file_exists = False
+                                for key, value in file.items():
+                                    print(key, value)
+                                    if key not in ["name", "size"]:
+                                        scan_md5type = key
+                                        scan_checksum = value
+                                        if scan_checksum in target_files_dict.keys():
+                                            file_exists = True
+                                            cursor.execute(f"DELETE FROM file WHERE id = {target_files_dict[scan_checksum]['id']}")
+                                            break
+                                print(file_exists)
+                                cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{scan_checksum}', {matched_fileset_id}, {0})")
+                                # TODO: insert filechecksum
+                                              
+                            # log
+                            category_text = f"Matched from {src}"
+                            log_text = f"Matched game {matched_fileset_id}. State fullmatch."
+                            # create_log(escape_string(category_text), user, escape_string(log_text), conn)
+        finally:
+            conn.close()
+            
+                        
+            # if matched_list[0][1] == len(fileset["rom"]):
+            #     # full match
+            #     matched_fileset_id = matched_list[0][0]
+            #     # replace all th
+            # query = f"SELECT status FROM fileset WHERE id = {matched_fileset_id}"
+            # with conn.cursor() as cursor:
+            #     cursor.execute(query)
+            #     status = cursor.fetchone()['status']
+            # if status == 'detection':
+            #     # check if the fileset is a full match
+            #     query = f"SELECT file.id FROM file WHERE fileset = {matched_fileset_id}"
+            #     cursor.execute(query)
+            #     reusult = cursor.fetchall()
+                
+            #     file_ids = [file['id'] for file in reusult]
+            #     query = f"SELECT file, checksum, checksize, checktype FROM filechecksum WHERE file IN ({','.join(map(str, file_ids))})"
+            #     cursor.execute(query)
+            #     checksums = cursor.fetchall()
+                
+            #     checksum_dict = {}
+                
+            #     for checksum in checksums:
+            #         if checksum['checksize'] != 0:
+            #             key = f"{checksum['checktype']}-{checksum['checksize']}"
+            #             if checksum['file'] not in checksum_dict:
+            #                 checksum_dict[checksum['file']] = {}
+            #             checksum_dict[checksum['file']][key] = checksum['checksum']
                         
\ No newline at end of file
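
The matching pass above tallies, for each candidate fileset, how many of the incoming file checksums hit it, then ranks candidates by hit count. A condensed sketch of just that scoring step (record shape taken from the query in the hunk):

    from collections import defaultdict

    def score_candidates(per_checksum_records):
        # per_checksum_records: one list of {'fileset_id': ...} rows
        # per checksum of the incoming fileset
        matched_map = defaultdict(int)
        for records in per_checksum_records:
            for record in records:
                matched_map[record['fileset_id']] += 1
        # Best-matching fileset first
        return sorted(matched_map.items(), key=lambda x: x[1], reverse=True)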


Commit: 9ab382695f214a3f69320f196f2ace32652de5bf
    https://github.com/scummvm/scummvm-sites/commit/9ab382695f214a3f69320f196f2ace32652de5bf
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-01T19:23:46+08:00

Commit Message:
INTEGRITY: Improve the regex

Changed paths:
    dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
index 9eef11a..c5799a1 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -14,18 +14,27 @@ def remove_quotes(string):
 def map_checksum_data(content_string):
     arr = []
     
-    rom_props = re.findall(r'(\w+)\s+"([^"]*)"\s+size\s+(\d+)((?:\s+md5(?:-\w+)?(?:-\w+)?\s+[a-f0-9]+)*)', content_string)
-
-    for prop in rom_props:
-        key, name, size, md5s_str = prop
-        item = {'name': name, 'size': int(size)}
-
-        md5s = re.findall(r'(md5(?:-\w+)?(?:-\w+)?)\s+([a-f0-9]+)', md5s_str)
-        for md5_key, md5_value in md5s:
-            item[md5_key] = md5_value
-        
-        arr.append(item)
-
+    content_string = content_string.strip().strip('()').strip()
+    
+    tokens = re.split(r'\s+(?=(?:[^"]*"[^"]*")*[^"]*$)', content_string)
+    
+    current_rom = {}
+    i = 0
+    while i < len(tokens):
+        if tokens[i] == 'name':
+            current_rom['name'] = tokens[i + 1].strip('"')
+            i += 2
+        elif tokens[i] == 'size':
+            current_rom['size'] = int(tokens[i + 1])
+            i += 2
+        else:
+            checksum_key = tokens[i]
+            checksum_value = tokens[i + 1]
+            current_rom[checksum_key] = checksum_value
+            i += 2
+    
+    arr.append(current_rom)
+    
     return arr
 
 def map_key_values(content_string, arr):
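
The tokenizer above hinges on a lookahead split that only breaks on whitespace outside double quotes (an even number of quotes must remain ahead of the split point). A quick illustration with a made-up line:

    import re

    line = 'name "Game Title.exe" size 1024 md5-5000 d41d8cd98f00b204e9800998ecf8427e'
    tokens = re.split(r'\s+(?=(?:[^"]*"[^"]*")*[^"]*$)', line)
    # ['name', '"Game Title.exe"', 'size', '1024',
    #  'md5-5000', 'd41d8cd98f00b204e9800998ecf8427e']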


Commit: db34d079d4f65f7f21c7b3cda48c56535609cd7c
    https://github.com/scummvm/scummvm-sites/commit/db34d079d4f65f7f21c7b3cda48c56535609cd7c
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-01T19:24:18+08:00

Commit Message:
INTEGRITY: Add skiplog option to the dat_parser

Changed paths:
    dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
index c5799a1..e63ed7f 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -125,6 +125,7 @@ def main():
     parser.add_argument('--match', nargs='+', help='Populate matching games in the database')
     parser.add_argument('--user', help='Username for database')
     parser.add_argument('-r', help="Recurse through directories", action='store_true')
+    parser.add_argument('--skiplog', help="Skip logging dups", action='store_true')
 
     args = parser.parse_args()
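
The flag is only declared here; assuming it later gets wired into the match path, a run that suppresses duplicate-fileset log entries would look like:

    python dat_parser.py --match scan.dat --user admin --skiplog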
 


Commit: 9b09e2dae5074e68df50822bcbd57c2bbf932fdc
    https://github.com/scummvm/scummvm-sites/commit/9b09e2dae5074e68df50822bcbd57c2bbf932fdc
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-01T19:25:34+08:00

Commit Message:
INTEGRITY: Handle the automatic merge of scan

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 849616e..003df75 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -482,9 +482,16 @@ def match_fileset(data_arr, username=None):
     
     src = "dat" if author not in ["scan", "scummvm"] else author
     detection = (src == "scummvm")
-    status = "detection" if detection else src
+    source_status = "detection" if detection else src
     user = f'cli:{getpass.getuser()}' if username is None else username
-    
+
+    with conn.cursor() as cursor:
+        cursor.execute("SELECT MAX(`transaction`) FROM transactions")
+        transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
+
+    category_text = f"Uploaded from {src}"
+    log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Transaction: {transaction_id}"
+
     for fileset in game_data:
         if detection:
             engine_name = fileset["engine"]
@@ -502,7 +509,7 @@ def match_fileset(data_arr, username=None):
 
         key = calc_key(fileset) if not detection else ""
         megakey = calc_megakey(fileset) if detection else ""
-        log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}."
+        log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}."
 
         matched_map = defaultdict(int)
         try:
@@ -535,75 +542,136 @@ def match_fileset(data_arr, username=None):
                         cursor.execute(query)
                         status = cursor.fetchone()['status']
 
-                    if status == 'scan':
                         query = f"SELECT COUNT(file.id) FROM file WHERE fileset = {matched_fileset_id}" 
                         cursor.execute(query)
                         count = cursor.fetchone()['COUNT(file.id)']
+
+                        # if matched fileset's status is detection
+                        if status == 'detection':
+                                
+                            if count == matched_count:
+                        
+                                # full match
+                                cursor.execute(f"""
+                                            UPDATE fileset SET 
+                                                    status = 'fullmatch', 
+                                                    `timestamp` = FROM_UNIXTIME({int(time.time())})
+                                                WHERE id = {matched_fileset_id}""")
+                                cursor.execute(f"SELECT * FROM file WHERE fileset = {matched_fileset_id}")
+                                target_files = cursor.fetchall()
+                                
+                                target_files_dict = {}
+                                for target_file in target_files:
+                                    cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
+                                    target_checksums = cursor.fetchall()
+                                    for checksum in target_checksums:
+                                        target_files_dict[checksum['checksum']] = target_file
+                                for file in fileset['rom']:
+                                    file_exists = False
+                                    for key, value in file.items():
+                                        print(key, value)
+                                        if key not in ["name", "size"]:
+                                            scan_md5type = key
+                                            scan_checksum = value
+                                            if scan_checksum in target_files_dict.keys():
+                                                file_exists = True
+                                                cursor.execute(f"DELETE FROM file WHERE id = {target_files_dict[scan_checksum]['id']}")
+                                                break
+                                    print(file_exists)
+                                    cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{scan_checksum}', {matched_fileset_id}, {0})")
+                                    # TODO: insert filechecksum
+                                                
+                                # log
+                                category_text = f"Matched from {src}"
+                                log_text = f"Matched game {matched_fileset_id}. State fullmatch."
+                                create_log(escape_string(category_text), user, escape_string(log_text), conn)
                             
-                        if count == matched_count:
-                            # full match
-                            cursor.execute(f"""
-                                           UPDATE fileset SET 
-                                                status = 'fullmatch', 
-                                                `timestamp` = FROM_UNIXTIME({int(time.time())})
-                                            WHERE id = {matched_fileset_id}""")
-                            cursor.execute(f"SELECT * FROM file WHERE fileset = {matched_fileset_id}")
-                            target_files = cursor.fetchall()
+                            else:
+                                # not a full match
+                                if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=username):
+                                    for file in fileset["rom"]:
+                                        insert_file(file, detection, src, conn)
+                                        for key, value in file.items():
+                                            if key not in ["name", "size"]:
+                                                insert_filechecksum(file, key, conn)
+                        elif status == 'full':
+                            # if it's a dup
+                            if len(fileset['rom']) == count:
+                                # TODO: log the dup msg
+                                return
+                        
+                        elif status == 'partial':
+                            # same as 'detection'
+                            if count == matched_count:
+                        
+                                # full match
+                                cursor.execute(f"""
+                                            UPDATE fileset SET 
+                                                    status = 'fullmatch', 
+                                                    `timestamp` = FROM_UNIXTIME({int(time.time())})
+                                                WHERE id = {matched_fileset_id}""")
+                                cursor.execute(f"SELECT * FROM file WHERE fileset = {matched_fileset_id}")
+                                target_files = cursor.fetchall()
+                                
+                                target_files_dict = {}
+                                for target_file in target_files:
+                                    cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
+                                    target_checksums = cursor.fetchall()
+                                    for checksum in target_checksums:
+                                        target_files_dict[checksum['checksum']] = target_file
+                                for file in fileset['rom']:
+                                    file_exists = False
+                                    for key, value in file.items():
+                                        print(key, value)
+                                        if key not in ["name", "size"]:
+                                            scan_md5type = key
+                                            scan_checksum = value
+                                            if scan_checksum in target_files_dict.keys():
+                                                file_exists = True
+                                                cursor.execute(f"DELETE FROM file WHERE id = {target_files_dict[scan_checksum]['id']}")
+                                                break
+                                    print(file_exists)
+                                    cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{scan_checksum}', {matched_fileset_id}, {0})")
+                                    # TODO: insert filechecksum
+                                                
+                                # log
+                                category_text = f"Matched from {src}"
+                                log_text = f"Matched game {matched_fileset_id}. State fullmatch."
+                                create_log(escape_string(category_text), user, escape_string(log_text), conn)
                             
-                            target_files_dict = {}
-                            for target_file in target_files:
-                                cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
-                                target_checksums = cursor.fetchall()
-                                for checksum in target_checksums:
-                                    target_files_dict[checksum['checksum']] = target_file
-                            for file in fileset['rom']:
-                                file_exists = False
-                                for key, value in file.items():
-                                    print(key, value)
-                                    if key not in ["name", "size"]:
-                                        scan_md5type = key
-                                        scan_checksum = value
-                                        if scan_checksum in target_files_dict.keys():
-                                            file_exists = True
-                                            cursor.execute(f"DELETE FROM file WHERE id = {target_files_dict[scan_checksum]['id']}")
-                                            break
-                                print(file_exists)
-                                cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{scan_checksum}', {matched_fileset_id}, {0})")
-                                # TODO: insert filechecksum
-                                              
-                            # log
-                            category_text = f"Matched from {src}"
-                            log_text = f"Matched game {matched_fileset_id}. State fullmatch."
-                            # create_log(escape_string(category_text), user, escape_string(log_text), conn)
+                            else:
+                                # not a full match
+                                if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=username):
+                                    for file in fileset["rom"]:
+                                        insert_file(file, detection, src, conn)
+                                        for key, value in file.items():
+                                            if key not in ["name", "size"]:
+                                                insert_filechecksum(file, key, conn)
+
+
+
+                else:
+                    # no match
+                    if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=username):
+                        for file in fileset["rom"]:
+                            insert_file(file, detection, src, conn)
+                            for key, value in file.items():
+                                if key not in ["name", "size"]:
+                                    insert_filechecksum(file, key, conn)
+            if detection:
+                conn.cursor().execute("UPDATE fileset SET status = 'obsolete' WHERE `timestamp` != FROM_UNIXTIME(@fileset_time_last) AND status = 'detection'")
+            cur = conn.cursor()
+            
+            try:
+                cur.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
+                fileset_insertion_count = cur.fetchone()['COUNT(fileset)']
+                category_text = f"Uploaded from {src}"
+                log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
+            except Exception as e:
+                print("Inserting failed:", e)
+            else:
+                user = f'cli:{getpass.getuser()}' if username is None else username
+                create_log(escape_string(category_text), user, escape_string(log_text), conn)            
         finally:
             conn.close()
-            
-                        
-            # if matched_list[0][1] == len(fileset["rom"]):
-            #     # full match
-            #     matched_fileset_id = matched_list[0][0]
-            #     # replace all th
-            # query = f"SELECT status FROM fileset WHERE id = {matched_fileset_id}"
-            # with conn.cursor() as cursor:
-            #     cursor.execute(query)
-            #     status = cursor.fetchone()['status']
-            # if status == 'detection':
-            #     # check if the fileset is a full match
-            #     query = f"SELECT file.id FROM file WHERE fileset = {matched_fileset_id}"
-            #     cursor.execute(query)
-            #     reusult = cursor.fetchall()
-                
-            #     file_ids = [file['id'] for file in reusult]
-            #     query = f"SELECT file, checksum, checksize, checktype FROM filechecksum WHERE file IN ({','.join(map(str, file_ids))})"
-            #     cursor.execute(query)
-            #     checksums = cursor.fetchall()
-                
-            #     checksum_dict = {}
-                
-            #     for checksum in checksums:
-            #         if checksum['checksize'] != 0:
-            #             key = f"{checksum['checktype']}-{checksum['checksize']}"
-            #             if checksum['file'] not in checksum_dict:
-            #                 checksum_dict[checksum['file']] = {}
-            #             checksum_dict[checksum['file']][key] = checksum['checksum']
-                        
\ No newline at end of file
+            
\ No newline at end of file
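
The nested branches above amount to a small decision table over the matched fileset's status and the file counts. A condensed sketch of the intended dispatch (helper names hypothetical; the committed code inlines each action):

    def decide_merge_action(status, count, matched_count, incoming_count):
        # count: files in the matched fileset; matched_count: checksum hits;
        # incoming_count: files in the incoming scan fileset
        if status in ('detection', 'partial'):
            if count == matched_count:
                return 'promote-to-full'    # replace files, mark fileset full
            return 'insert-new-fileset'     # overlap is only partial
        if status == 'full' and incoming_count == count:
            return 'duplicate'              # already fully present; log and stop
        return 'insert-new-fileset'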


Commit: 7e1c4f8ad6dd66ca3d0fac25e68f45fa151cb78d
    https://github.com/scummvm/scummvm-sites/commit/7e1c4f8ad6dd66ca3d0fac25e68f45fa151cb78d
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-02T19:30:42+08:00

Commit Message:
INTEGRITY: Add clear.py for testing

Changed paths:
  A clear.py


diff --git a/clear.py b/clear.py
new file mode 100644
index 0000000..523481e
--- /dev/null
+++ b/clear.py
@@ -0,0 +1,49 @@
+"""
+This script deletes all data from the tables in the database.
+Use it when testing data insertion.
+"""
+
+import pymysql
+import json
+
+def delete_all_data(conn):
+    tables = ["filechecksum", "queue", "history", "transactions", "file", "fileset", "game", "engine", "log"]
+    cursor = conn.cursor()
+    
+    for table in tables:
+        try:
+            cursor.execute(f"DELETE FROM {table}")
+            print(f"Table '{table}' data deleted successfully")
+        except pymysql.Error as err:
+            print(f"Error deleting data from table '{table}': {err}")
+
+if __name__ == "__main__":
+    with open(__file__ + '/../mysql_config.json') as f:
+        mysql_cred = json.load(f)
+
+    servername = mysql_cred["servername"]
+    username = mysql_cred["username"]
+    password = mysql_cred["password"]
+    dbname = mysql_cred["dbname"]
+
+    # Create connection
+    conn = pymysql.connect(
+        host=servername,
+        user=username,
+        password=password,
+        db=dbname,  # Specify the database to use
+        charset='utf8mb4',
+        cursorclass=pymysql.cursors.DictCursor,
+        autocommit=True
+    )
+
+    # Check connection
+    if conn is None:
+        print("Error connecting to MySQL")
+        exit(1)
+
+    # Delete all data from tables
+    delete_all_data(conn)
+
+    # Close connection
+    conn.close()
\ No newline at end of file
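
One pitfall in the new script: __file__ + '/../mysql_config.json' makes clear.py itself a path component, which most filesystems reject (NotADirectoryError). A sketch of the usual os.path form, assuming mysql_config.json sits next to the script:

    import json
    import os

    # Resolve mysql_config.json relative to this script's directory
    config_path = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                               'mysql_config.json')
    with open(config_path) as f:
        mysql_cred = json.load(f)

The DELETE order in delete_all_data also matters: child tables (filechecksum, file) come before their parents (fileset, game, engine), presumably so foreign-key constraints are not violated mid-wipe.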


Commit: 3424b932a8ac7d1df7a4bafbeb8cb98df04c6472
    https://github.com/scummvm/scummvm-sites/commit/3424b932a8ac7d1df7a4bafbeb8cb98df04c6472
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-02T19:31:12+08:00

Commit Message:
INTEGRITY: Handle special cases for dat_parser

Changed paths:
    dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
index e63ed7f..5dd206f 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -29,7 +29,7 @@ def map_checksum_data(content_string):
             i += 2
         else:
             checksum_key = tokens[i]
-            checksum_value = tokens[i + 1]
+            checksum_value = tokens[i + 1] if len(tokens) >= 6 else "0"
             current_rom[checksum_key] = checksum_value
             i += 2
     

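The len(tokens) >= 6 test guards one specific malformed shape; a direct bounds check on the lookahead would cover any truncated entry (a sketch, not the committed logic):

    checksum_key = tokens[i]
    # Fall back to "0" when a key arrives with no value at the end
    checksum_value = tokens[i + 1] if i + 1 < len(tokens) else "0"
    current_rom[checksum_key] = checksum_value
    i += 2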

Commit: 1e2ba0fa57fd0700e908c08990dd55bfa87a1d78
    https://github.com/scummvm/scummvm-sites/commit/1e2ba0fa57fd0700e908c08990dd55bfa87a1d78
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-02T19:32:20+08:00

Commit Message:
INTEGRITY: Fix bugs in auto merging

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 003df75..d080f8a 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -199,8 +199,7 @@ def calc_megakey(fileset):
     key_string = f":{fileset['platform']}:{fileset['language']}"
     for file in fileset['rom']:
         for key, value in file.items():
-            if key != "name":
-                key_string += ':' + str(value)
+            key_string += ':' + str(value)
 
     key_string = key_string.strip(':')
     return hashlib.md5(key_string.encode()).hexdigest()
@@ -233,7 +232,10 @@ def db_insert(data_arr, username=None):
 
     with conn.cursor() as cursor:
         cursor.execute("SELECT MAX(`transaction`) FROM transactions")
-        transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
+        temp = cursor.fetchone()['MAX(`transaction`)']
+        if temp == None:
+            temp = 0
+        transaction_id = temp + 1
 
     category_text = f"Uploaded from {src}"
     log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}. Transaction: {transaction_id}"
@@ -483,7 +485,8 @@ def match_fileset(data_arr, username=None):
     src = "dat" if author not in ["scan", "scummvm"] else author
     detection = (src == "scummvm")
     source_status = "detection" if detection else src
-    user = f'cli:{getpass.getuser()}' if username is None else username
+
+    conn.cursor().execute(f"SET @fileset_time_last = {int(time.time())}")
 
     with conn.cursor() as cursor:
         cursor.execute("SELECT MAX(`transaction`) FROM transactions")
@@ -491,6 +494,9 @@ def match_fileset(data_arr, username=None):
 
     category_text = f"Uploaded from {src}"
     log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Transaction: {transaction_id}"
+    
+    user = f'cli:{getpass.getuser()}' if username is None else username
+    create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
     for fileset in game_data:
         if detection:
@@ -517,26 +523,30 @@ def match_fileset(data_arr, username=None):
                 for file in fileset["rom"]:
                     for key, value in file.items():
                         if key not in ["name", "size"]:
-                            md5type = key
-                            checksum = value
+                            checksum = file[key]
+                            checktype = key
+                            checksize, checktype, checksum = get_checksum_props(checktype, checksum)
                             query = f"""SELECT DISTINCT fs.id AS fileset_id
                                         FROM fileset fs
                                         JOIN file f ON fs.id = f.fileset
                                         JOIN filechecksum fc ON f.id = fc.file
-                                        WHERE fc.checksum = '{checksum}' AND fc.checktype = '{md5type}'
-                                        AND fs.status IN ('detection', 'dat', 'scan', 'partialmatch', 'fullmatch')"""
+                                        WHERE fc.checksum = '{checksum}' AND fc.checktype = '{checktype}'
+                                        AND fs.status IN ('detection', 'dat', 'scan', 'partial', 'full', 'obsolete')"""
 
                             
                             cursor.execute(query)
                             records = cursor.fetchall()
-                            # print(records)
                             if records:
                                 for record in records:
                                     matched_map[record['fileset_id']] += 1
+                                break
                 
                 matched_list = sorted(matched_map.items(), key=lambda x: x[1], reverse=True)
                 if matched_list:
+                    is_full_matched = False
                     for matched_fileset_id, matched_count in matched_list:
+                        if is_full_matched:
+                            break
                         query = f"SELECT status FROM fileset WHERE id = {matched_fileset_id}"
 
                         cursor.execute(query)
@@ -547,14 +557,14 @@ def match_fileset(data_arr, username=None):
                         count = cursor.fetchone()['COUNT(file.id)']
 
                         # if matched fileset's status is detection
-                        if status == 'detection':
+                        if status == 'detection' or status == 'obsolete':
                                 
                             if count == matched_count:
-                        
                                 # full match
+                                is_full_matched = True
                                 cursor.execute(f"""
                                             UPDATE fileset SET 
-                                                    status = 'fullmatch', 
+                                                    status = 'full', 
                                                     `timestamp` = FROM_UNIXTIME({int(time.time())})
                                                 WHERE id = {matched_fileset_id}""")
                                 cursor.execute(f"SELECT * FROM file WHERE fileset = {matched_fileset_id}")
@@ -568,22 +578,21 @@ def match_fileset(data_arr, username=None):
                                         target_files_dict[checksum['checksum']] = target_file
                                 for file in fileset['rom']:
                                     file_exists = False
+                                    cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['md5']}', {matched_fileset_id}, {0})")
+                                    cursor.execute("SET @file_last = LAST_INSERT_ID()")
                                     for key, value in file.items():
-                                        print(key, value)
                                         if key not in ["name", "size"]:
+                                            insert_filechecksum(file, key, conn)
                                             scan_md5type = key
                                             scan_checksum = value
                                             if scan_checksum in target_files_dict.keys():
                                                 file_exists = True
                                                 cursor.execute(f"DELETE FROM file WHERE id = {target_files_dict[scan_checksum]['id']}")
                                                 break
-                                    print(file_exists)
-                                    cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{scan_checksum}', {matched_fileset_id}, {0})")
-                                    # TODO: insert filechecksum
                                                 
                                 # log
                                 category_text = f"Matched from {src}"
-                                log_text = f"Matched game {matched_fileset_id}. State fullmatch."
+                                log_text = f"Matched Fileset:{matched_fileset_id}. State full."
                                 create_log(escape_string(category_text), user, escape_string(log_text), conn)
                             
                             else:
@@ -598,6 +607,9 @@ def match_fileset(data_arr, username=None):
                             # if it's a dup
                             if len(fileset['rom']) == count:
                                 # TODO: log the dup msg
+                                category_text = f"Matched from {src}"
+                                log_text = f"Dup Fileset:{matched_fileset_id}. State full."
+                                create_log(escape_string(category_text), user, escape_string(log_text), conn)
                                 return
                         
                         elif status == 'partial':
@@ -666,12 +678,11 @@ def match_fileset(data_arr, username=None):
                 cur.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
                 fileset_insertion_count = cur.fetchone()['COUNT(fileset)']
                 category_text = f"Uploaded from {src}"
-                log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
+                log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
             except Exception as e:
                 print("Inserting failed:", e)
             else:
                 user = f'cli:{getpass.getuser()}' if username is None else username
                 create_log(escape_string(category_text), user, escape_string(log_text), conn)            
         finally:
-            conn.close()
-            
\ No newline at end of file
+            conn.close()
\ No newline at end of file
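
The None check added to db_insert above can also be pushed into SQL with COALESCE, which yields 0 when the transactions table is empty (an equivalent sketch, not the committed fix; DictCursor assumed as elsewhere in the codebase):

    with conn.cursor() as cursor:
        cursor.execute("SELECT COALESCE(MAX(`transaction`), 0) AS max_tx FROM transactions")
        transaction_id = cursor.fetchone()['max_tx'] + 1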


Commit: f48eb94f4ad9dfc7193d6ffd4b4293105b7dda85
    https://github.com/scummvm/scummvm-sites/commit/f48eb94f4ad9dfc7193d6ffd4b4293105b7dda85
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-03T20:13:34+08:00

Commit Message:
INTEGRITY: Update the detection_type and detection when merging

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index d080f8a..f31b4cb 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -556,109 +556,72 @@ def match_fileset(data_arr, username=None):
                         cursor.execute(query)
                         count = cursor.fetchone()['COUNT(file.id)']
 
-                        # if matched fileset's status is detection
-                        if status == 'detection' or status == 'obsolete':
-                                
-                            if count == matched_count:
-                                # full match
-                                is_full_matched = True
-                                cursor.execute(f"""
-                                            UPDATE fileset SET 
-                                                    status = 'full', 
-                                                    `timestamp` = FROM_UNIXTIME({int(time.time())})
-                                                WHERE id = {matched_fileset_id}""")
-                                cursor.execute(f"SELECT * FROM file WHERE fileset = {matched_fileset_id}")
-                                target_files = cursor.fetchall()
-                                
-                                target_files_dict = {}
-                                for target_file in target_files:
-                                    cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
-                                    target_checksums = cursor.fetchall()
-                                    for checksum in target_checksums:
-                                        target_files_dict[checksum['checksum']] = target_file
-                                for file in fileset['rom']:
-                                    file_exists = False
-                                    cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['md5']}', {matched_fileset_id}, {0})")
-                                    cursor.execute("SET @file_last = LAST_INSERT_ID()")
-                                    for key, value in file.items():
-                                        if key not in ["name", "size"]:
-                                            insert_filechecksum(file, key, conn)
-                                            scan_md5type = key
-                                            scan_checksum = value
-                                            if scan_checksum in target_files_dict.keys():
-                                                file_exists = True
-                                                cursor.execute(f"DELETE FROM file WHERE id = {target_files_dict[scan_checksum]['id']}")
-                                                break
-                                                
-                                # log
-                                category_text = f"Matched from {src}"
-                                log_text = f"Matched Fileset:{matched_fileset_id}. State full."
-                                create_log(escape_string(category_text), user, escape_string(log_text), conn)
-                            
-                            else:
-                                # not a full match
-                                if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=username):
-                                    for file in fileset["rom"]:
-                                        insert_file(file, detection, src, conn)
-                                        for key, value in file.items():
-                                            if key not in ["name", "size"]:
-                                                insert_filechecksum(file, key, conn)
-                        elif status == 'full':
-                            # if it's a dup
-                            if len(fileset['rom']) == count:
-                                # TODO: log the dup msg
-                                category_text = f"Matched from {src}"
-                                log_text = f"Dup Fileset:{matched_fileset_id}. State full."
-                                create_log(escape_string(category_text), user, escape_string(log_text), conn)
-                                return
-                        
-                        elif status == 'partial':
-                            # same as 'detection'
-                            if count == matched_count:
-                        
-                                # full match
-                                cursor.execute(f"""
-                                            UPDATE fileset SET 
-                                                    status = 'fullmatch', 
-                                                    `timestamp` = FROM_UNIXTIME({int(time.time())})
-                                                WHERE id = {matched_fileset_id}""")
-                                cursor.execute(f"SELECT * FROM file WHERE fileset = {matched_fileset_id}")
-                                target_files = cursor.fetchall()
-                                
-                                target_files_dict = {}
-                                for target_file in target_files:
-                                    cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
-                                    target_checksums = cursor.fetchall()
-                                    for checksum in target_checksums:
-                                        target_files_dict[checksum['checksum']] = target_file
-                                for file in fileset['rom']:
-                                    file_exists = False
-                                    for key, value in file.items():
-                                        print(key, value)
-                                        if key not in ["name", "size"]:
-                                            scan_md5type = key
-                                            scan_checksum = value
-                                            if scan_checksum in target_files_dict.keys():
-                                                file_exists = True
-                                                cursor.execute(f"DELETE FROM file WHERE id = {target_files_dict[scan_checksum]['id']}")
-                                                break
-                                    print(file_exists)
-                                    cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{scan_checksum}', {matched_fileset_id}, {0})")
-                                    # TODO: insert filechecksum
-                                                
-                                # log
-                                category_text = f"Matched from {src}"
-                                log_text = f"Matched game {matched_fileset_id}. State fullmatch."
-                                create_log(escape_string(category_text), user, escape_string(log_text), conn)
-                            
-                            else:
-                                # not a full match
-                                if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=username):
-                                    for file in fileset["rom"]:
-                                        insert_file(file, detection, src, conn)
-                                        for key, value in file.items():
-                                            if key not in ["name", "size"]:
-                                                insert_filechecksum(file, key, conn)
+            if status in ['detection', 'obsolete'] and count == matched_count:
+                is_full_matched = True
+                update_fileset_status(cursor, matched_fileset_id, 'full')
+                insert_files(fileset, matched_fileset_id, conn, detection)
+                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+            elif status == 'full' and len(fileset['rom']) == count:
+                is_full_matched == True
+                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+                return
+            elif status == 'partial' and count == matched_count:
+                update_fileset_status(cursor, matched_fileset_id, 'full')
+                insert_files(fileset, matched_fileset_id, conn, detection)
+                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+            elif status == 'scan' and count == matched_count:
+                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+                return
+            else:
+                insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
+
+def update_fileset_status(cursor, fileset_id, status):
+    cursor.execute(f"""
+        UPDATE fileset SET 
+            status = '{status}', 
+            `timestamp` = FROM_UNIXTIME({int(time.time())})
+        WHERE id = {fileset_id}
+    """)
+
+def insert_files(fileset, fileset_id, conn, detection):
+    with conn.cursor() as cursor:
+        cursor.execute(f"SELECT * FROM file WHERE fileset = {fileset_id}")
+        target_files = cursor.fetchall()
+        target_files_dict = {}
+        for target_file in target_files:
+            cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
+            target_checksums = cursor.fetchall()
+            for checksum in target_checksums:
+                target_files_dict[checksum['checksum']] = target_file
+                target_files_dict[target_file['id']] = f"{checksum['checktype']}-{checksum['checksize']}"
+        for file in fileset['rom']:
+            file_exists = False
+            cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['md5']}', {fileset_id}, {0})")
+            cursor.execute("SET @file_last = LAST_INSERT_ID()")
+            cursor.execute("SELECT @file_last AS file_id")
+            file_id = cursor.fetchone()['file_id']
+            target_id = None
+            for key, value in file.items():
+                if key not in ["name", "size"]:
+                    insert_filechecksum(file, key, conn)
+                    if value in target_files_dict and not file_exists:
+                        file_exists = True
+                        target_id = target_files_dict[value]['id']
+                        cursor.execute(f"DELETE FROM file WHERE id = {target_files_dict[value]['id']}")
+            
+            if file_exists:
+                cursor.execute(f"UPDATE file SET detection = 1 WHERE id = {file_id}")
+                cursor.execute(f"UPDATE file SET detection_type = '{target_files_dict[target_id]}' WHERE id = {file_id}")
+            else:
+                cursor.execute(f"UPDATE file SET detection_type = 'None' WHERE id = {file_id}")
+
+def insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user):
+    if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=user):
+        for file in fileset["rom"]:
+            insert_file(file, detection, src, conn)
+            for key, value in file.items():
+                if key not in ["name", "size"]:
+                    insert_filechecksum(file, key, conn)
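
Note: the new helpers above build SQL by interpolating values into f-strings.
A minimal sketch of update_fileset_status using pymysql's parameter binding
instead (placeholders are passed separately, so the driver handles quoting
and escaping):

    import time

    def update_fileset_status(cursor, fileset_id, status):
        # Values are bound by pymysql rather than formatted into the query.
        cursor.execute(
            "UPDATE fileset SET status = %s, `timestamp` = FROM_UNIXTIME(%s) WHERE id = %s",
            (status, int(time.time()), fileset_id),
        )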
 
 
 


Commit: 19f667cb08c3009b934aa9d7ed9d3d8f9e46137e
    https://github.com/scummvm/scummvm-sites/commit/19f667cb08c3009b934aa9d7ed9d3d8f9e46137e
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-03T20:14:26+08:00

Commit Message:
INTEGRITY: Add 'detection_type' column to 'file' table

Changed paths:
    db_functions.py
    fileset.py
    schema.py


diff --git a/db_functions.py b/db_functions.py
index f31b4cb..f09428c 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -137,7 +137,9 @@ def insert_file(file, detection, src, conn):
                 checksize, checktype, checksum = get_checksum_props(key, value)
                 break
 
-    query = f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection})"
+    if not detection:
+        checktype = "None"
+    query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{checktype}-{checksize}')"
     with conn.cursor() as cursor:
         cursor.execute(query)
 
@@ -464,10 +466,7 @@ def populate_matching_games():
             print("Updating matched games failed")
             
 def match_fileset(data_arr, username=None):
-    header = data_arr[0]
-    game_data = data_arr[1]
-    resources = data_arr[2]
-    filepath = data_arr[3]
+    header, game_data, resources, filepath = data_arr
 
     try:
         conn = db_connect()
@@ -481,7 +480,7 @@ def match_fileset(data_arr, username=None):
     except KeyError as e:
         print(f"Missing key in header: {e}")
         return
-    
+
     src = "dat" if author not in ["scan", "scummvm"] else author
     detection = (src == "scummvm")
     source_status = "detection" if detection else src
@@ -494,67 +493,75 @@ def match_fileset(data_arr, username=None):
 
     category_text = f"Uploaded from {src}"
     log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Transaction: {transaction_id}"
-    
+
     user = f'cli:{getpass.getuser()}' if username is None else username
     create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
     for fileset in game_data:
-        if detection:
-            engine_name = fileset["engine"]
-            engineid = fileset["sourcefile"]
-            gameid = fileset["name"]
-            title = fileset["title"]
-            extra = fileset["extra"]
-            platform = fileset["platform"]
-            lang = fileset["language"]
+        process_fileset(fileset, resources, detection, src, conn, transaction_id, filepath, author, version, source_status, user)
+    finalize_fileset_insertion(conn, transaction_id, src, filepath, author, version, source_status, user)
 
-            insert_game(engine_name, engineid, title, gameid, extra, platform, lang, conn)
-        elif src == "dat":
-            if 'romof' in fileset and fileset['romof'] in resources:
-                fileset["rom"] = fileset["rom"] + resources[fileset["romof"]]["rom"]
+def process_fileset(fileset, resources, detection, src, conn, transaction_id, filepath, author, version, source_status, user):
+    if detection:
+        insert_game_data(fileset, conn)
+    elif src == "dat" and 'romof' in fileset and fileset['romof'] in resources:
+        fileset["rom"] += resources[fileset["romof"]]["rom"]
 
-        key = calc_key(fileset) if not detection else ""
-        megakey = calc_megakey(fileset) if detection else ""
-        log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}."
+    key = calc_key(fileset) if not detection else ""
+    megakey = calc_megakey(fileset) if detection else ""
+    log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}."
 
-        matched_map = defaultdict(int)
-        try:
-            with conn.cursor() as cursor:   
-                for file in fileset["rom"]:
-                    for key, value in file.items():
-                        if key not in ["name", "size"]:
-                            checksum = file[key]
-                            checktype = key
-                            checksize, checktype, checksum = get_checksum_props(checktype, checksum)
-                            query = f"""SELECT DISTINCT fs.id AS fileset_id
-                                        FROM fileset fs
-                                        JOIN file f ON fs.id = f.fileset
-                                        JOIN filechecksum fc ON f.id = fc.file
-                                        WHERE fc.checksum = '{checksum}' AND fc.checktype = '{checktype}'
-                                        AND fs.status IN ('detection', 'dat', 'scan', 'partial', 'full', 'obsolete')"""
-
-                            
-                            cursor.execute(query)
-                            records = cursor.fetchall()
-                            if records:
-                                for record in records:
-                                    matched_map[record['fileset_id']] += 1
-                                break
-                
-                matched_list = sorted(matched_map.items(), key=lambda x: x[1], reverse=True)
-                if matched_list:
-                    is_full_matched = False
-                    for matched_fileset_id, matched_count in matched_list:
-                        if is_full_matched:
-                            break
-                        query = f"SELECT status FROM fileset WHERE id = {matched_fileset_id}"
-
-                        cursor.execute(query)
-                        status = cursor.fetchone()['status']
-
-                        query = f"SELECT COUNT(file.id) FROM file WHERE fileset = {matched_fileset_id}" 
-                        cursor.execute(query)
-                        count = cursor.fetchone()['COUNT(file.id)']
+    matched_map = find_matching_filesets(fileset, conn)
+
+    if matched_map:
+        handle_matched_filesets(matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
+    else:
+        insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
+
+def insert_game_data(fileset, conn):
+    engine_name = fileset["engine"]
+    engineid = fileset["sourcefile"]
+    gameid = fileset["name"]
+    title = fileset["title"]
+    extra = fileset["extra"]
+    platform = fileset["platform"]
+    lang = fileset["language"]
+    insert_game(engine_name, engineid, title, gameid, extra, platform, lang, conn)
+
+def find_matching_filesets(fileset, conn):
+    matched_map = defaultdict(int)
+    with conn.cursor() as cursor:
+        for file in fileset["rom"]:
+            for key, value in file.items():
+                if key not in ["name", "size"]:
+                    checksum = file[key]
+                    checktype = key
+                    checksize, checktype, checksum = get_checksum_props(checktype, checksum)
+                    query = f"""SELECT DISTINCT fs.id AS fileset_id
+                                FROM fileset fs
+                                JOIN file f ON fs.id = f.fileset
+                                JOIN filechecksum fc ON f.id = fc.file
+                                WHERE fc.checksum = '{checksum}' AND fc.checktype = '{checktype}'
+                                AND fs.status IN ('detection', 'dat', 'scan', 'partial', 'full', 'obsolete')"""
+                    cursor.execute(query)
+                    records = cursor.fetchall()
+                    if records:
+                        for record in records:
+                            matched_map[record['fileset_id']] += 1
+                        break
+    return matched_map
+
+def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user):
+    matched_list = sorted(matched_map.items(), key=lambda x: x[1], reverse=True)
+    is_full_matched = False
+    with conn.cursor() as cursor:
+        for matched_fileset_id, matched_count in matched_list:
+            if is_full_matched:
+                break
+            cursor.execute(f"SELECT status FROM fileset WHERE id = {matched_fileset_id}")
+            status = cursor.fetchone()['status']
+            cursor.execute(f"SELECT COUNT(file.id) FROM file WHERE fileset = {matched_fileset_id}")
+            count = cursor.fetchone()['COUNT(file.id)']
 
             if status in ['detection', 'obsolete'] and count == matched_count:
                 is_full_matched = True
@@ -623,29 +630,16 @@ def insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_
                 if key not in ["name", "size"]:
                     insert_filechecksum(file, key, conn)
 
+def log_matched_fileset(src, fileset_id, state, user, conn):
+    category_text = f"Matched from {src}"
+    log_text = f"Matched Fileset:{fileset_id}. State {state}."
+    create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
-
-                else:
-                    # no match
-                    if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=username):
-                        for file in fileset["rom"]:
-                            insert_file(file, detection, src, conn)
-                            for key, value in file.items():
-                                if key not in ["name", "size"]:
-                                    insert_filechecksum(file, key, conn)
-            if detection:
-                conn.cursor().execute("UPDATE fileset SET status = 'obsolete' WHERE `timestamp` != FROM_UNIXTIME(@fileset_time_last) AND status = 'detection'")
-            cur = conn.cursor()
-            
-            try:
-                cur.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
-                fileset_insertion_count = cur.fetchone()['COUNT(fileset)']
-                category_text = f"Uploaded from {src}"
-                log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
-            except Exception as e:
-                print("Inserting failed:", e)
-            else:
-                user = f'cli:{getpass.getuser()}' if username is None else username
-                create_log(escape_string(category_text), user, escape_string(log_text), conn)            
-        finally:
-            conn.close()
\ No newline at end of file
+def finalize_fileset_insertion(conn, transaction_id, src, filepath, author, version, source_status, user):
+    with conn.cursor() as cursor:
+        cursor.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
+        fileset_insertion_count = cursor.fetchone()['COUNT(fileset)']
+        category_text = f"Uploaded from {src}"
+        log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
+        create_log(escape_string(category_text), user, escape_string(log_text), conn)
+    conn.close()
diff --git a/fileset.py b/fileset.py
index 33a2f02..732f59f 100644
--- a/fileset.py
+++ b/fileset.py
@@ -145,7 +145,7 @@ def fileset():
             # Table
             html += "<table>\n"
 
-            cursor.execute(f"SELECT file.id, name, size, checksum, detection FROM file WHERE fileset = {id}")
+            cursor.execute(f"SELECT file.id, name, size, checksum, detection, detection_type FROM file WHERE fileset = {id}")
             result = cursor.fetchall()
 
             all_columns = list(result[0].keys()) if result else []
diff --git a/schema.py b/schema.py
index 5a85d55..ce09200 100644
--- a/schema.py
+++ b/schema.py
@@ -148,6 +148,12 @@ indices = {
     "fileset": "CREATE INDEX fileset ON history (fileset)"
 }
 
+try:
+    cursor.execute("ALTER TABLE file ADD COLUMN detection_type VARCHAR(20);")
+except Exception:
+    # if the column already exists, change the length of the column
+    cursor.execute("ALTER TABLE file MODIFY COLUMN detection_type VARCHAR(20);")
+
 for index, definition in indices.items():
     try:
         cursor.execute(definition)
@@ -196,7 +202,7 @@ def insert_random_data():
         cursor.execute("INSERT INTO transactions (`transaction`, fileset) VALUES (%s, %s)", 
                        (random.randint(1, 100), 1))
 # for testing locally
-insert_random_data()
+# insert_random_data()
 
 conn.commit()
 conn.close()
\ No newline at end of file
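
Note: the bare try/except around the ALTER TABLE swallows every error, not
just "duplicate column". A hedged alternative is to consult
information_schema first (a sketch only; it assumes the same cursor and that
the connection's default database is selected):

    cursor.execute("""
        SELECT COUNT(*) AS n FROM information_schema.COLUMNS
        WHERE TABLE_SCHEMA = DATABASE()
          AND TABLE_NAME = 'file' AND COLUMN_NAME = 'detection_type'
    """)
    if cursor.fetchone()['n'] == 0:  # assumes a DictCursor
        cursor.execute("ALTER TABLE file ADD COLUMN detection_type VARCHAR(20);")
    else:
        cursor.execute("ALTER TABLE file MODIFY COLUMN detection_type VARCHAR(20);")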


Commit: e75afd99855a7640e1ebe28188c46132c5bce6bf
    https://github.com/scummvm/scummvm-sites/commit/e75afd99855a7640e1ebe28188c46132c5bce6bf
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-05T20:06:06+08:00

Commit Message:
INTEGRITY: Clear the counters of DB

Changed paths:
    clear.py


diff --git a/clear.py b/clear.py
index 523481e..34b5c6f 100644
--- a/clear.py
+++ b/clear.py
@@ -1,21 +1,27 @@
 """
-This script deletes all data from the tables in the database.
+This script deletes all data from the tables in the database and resets auto-increment counters.
 Use it when testing data insertion.
 """
 
 import pymysql
 import json
 
-def delete_all_data(conn):
+def truncate_all_tables(conn):
     tables = ["filechecksum", "queue", "history", "transactions", "file", "fileset", "game", "engine", "log"]
     cursor = conn.cursor()
     
+    # Disable foreign key checks
+    cursor.execute("SET FOREIGN_KEY_CHECKS = 0")
+    
     for table in tables:
         try:
-            cursor.execute(f"DELETE FROM {table}")
-            print(f"Table '{table}' data deleted successfully")
+            cursor.execute(f"TRUNCATE TABLE `{table}`")
+            print(f"Table '{table}' truncated successfully")
         except pymysql.Error as err:
-            print(f"Error deleting data from table '{table}': {err}")
+            print(f"Error truncating table '{table}': {err}")
+    
+    # Enable foreign key checks
+    cursor.execute("SET FOREIGN_KEY_CHECKS = 1")
 
 if __name__ == "__main__":
     with open(__file__ + '/../mysql_config.json') as f:
@@ -42,8 +48,8 @@ if __name__ == "__main__":
         print("Error connecting to MySQL")
         exit(1)
 
-    # Delete all data from tables
-    delete_all_data(conn)
+    # Truncate all tables
+    truncate_all_tables(conn)
 
     # Close connection
     conn.close()
\ No newline at end of file
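
Note: if anything other than pymysql.Error escapes the loop, the session is
left with foreign key checks disabled. A small sketch guarding the re-enable
with try/finally (same `tables` and `cursor` as above):

    cursor.execute("SET FOREIGN_KEY_CHECKS = 0")
    try:
        for table in tables:
            cursor.execute(f"TRUNCATE TABLE `{table}`")
    finally:
        cursor.execute("SET FOREIGN_KEY_CHECKS = 1")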


Commit: 3f57f0990ee0e55530c9306edcb27899828ad744
    https://github.com/scummvm/scummvm-sites/commit/3f57f0990ee0e55530c9306edcb27899828ad744
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-05T20:06:31+08:00

Commit Message:
INTEGRITY: Add --skiplog option to dat parser

Changed paths:
    dat_parser.py


diff --git a/dat_parser.py b/dat_parser.py
index 5dd206f..11a57b2 100644
--- a/dat_parser.py
+++ b/dat_parser.py
@@ -131,11 +131,11 @@ def main():
 
     if args.upload:
         for filepath in args.upload:
-            db_insert(parse_dat(filepath), args.user)
+            db_insert(parse_dat(filepath), args.user, args.skiplog)
 
     if args.match:
         for filepath in args.match:
             match_fileset(parse_dat(filepath), args.user)
 
 if __name__ == "__main__":
-    main()
\ No newline at end of file
+    main()
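
Note: the hunk passes args.skiplog but the flag's declaration is not part of
this diff. The matching argparse definition is presumably a boolean switch
along these lines (flag name taken from the code, help text assumed):

    parser.add_argument("--skiplog", action="store_true",
                        help="skip writing per-fileset log entries")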


Commit: 56c370c8c7efee81d03c4a03663a6905f89aa531
    https://github.com/scummvm/scummvm-sites/commit/56c370c8c7efee81d03c4a03663a6905f89aa531
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-05T20:07:39+08:00

Commit Message:
INTEGRITY: Insert set.dat to DB

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index f09428c..0ed47a6 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -66,7 +66,7 @@ def insert_game(engine_name, engineid, title, gameid, extra, platform, lang, con
         cursor.execute(f"INSERT INTO game (name, engine, gameid, extra, platform, language) VALUES ('{escape_string(title)}', @engine_last, '{gameid}', '{escape_string(extra)}', '{platform}', '{lang}')")
         cursor.execute("SET @game_last = LAST_INSERT_ID()")
 
-def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip='', username=None):
+def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip='', username=None, skiplog=None):
     status = "detection" if detection else src
     game = "NULL"
     key = "NULL" if key == "" else f"'{key}'"
@@ -99,7 +99,8 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         category_text = f"Updated Fileset:{existing_entry}"
         log_text = f"Updated Fileset:{existing_entry}, {log_text}"
         user = f'cli:{getpass.getuser()}' if username is None else username
-        create_log(escape_string(category_text), user, escape_string(log_text), conn)
+        if not skiplog:
+            create_log(escape_string(category_text), user, escape_string(log_text), conn)
 
         return True
 
@@ -119,7 +120,8 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         log_text = f"Created Fileset:{fileset_last}, from user IP {ip}, {log_text}"
 
     user = f'cli:{getpass.getuser()}' if username is None else username
-    create_log(escape_string(category_text), user, escape_string(log_text), conn)
+    if not skiplog:
+        create_log(escape_string(category_text), user, escape_string(log_text), conn)
     with conn.cursor() as cursor:
         cursor.execute(f"INSERT INTO transactions (`transaction`, fileset) VALUES ({transaction}, {fileset_last})")
 
@@ -206,7 +208,7 @@ def calc_megakey(fileset):
     key_string = key_string.strip(':')
     return hashlib.md5(key_string.encode()).hexdigest()
 
-def db_insert(data_arr, username=None):
+def db_insert(data_arr, username=None, skiplog=False):
     header = data_arr[0]
     game_data = data_arr[1]
     resources = data_arr[2]
@@ -264,7 +266,7 @@ def db_insert(data_arr, username=None):
         megakey = calc_megakey(fileset) if detection else ""
         log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {status}."
 
-        if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=username):
+        if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=username, skiplog=skiplog):
             for file in fileset["rom"]:
                 insert_file(file, detection, src, conn)
                 for key, value in file.items():
@@ -510,8 +512,10 @@ def process_fileset(fileset, resources, detection, src, conn, transaction_id, fi
     key = calc_key(fileset) if not detection else ""
     megakey = calc_megakey(fileset) if detection else ""
     log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}."
-
-    matched_map = find_matching_filesets(fileset, conn)
+    if src != "dat":
+        matched_map = find_matching_filesets(fileset, conn)
+    else:
+        matched_map = matching_set(fileset, conn)
 
     if matched_map:
         handle_matched_filesets(matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
@@ -551,6 +555,30 @@ def find_matching_filesets(fileset, conn):
                         break
     return matched_map
 
+def matching_set(fileset, conn):
+    matched_map = defaultdict(int)
+    with conn.cursor() as cursor:
+        for file in fileset["rom"]:
+            if "md5" in file:
+                checksum = file["md5"]
+                size = file["size"]
+                query = f"""
+                    SELECT DISTINCT fs.id AS fileset_id
+                    FROM fileset fs
+                    JOIN file f ON fs.id = f.fileset
+                    JOIN filechecksum fc ON f.id = fc.file
+                    WHERE fc.checksum = '{checksum}' AND fc.checktype = 'md5'
+                    AND f.size > {size}
+                    AND fs.status = 'detection'
+                """
+                cursor.execute(query)
+                records = cursor.fetchall()
+                if records:
+                    for record in records:
+                        matched_map[record['fileset_id']] += 1
+                    break
+    return matched_map
+
 def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user):
     matched_list = sorted(matched_map.items(), key=lambda x: x[1], reverse=True)
     is_full_matched = False
@@ -565,9 +593,9 @@ def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, meg
 
             if status in ['detection', 'obsolete'] and count == matched_count:
                 is_full_matched = True
-                update_fileset_status(cursor, matched_fileset_id, 'full')
+                update_fileset_status(cursor, matched_fileset_id, 'full' if src != "dat" else "partial")
                 insert_files(fileset, matched_fileset_id, conn, detection)
-                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+                log_matched_fileset(src, matched_fileset_id, 'full' if src != "dat" else "partial", user, conn)
             elif status == 'full' and len(fileset['rom']) == count:
                 is_full_matched = True
                 log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
@@ -579,6 +607,8 @@ def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, meg
             elif status == 'scan' and count == matched_count:
                 log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
                 return
+            elif src == 'dat':
+                log_matched_fileset(src, matched_fileset_id, 'partial matched', user, conn)
             else:
                 insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
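
Note: the src == "dat" branches above encode a small policy: a set.dat match
only promotes a detection entry to 'partial', while other sources can confirm
'full'. Pulled out as a tiny helper for readability (a sketch, not code from
the commit):

    def promoted_state(src):
        # set.dat lacks full checksum coverage, so it can only claim 'partial'
        return "partial" if src == "dat" else "full"

    update_fileset_status(cursor, matched_fileset_id, promoted_state(src))
    log_matched_fileset(src, matched_fileset_id, promoted_state(src), user, conn)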
 


Commit: 9d6b4ed5d7a600ada5e4dcc5b9cfcba04403b0c0
    https://github.com/scummvm/scummvm-sites/commit/9d6b4ed5d7a600ada5e4dcc5b9cfcba04403b0c0
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-08T19:26:20+08:00

Commit Message:
INTEGRITY: Show candidates and add merge button to fileset page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 732f59f..da4d371 100644
--- a/fileset.py
+++ b/fileset.py
@@ -7,6 +7,8 @@ from user_fileset_functions import user_calc_key, file_json_to_array, user_inser
 from pagination import create_page
 import difflib
 from pymysql.converters import escape_string
+from db_functions import find_matching_filesets
+from collections import defaultdict
 
 app = Flask(__name__)
 
@@ -103,6 +105,7 @@ def fileset():
         <table>
         """
             html += f"<td><button onclick=\"location.href='/fileset/{id}/merge'\">Merge</button></td>"
+            html += f"<td><button onclick=\"location.href='/fileset/{id}/match'\">Match</button></td>"
 
             cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
             result = cursor.fetchone()
@@ -233,6 +236,91 @@ def fileset():
             return render_template_string(html)
     finally:
         connection.close()
+
+@app.route('/fileset/<int:id>/match', methods=['GET'])
+def match_fileset_route(id):
+    with open('mysql_config.json') as f:
+        mysql_cred = json.load(f)
+
+    connection = pymysql.connect(host=mysql_cred["servername"],
+                                 user=mysql_cred["username"],
+                                 password=mysql_cred["password"],
+                                 db=mysql_cred["dbname"],
+                                 charset='utf8mb4',
+                                 cursorclass=pymysql.cursors.DictCursor)
+
+    try:
+        with connection.cursor() as cursor:
+            cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
+            fileset = cursor.fetchone()
+            if not fileset:
+                return f"No fileset found with id {id}", 404
+            fileset['rom'] = []
+
+            cursor.execute(f"SELECT file.id, name, size, checksum, detection, detection_type FROM file WHERE fileset = {id}")
+            result = cursor.fetchall()
+            file_ids = {}
+            for file in result:
+                file_ids[file['id']] = (file['name'], file['size'])
+            cursor.execute(f"SELECT file, checksum, checksize, checktype FROM filechecksum WHERE file IN ({','.join(map(str, file_ids.keys()))})")
+            
+            files = cursor.fetchall()
+            checksum_dict = defaultdict(list)
+            print(files)
+            for i in files:
+                checksum_dict[file_ids[i["file"]][0]].append((i["checksum"], i["checksize"], i["checktype"]))
+            print(checksum_dict)
+            for i in files:
+                temp_dict = {}
+                temp_dict["name"] = file_ids[i["file"]][0]
+                temp_dict["size"] = file_ids[i["file"]][1]
+                for checksum in checksum_dict[temp_dict["name"]]:
+                    temp_dict[f"{checksum[2]}-{checksum[1]}"] = checksum[0]
+                fileset["rom"].append(temp_dict)
+
+            matched_map = find_matching_filesets(fileset, connection)
+
+            html = f"""
+            <!DOCTYPE html>
+            <html>
+            <head>
+                <link rel="stylesheet" type="text/css" href="{{{{ url_for('static', filename='style.css') }}}}">
+            </head>
+            <body>
+            <h2>Matched Filesets for Fileset: {id}</h2>
+            <table>
+            <tr>
+                <th>Fileset ID</th>
+                <th>Match Count</th>
+                <th>Actions</th>
+            </tr>
+            """
+
+            for fileset_id, match_count in matched_map.items():
+                html += f"""
+                <tr>
+                    <td>{fileset_id}</td>
+                    <td>{match_count}</td>
+                    <td><a href="/fileset?id={fileset_id}">View Details</a></td>
+                    <td>
+                        <form method="POST" action="/fileset/{id}/merge/confirm">
+                            <input type="hidden" name="source_id" value="{id}">
+                            <input type="hidden" name="target_id" value="{fileset_id}">
+                            <input type="submit" value="Merge">
+                        </form>
+                    </td>
+                    <td>
+                        <form method="GET" action="/fileset?id={id}">
+                            <input type="submit" value="Cancel">
+                        </form>
+                    </td>
+                </tr>
+                """
+
+            html += "</table></body></html>"
+            return render_template_string(html)
+    finally:
+        connection.close()
         
 @app.route('/fileset/<int:id>/merge', methods=['GET', 'POST'])
 def merge_fileset(id):
@@ -322,7 +410,7 @@ def merge_fileset(id):
     
 @app.route('/fileset/<int:id>/merge/confirm', methods=['GET', 'POST'])
 def confirm_merge(id):
-    target_id = request.args.get('target_id', type=int)
+    target_id = request.args.get('target_id', type=int) if request.method == 'GET' else request.form.get('target_id')
 
     with open('mysql_config.json') as f:
         mysql_cred = json.load(f)
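
Note: the GET/POST split for target_id above can be collapsed with Flask's
request.values, which merges query-string and form data and also coerces the
form value to int (the POST branch as written leaves it a string):

    target_id = request.values.get('target_id', type=int)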


Commit: 7cab1a4d9235dfd2ac08b5d1f0a2759cb9c28ade
    https://github.com/scummvm/scummvm-sites/commit/7cab1a4d9235dfd2ac08b5d1f0a2759cb9c28ade
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-09T20:14:13+08:00

Commit Message:
INTEGRITY: Add history func

Changed paths:
    db_functions.py
    fileset.py


diff --git a/db_functions.py b/db_functions.py
index 0ed47a6..c936365 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -100,8 +100,9 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         log_text = f"Updated Fileset:{existing_entry}, {log_text}"
         user = f'cli:{getpass.getuser()}' if username is None else username
         if not skiplog:
-            create_log(escape_string(category_text), user, escape_string(log_text), conn)
-
+            log_last = create_log(escape_string(category_text), user, escape_string(log_text), conn)
+            update_history(existing_entry, existing_entry, conn, log_last)
+            
         return True
 
     # $game and $key should not be parsed as a mysql string, hence no quotes
@@ -121,7 +122,8 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
 
     user = f'cli:{getpass.getuser()}' if username is None else username
     if not skiplog:
-        create_log(escape_string(category_text), user, escape_string(log_text), conn)
+        log_last = create_log(escape_string(category_text), user, escape_string(log_text), conn)
+        update_history(fileset_last, fileset_last, conn, log_last)
     with conn.cursor() as cursor:
         cursor.execute(f"INSERT INTO transactions (`transaction`, fileset) VALUES ({transaction}, {fileset_last})")
 
@@ -183,6 +185,21 @@ def create_log(category, user, text, conn):
             log_last = cursor.fetchone()['LAST_INSERT_ID()']
     return log_last
 
+def update_history(source_id, target_id, conn, log_last=None):
+    query = f"INSERT INTO history (`timestamp`, fileset, oldfileset, log) VALUES (NOW(), {target_id}, {source_id}, {log_last})"
+    with conn.cursor() as cursor:
+        try:
+            cursor.execute(query)
+            conn.commit()
+        except Exception as e:
+            conn.rollback()
+            print(f"Creating log failed: {e}")
+            log_last = None
+        else:
+            cursor.execute("SELECT LAST_INSERT_ID()")
+            log_last = cursor.fetchone()['LAST_INSERT_ID()']
+    return log_last
+
 def calc_key(fileset):
     key_string = ""
 
@@ -663,7 +680,8 @@ def insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_
 def log_matched_fileset(src, fileset_id, state, user, conn):
     category_text = f"Matched from {src}"
     log_text = f"Matched Fileset:{fileset_id}. State {state}."
-    create_log(escape_string(category_text), user, escape_string(log_text), conn)
+    log_last = create_log(escape_string(category_text), user, escape_string(log_text), conn)
+    update_history(fileset_id, fileset_id, conn, log_last)
 
 def finalize_fileset_insertion(conn, transaction_id, src, filepath, author, version, source_status, user):
     with conn.cursor() as cursor:
diff --git a/fileset.py b/fileset.py
index da4d371..0b2cf8c 100644
--- a/fileset.py
+++ b/fileset.py
@@ -7,7 +7,7 @@ from user_fileset_functions import user_calc_key, file_json_to_array, user_inser
 from pagination import create_page
 import difflib
 from pymysql.converters import escape_string
-from db_functions import find_matching_filesets
+from db_functions import find_matching_filesets, update_history
 from collections import defaultdict
 
 app = Flask(__name__)
@@ -210,6 +210,8 @@ def fileset():
 
             # Generate the HTML for the fileset history
             cursor.execute(f"SELECT `timestamp`, category, `text`, id FROM log WHERE `text` REGEXP 'Fileset:{id}' ORDER BY `timestamp` DESC, id DESC")
+            # cursor.execute(f"SELECT `timestamp`, fileset, oldfileset FROM history WHERE fileset = {id} ORDER BY `timestamp` DESC")
+            
             logs = cursor.fetchall()
 
             html += "<h3>Fileset history</h3>"
@@ -220,18 +222,29 @@ def fileset():
             html += "<th>Log ID</th>\n"
             cursor.execute("SELECT * FROM history")
             history = cursor.fetchall()
-
+            print(f"History: {history}")
             oldfilesets = [history_row['oldfileset'] for history_row in history]
             cursor.execute(f"""SELECT `timestamp`, category, `text`, id FROM log WHERE `text` LIKE 'Fileset:%' AND `category` NOT LIKE 'merge%' AND `text` REGEXP 'Fileset:({"|".join(map(str, oldfilesets))})' ORDER BY `timestamp` DESC, id DESC""")
             logs = cursor.fetchall()
-
-            for log in logs:
+            
+            for h in history:
+                cursor.execute(f"SELECT `timestamp`, category, `text`, id FROM log WHERE `text` LIKE 'Fileset:{h['oldfileset']}' ORDER BY `timestamp` DESC, id DESC")
+                # logs.extend(cursor.fetchall())
+                print(f"Logs: {logs}")
                 html += "<tr>\n"
-                html += f"<td>{log['timestamp']}</td>\n"
-                html += f"<td>{log['category']}</td>\n"
-                html += f"<td>{log['text']}</td>\n"
-                html += f"<td><a href='logs?id={log['id']}'>{log['id']}</a></td>\n"
+                html += f"<td>{h['timestamp']}</td>\n"
+                html += f"<td>merge</td>\n"
+                html += f"<td><a href='fileset?id={h['oldfileset']}'>Fileset {h['oldfileset']}</a> merged into fileset <a href='fileset?id={h['fileset']}'>Fileset {h['fileset']}</a></td>\n"
+                html += f"<td><a href='logs?id={h['id']}'>{h['id']}</a></td>\n"
                 html += "</tr>\n"
+
+            # for log in logs:
+            #     html += "<tr>\n"
+            #     html += f"<td>{log['timestamp']}</td>\n"
+            #     html += f"<td>{log['category']}</td>\n"
+            #     html += f"<td>{log['text']}</td>\n"
+            #     html += f"<td><a href='logs?id={log['id']}'>{log['id']}</a></td>\n"
+            #     html += "</tr>\n"
             html += "</table>\n"
             return render_template_string(html)
     finally:
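
Note: when log_last is None, the f-string in update_history renders the
literal word None into the INSERT, which MySQL rejects. A sketch using bound
parameters, where pymysql maps Python None to SQL NULL:

    cursor.execute(
        "INSERT INTO history (`timestamp`, fileset, oldfileset, log) "
        "VALUES (NOW(), %s, %s, %s)",
        (target_id, source_id, log_last),
    )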


Commit: 06271a9ff634c11ef1dc7d3ef52302964423e9e5
    https://github.com/scummvm/scummvm-sites/commit/06271a9ff634c11ef1dc7d3ef52302964423e9e5
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-10T20:24:12+08:00

Commit Message:
INTEGRITY: Implement history table

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 0b2cf8c..39ef02b 100644
--- a/fileset.py
+++ b/fileset.py
@@ -40,6 +40,7 @@ def index():
         <li><a href="{{ url_for('fileset') }}">Fileset</a></li>
         <li><a href="{{ url_for('user_games_list') }}">User Games List</a></li>
         <li><a href="{{ url_for('games_list') }}">Games List</a></li>
+        <li><a href="{{ url_for('fileset_search') }}">Fileset Search</a></li>
     </ul>
     <h2>Logs</h2>
     <ul>
@@ -52,8 +53,8 @@ def index():
 
 @app.route('/fileset', methods=['GET', 'POST'])
 def fileset():
-    id = request.args.get('id', default = 1, type = int)
-    widetable = request.args.get('widetable', default = 'false', type = str)
+    id = request.args.get('id', default=1, type=int)
+    widetable = request.args.get('widetable', default='false', type=str)
     # Load MySQL credentials from a JSON file
     with open('mysql_config.json') as f:
         mysql_cred = json.load(f)
@@ -95,15 +96,15 @@ def fileset():
 
             # Display fileset details
             html = f"""
-        <!DOCTYPE html>
-        <html>
-        <head>
-            <link rel="stylesheet" type="text/css" href="{{{{ url_for('static', filename='style.css') }}}}">
-        </head>
-        <body>
-        <h2><u>Fileset: {id}</u></h2>
-        <table>
-        """
+            <!DOCTYPE html>
+            <html>
+            <head>
+                <link rel="stylesheet" type="text/css" href="{{{{ url_for('static', filename='style.css') }}}}">
+            </head>
+            <body>
+            <h2><u>Fileset: {id}</u></h2>
+            <table>
+            """
             html += f"<td><button onclick=\"location.href='/fileset/{id}/merge'\">Merge</button></td>"
             html += f"<td><button onclick=\"location.href='/fileset/{id}/match'\">Match</button></td>"
 
@@ -219,32 +220,29 @@ def fileset():
             html += "<th>Timestamp</th>\n"
             html += "<th>Category</th>\n"
             html += "<th>Description</th>\n"
-            html += "<th>Log ID</th>\n"
-            cursor.execute("SELECT * FROM history")
+            html += "<th>Log Text</th>\n"
+
+            cursor.execute(f"SELECT * FROM history WHERE fileset = {id} OR oldfileset = {id}")
             history = cursor.fetchall()
             print(f"History: {history}")
-            oldfilesets = [history_row['oldfileset'] for history_row in history]
-            cursor.execute(f"""SELECT `timestamp`, category, `text`, id FROM log WHERE `text` LIKE 'Fileset:%' AND `category` NOT LIKE 'merge%' AND `text` REGEXP 'Fileset:({"|".join(map(str, oldfilesets))})' ORDER BY `timestamp` DESC, id DESC""")
-            logs = cursor.fetchall()
-            
+
             for h in history:
                 cursor.execute(f"SELECT `timestamp`, category, `text`, id FROM log WHERE `text` LIKE 'Fileset:{h['oldfileset']}' ORDER BY `timestamp` DESC, id DESC")
-                # logs.extend(cursor.fetchall())
+                logs = cursor.fetchall()
                 print(f"Logs: {logs}")
                 html += "<tr>\n"
                 html += f"<td>{h['timestamp']}</td>\n"
                 html += f"<td>merge</td>\n"
                 html += f"<td><a href='fileset?id={h['oldfileset']}'>Fileset {h['oldfileset']}</a> merged into fileset <a href='fileset?id={h['fileset']}'>Fileset {h['fileset']}</a></td>\n"
-                html += f"<td><a href='logs?id={h['id']}'>{h['id']}</a></td>\n"
+                # html += f"<td><a href='logs?id={h['log']}'>Log {h['log']}</a></td>\n"
+                if h['log']:
+                    cursor.execute(f"SELECT `text` FROM log WHERE id = {h['log']}")
+                    log_text = cursor.fetchone()['text']
+                    html += f"<td><a href='logs?id={h['log']}'>Log {h['log']}</a>{log_text}</td>\n"
+                else:
+                    html += "<td>No log available</td>\n"
                 html += "</tr>\n"
 
-            # for log in logs:
-            #     html += "<tr>\n"
-            #     html += f"<td>{log['timestamp']}</td>\n"
-            #     html += f"<td>{log['category']}</td>\n"
-            #     html += f"<td>{log['text']}</td>\n"
-            #     html += f"<td><a href='logs?id={log['id']}'>{log['id']}</a></td>\n"
-            #     html += "</tr>\n"
             html += "</table>\n"
             return render_template_string(html)
     finally:
@@ -752,6 +750,27 @@ def logs():
     }
     return render_template_string(create_page(filename, 25, records_table, select_query, order, filters))
 
+@app.route('/fileset_search')
+def fileset_search():
+    filename = "fileset_search"
+    records_table = "fileset"
+    select_query = """
+    SELECT extra, platform, language, game.name, megakey,
+    status, fileset.id as fileset
+    FROM fileset
+    JOIN game ON game.id = fileset.game
+    """
+    order = "ORDER BY fileset.id"
+    filters = {
+        "fileset": "fileset",
+        "name": "game",
+        "extra": "game",
+        "platform": "game",
+        "language": "game",
+        "megakey": "fileset",
+        "status": "fileset"
+    }
+    return render_template_string(create_page(filename, 25, records_table, select_query, order, filters))
 
 if __name__ == '__main__':
     app.run()
\ No newline at end of file
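
Note: the history loop above issues one extra SELECT per row to fetch the
log text. A single LEFT JOIN returns the same data in one round trip (a
sketch against the same schema):

    cursor.execute(
        """SELECT h.`timestamp`, h.fileset, h.oldfileset, h.log,
                  l.`text` AS log_text
           FROM history h
           LEFT JOIN log l ON l.id = h.log
           WHERE h.fileset = %s OR h.oldfileset = %s
           ORDER BY h.`timestamp` DESC""",
        (id, id),
    )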


Commit: c1fbcb85c40af1d2b4bc85f1893b3444fe18233d
    https://github.com/scummvm/scummvm-sites/commit/c1fbcb85c40af1d2b4bc85f1893b3444fe18233d
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-11T19:33:00+08:00

Commit Message:
INTEGRITY: Add hyperlinks to the log content

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 39ef02b..7bde35a 100644
--- a/fileset.py
+++ b/fileset.py
@@ -7,7 +7,7 @@ from user_fileset_functions import user_calc_key, file_json_to_array, user_inser
 from pagination import create_page
 import difflib
 from pymysql.converters import escape_string
-from db_functions import find_matching_filesets, update_history
+from db_functions import find_matching_filesets, get_all_related_filesets, convert_log_text_to_links
 from collections import defaultdict
 
 app = Flask(__name__)
@@ -105,8 +105,8 @@ def fileset():
             <h2><u>Fileset: {id}</u></h2>
             <table>
             """
-            html += f"<td><button onclick=\"location.href='/fileset/{id}/merge'\">Merge</button></td>"
-            html += f"<td><button onclick=\"location.href='/fileset/{id}/match'\">Match</button></td>"
+            html += f"<td><button onclick=\"location.href='/fileset/{id}/merge'\">Manual Merge</button></td>"
+            html += f"<td><button onclick=\"location.href='/fileset/{id}/match'\">Match and Merge</button></td>"
 
             cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
             result = cursor.fetchone()
@@ -222,7 +222,9 @@ def fileset():
             html += "<th>Description</th>\n"
             html += "<th>Log Text</th>\n"
 
-            cursor.execute(f"SELECT * FROM history WHERE fileset = {id} OR oldfileset = {id}")
+            related_filesets = get_all_related_filesets(id, conn)
+
+            cursor.execute(f"SELECT * FROM history WHERE fileset IN ({','.join(map(str, related_filesets))}) OR oldfileset IN ({','.join(map(str, related_filesets))})")
             history = cursor.fetchall()
             print(f"History: {history}")
 
@@ -238,7 +240,8 @@ def fileset():
                 if h['log']:
                     cursor.execute(f"SELECT `text` FROM log WHERE id = {h['log']}")
                     log_text = cursor.fetchone()['text']
-                    html += f"<td><a href='logs?id={h['log']}'>Log {h['log']}</a>{log_text}</td>\n"
+                    log_text = convert_log_text_to_links(log_text)
+                    html += f"<td><a href='logs?id={h['log']}'>Log {h['log']}</a>: {log_text}</td>\n"
                 else:
                     html += "<td>No log available</td>\n"
                 html += "</tr>\n"
@@ -308,6 +311,8 @@ def match_fileset_route(id):
             """
 
             for fileset_id, match_count in matched_map.items():
+                if fileset_id == id:
+                    continue
                 html += f"""
                 <tr>
                     <td>{fileset_id}</td>


Commit: a01002860b3d34996d0913da61e39c15b6bda6ff
    https://github.com/scummvm/scummvm-sites/commit/a01002860b3d34996d0913da61e39c15b6bda6ff
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-11T19:33:36+08:00

Commit Message:
INTEGRITY: Recursively query the fileset logs

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index c936365..a28483b 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -7,6 +7,7 @@ import hashlib
 import os
 from pymysql.converters import escape_string
 from collections import defaultdict
+import re
 
 def db_connect():
     with open('mysql_config.json') as f:
@@ -118,12 +119,14 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
 
     log_text = f"Created Fileset:{fileset_last}, {log_text}"
     if src == 'user':
-        log_text = f"Created Fileset:{fileset_last}, from user IP {ip}, {log_text}"
+        log_text = f"Created Fileset:{fileset_last}, from user: IP {ip}, {log_text}"
 
     user = f'cli:{getpass.getuser()}' if username is None else username
     if not skiplog:
         log_last = create_log(escape_string(category_text), user, escape_string(log_text), conn)
         update_history(fileset_last, fileset_last, conn, log_last)
+    else:
+        update_history(fileset_last, fileset_last, conn)
     with conn.cursor() as cursor:
         cursor.execute(f"INSERT INTO transactions (`transaction`, fileset) VALUES ({transaction}, {fileset_last})")
 
@@ -186,7 +189,7 @@ def create_log(category, user, text, conn):
     return log_last
 
 def update_history(source_id, target_id, conn, log_last=None):
-    query = f"INSERT INTO history (`timestamp`, fileset, oldfileset, log) VALUES (NOW(), {target_id}, {source_id}, {log_last})"
+    query = f"INSERT INTO history (`timestamp`, fileset, oldfileset, log) VALUES (NOW(), {target_id}, {source_id}, {log_last if log_last is not None else 0})"
     with conn.cursor() as cursor:
         try:
             cursor.execute(query)
@@ -200,6 +203,34 @@ def update_history(source_id, target_id, conn, log_last=None):
             log_last = cursor.fetchone()['LAST_INSERT_ID()']
     return log_last
 
+def get_all_related_filesets(fileset_id, conn, visited=None):
+    if visited is None:
+        visited = set()
+
+    if fileset_id in visited:
+        return []
+    
+    visited.add(fileset_id)
+
+    related_filesets = [fileset_id]
+    with conn.cursor() as cursor:
+        cursor.execute(f"SELECT fileset, oldfileset FROM history WHERE fileset = {fileset_id} OR oldfileset = {fileset_id}")
+        history_records = cursor.fetchall()
+
+    for record in history_records:
+        if record['fileset'] not in visited:
+            related_filesets.extend(get_all_related_filesets(record['fileset'], conn, visited))
+        if record['oldfileset'] not in visited:
+            related_filesets.extend(get_all_related_filesets(record['oldfileset'], conn, visited))
+
+    return related_filesets
+
+def convert_log_text_to_links(log_text):
+    log_text = re.sub(r'Fileset:(\d+)', r'<a href="/fileset?id=\1">Fileset:\1</a>', log_text)
+    log_text = re.sub(r'user:(\w+)', r'<a href="/log?search=user:\1">user:\1</a>', log_text)
+    log_text = re.sub(r'Transaction:(\d+)', r'<a href="/transaction?id=\1">Transaction:\1</a>', log_text)
+    return log_text
+
 def calc_key(fileset):
     key_string = ""
 
@@ -553,6 +584,7 @@ def find_matching_filesets(fileset, conn):
     matched_map = defaultdict(int)
     with conn.cursor() as cursor:
         for file in fileset["rom"]:
+            matched_set = set()
             for key, value in file.items():
                 if key not in ["name", "size"]:
                     checksum = file[key]
@@ -568,8 +600,12 @@ def find_matching_filesets(fileset, conn):
                     records = cursor.fetchall()
                     if records:
                         for record in records:
-                            matched_map[record['fileset_id']] += 1
-                        break
+                            matched_set.add(record['fileset_id'])
+
+            for id in matched_set:
+                matched_map[id] += 1
+                        
+    print(matched_map)
     return matched_map
 
 def matching_set(fileset, conn):
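
Note: a quick example of what convert_log_text_to_links produces, given the
three substitutions defined above:

    >>> convert_log_text_to_links("Matched Fileset:42. Transaction:7")
    'Matched <a href="/fileset?id=42">Fileset:42</a>. <a href="/transaction?id=7">Transaction:7</a>'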


Commit: e4e861d3d278d55132132cbd41a72a8b01e4b06d
    https://github.com/scummvm/scummvm-sites/commit/e4e861d3d278d55132132cbd41a72a8b01e4b06d
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-12T20:18:49+08:00

Commit Message:
INTEGRITY: Fix duplicate count when displaying matched list

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 7bde35a..2d6536d 100644
--- a/fileset.py
+++ b/fileset.py
@@ -279,18 +279,24 @@ def match_fileset_route(id):
             cursor.execute(f"SELECT file, checksum, checksize, checktype FROM filechecksum WHERE file IN ({','.join(map(str, file_ids.keys()))})")
             
             files = cursor.fetchall()
-            checksum_dict = defaultdict(list)
-            print(files)
-            for i in files:
-                checksum_dict[file_ids[i["file"]][0]].append((i["checksum"], i["checksize"], i["checktype"]))
-            print(checksum_dict)
+            checksum_dict = defaultdict(lambda: {"name": "", "size": 0, "checksums": {}})
+
             for i in files:
-                temp_dict = {}
-                temp_dict["name"] = file_ids[i["file"]][0]
-                temp_dict["size"] = file_ids[i["file"]][1]
-                for checksum in checksum_dict[temp_dict["name"]]:
-                    temp_dict[f"{checksum[2]}-{checksum[1]}"] = checksum[0]
-                fileset["rom"].append(temp_dict)
+                file_id = i["file"]
+                file_name, file_size = file_ids[file_id]
+                checksum_dict[file_name]["name"] = file_name
+                checksum_dict[file_name]["size"] = file_size
+                checksum_key = f"{i['checktype']}-{i['checksize']}" if i['checksize'] != 0 else i['checktype']
+                checksum_dict[file_name]["checksums"][checksum_key] = i["checksum"]
+
+            fileset["rom"] = [
+                {
+                    "name": value["name"],
+                    "size": value["size"],
+                    **value["checksums"]
+                }
+                for value in checksum_dict.values()
+            ]
 
             matched_map = find_matching_filesets(fileset, connection)
 


Commit: d97df31fe018aa704a3746738440e0123042fb92
    https://github.com/scummvm/scummvm-sites/commit/d97df31fe018aa704a3746738440e0123042fb92
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-12T20:19:41+08:00

Commit Message:
INTEGRITY: Add user data check

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index a28483b..90c80db 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -727,3 +727,52 @@ def finalize_fileset_insertion(conn, transaction_id, src, filepath, author, vers
         log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
     conn.close()
+
+def find_user_match_filesets(fileset, conn):
+    matched_map = defaultdict(list)
+    with conn.cursor() as cursor:
+        for file in fileset["files"]:
+            matched_set = set()
+            for checksum_info in file["checksums"]:
+                checksum = checksum_info["checksum"]
+                checktype = checksum_info["type"]
+                checksize, checktype, checksum = get_checksum_props(checktype, checksum)
+                query = f"""SELECT DISTINCT fs.id AS fileset_id
+                                FROM fileset fs
+                                JOIN file f ON fs.id = f.fileset
+                                JOIN filechecksum fc ON f.id = fc.file
+                                WHERE fc.checksum = '{checksum}' AND fc.checktype = '{checktype}'
+                                AND fs.status IN ('detection', 'dat', 'scan', 'partial', 'full', 'obsolete')"""
+                cursor.execute(query)
+                records = cursor.fetchall()
+                if records:
+                    for record in records:
+                        matched_set.add(record['fileset_id'])
+            for id in matched_set:
+                matched_map[id] += 1
+                        
+    print(matched_map)
+    return matched_map
+
+def user_integrity_check(data):
+    print(data)
+    src = "user"
+    source_status = src
+    try:
+        conn = db_connect()
+    except Exception as e:
+        print(f"Failed to connect to database: {e}")
+        return
+    
+    conn.cursor().execute(f"SET @fileset_time_last = {int(time.time())}")
+
+    with conn.cursor() as cursor:
+        cursor.execute("SELECT MAX(`transaction`) FROM transactions")
+        transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
+
+    category_text = f"Uploaded from {src}"
+    log_text = f"Started loading file, State {source_status}. Transaction: {transaction_id}"
+
+    user = f'cli:{getpass.getuser()}'
+
+    create_log(escape_string(category_text), user, escape_string(log_text), conn)
\ No newline at end of file
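
Note: user_integrity_check and find_user_match_filesets read
fileset["files"] and each file's "checksums" list, so the uploaded JSON
presumably looks like the following (field names inferred from the code,
not from a published schema):

    {
        "files": [
            {"name": "DATA.DAT",
             "size": 12345,
             "checksums": [{"type": "md5-5000", "checksum": "abcd1234"}]}
        ]
    }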


Commit: 95bfa16077bb4f951f7f68c495cd6dec3a4b8d10
    https://github.com/scummvm/scummvm-sites/commit/95bfa16077bb4f951f7f68c495cd6dec3a4b8d10
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-13T17:58:31+08:00

Commit Message:
INTEGRITY: Add user integrity check interface

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 2d6536d..eed148e 100644
--- a/fileset.py
+++ b/fileset.py
@@ -1,4 +1,4 @@
-from flask import Flask, request, render_template, redirect, url_for, render_template_string, jsonify
+from flask import Flask, request, render_template, redirect, url_for, render_template_string, jsonify, flash
 import pymysql.cursors
 import json
 import re
@@ -7,7 +7,7 @@ from user_fileset_functions import user_calc_key, file_json_to_array, user_inser
 from pagination import create_page
 import difflib
 from pymysql.converters import escape_string
-from db_functions import find_matching_filesets, get_all_related_filesets, convert_log_text_to_links
+from db_functions import find_matching_filesets, get_all_related_filesets, convert_log_text_to_links, user_integrity_check
 from collections import defaultdict
 
 app = Flask(__name__)
@@ -783,5 +783,47 @@ def fileset_search():
     }
     return render_template_string(create_page(filename, 25, records_table, select_query, order, filters))
 
+@app.route('/upload', methods=['GET'])
+def upload_page():
+    html = """
+    <!DOCTYPE html>
+    <html>
+    <head>
+        <title>Upload Game Integrity Check</title>
+    </head>
+    <body>
+        <h2>Upload Your Game Integrity Check (JSON)</h2>
+        <form action="/upload" method="post" enctype="multipart/form-data">
+            <input type="file" name="file" accept=".json" required>
+            <input type="submit" value="Upload">
+        </form>
+        {{ get_flashed_messages() }}
+    </body>
+    </html>
+    """
+    return render_template_string(html)
+
+@app.route('/upload', methods=['POST'])
+def upload_file():
+    if 'file' not in request.files:
+        flash('No file part')
+        return redirect(request.url)
+    
+    file = request.files['file']
+    
+    if file.filename == '':
+        flash('No selected file')
+        return redirect(request.url)
+    
+    if file and file.filename.endswith('.json'):
+        try:
+            data = json.load(file)
+            ret = user_integrity_check(data)
+            flash('File successfully uploaded and processed')
+        except Exception as e:
+            flash(f'Error processing file: {e}')
+    
+    return redirect(url_for('upload_page'))
+
 if __name__ == '__main__':
     app.run()
\ No newline at end of file
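
Note: flash() stores messages in the Flask session, so these routes will
raise RuntimeError at request time unless the app configures a secret key
somewhere, e.g.:

    app.secret_key = 'change-me'  # any non-empty value enables flash()/session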


Commit: 5191685df5f9038c9724bbf7bc2ab272ae8bc1a3
    https://github.com/scummvm/scummvm-sites/commit/5191685df5f9038c9724bbf7bc2ab272ae8bc1a3
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-15T19:55:06+08:00

Commit Message:
INTEGRITY: Fix wrong structure in matched_dict

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 90c80db..6ac91f1 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -729,7 +729,7 @@ def finalize_fileset_insertion(conn, transaction_id, src, filepath, author, vers
     conn.close()
 
 def find_user_match_filesets(fileset, conn):
-    matched_map = defaultdict(list)
+    matched_map = defaultdict(int)
     with conn.cursor() as cursor:
         for file in fileset["files"]:
             matched_set = set()


Commit: 5dc4045edda4b29fa74abf2ac45720230c72fd27
    https://github.com/scummvm/scummvm-sites/commit/5dc4045edda4b29fa74abf2ac45720230c72fd27
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-15T19:55:47+08:00

Commit Message:
INTEGRITY: Add extra_map and missing_map

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 6ac91f1..2a82125 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -766,13 +766,68 @@ def user_integrity_check(data):
     
     conn.cursor().execute(f"SET @fileset_time_last = {int(time.time())}")
 
-    with conn.cursor() as cursor:
-        cursor.execute("SELECT MAX(`transaction`) FROM transactions")
-        transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute("SELECT MAX(`transaction`) FROM transactions")
+            transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
 
-    category_text = f"Uploaded from {src}"
-    log_text = f"Started loading file, State {source_status}. Transaction: {transaction_id}"
+            category_text = f"Uploaded from {src}"
+            log_text = f"Started loading file, State {source_status}. Transaction: {transaction_id}"
 
-    user = f'cli:{getpass.getuser()}'
+            user = f'cli:{getpass.getuser()}'
 
-    create_log(escape_string(category_text), user, escape_string(log_text), conn)
\ No newline at end of file
+            create_log(escape_string(category_text), user, escape_string(log_text), conn)
+            
+            matched_map= find_user_match_filesets(data, conn)
+            
+            # show matched, missing, extra
+            extra_map = defaultdict(list)
+            missing_map = defaultdict(list)
+            
+            for fileset_id in matched_map.keys():
+                cursor.execute(f"SELECT * FROM file WHERE fileset = {fileset_id}")
+                target_files = cursor.fetchall()
+                target_files_dict = {}
+                print(f"Target files: {target_files}")
+                for target_file in target_files:
+                    cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
+                    target_checksums = cursor.fetchall()
+                    for checksum in target_checksums:
+                        target_files_dict[checksum['checksum']] = target_file
+                        target_files_dict[target_file['id']] = f"{checksum['checktype']}-{checksum['checksize']}"
+                for file in data["files"]:
+                    file_exists = False
+                    for checksum_info in file["checksums"]:
+                        checksum = checksum_info["checksum"]
+                        checktype = checksum_info["type"]
+                        checksize, checktype, checksum = get_checksum_props(checktype, checksum)
+                        if checksum in target_files_dict and not file_exists:
+                            file_exists = True
+                            target_id = target_files_dict[checksum]['id']
+                    if not file_exists:
+                        missing_map[fileset_id].append(file)
+            
+            for file in data['files']:
+                file_exists = False
+                for checksum_info in file["checksums"]:
+                    checksum = checksum_info["checksum"]
+                    checktype = checksum_info["type"]
+                    checksize, checktype, checksum = get_checksum_props(checktype, checksum)
+                    if checksum in target_files_dict and not file_exists:
+                        file_exists = True
+                if not file_exists:
+                    extra_map[fileset_id].append(file)
+    
+    except Exception as e:
+        conn.rollback()
+        print(f"Error processing user data: {e}")
+    finally:
+        category_text = f"Uploaded from {src}"
+        log_text = f"Completed loading file, State {source_status}. Transaction: {transaction_id}"
+        create_log(escape_string(category_text), user, escape_string(log_text), conn)
+        conn.close()
+        
+    return matched_map, missing_map, extra_map
+                    
+        
+            
\ No newline at end of file
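
The heart of this change is indexing every checksum of the database fileset back to its file row, so each uploaded checksum becomes a dictionary lookup. A trimmed illustration of that indexing step, with cursor rows stood in by plain dicts and placeholder checksums:

    target_files = [{"id": 1, "name": "data.dat"}]
    checksums_by_file = {1: [{"checksum": "abc123", "checktype": "md5"}]}

    target_files_dict = {}
    for target_file in target_files:
        for checksum in checksums_by_file[target_file["id"]]:
            # key every known checksum back to the file row that owns it
            target_files_dict[checksum["checksum"]] = target_file

    print("abc123" in target_files_dict)   # True -> the uploaded file is known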


Commit: c8c8c58da9c5e6308d712f7a8f05049013d83add
    https://github.com/scummvm/scummvm-sites/commit/c8c8c58da9c5e6308d712f7a8f05049013d83add
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-15T19:56:34+08:00

Commit Message:
INTEGRITY: Improve the user upload page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index eed148e..0469104 100644
--- a/fileset.py
+++ b/fileset.py
@@ -12,6 +12,8 @@ from collections import defaultdict
 
 app = Flask(__name__)
 
+secret_key = os.urandom(24)
+
 with open('mysql_config.json') as f:
     mysql_cred = json.load(f)
 
@@ -818,12 +820,50 @@ def upload_file():
     if file and file.filename.endswith('.json'):
         try:
             data = json.load(file)
-            ret = user_integrity_check(data)
+            matched_map, missing_map, extra_map = user_integrity_check(data)
             flash('File successfully uploaded and processed')
         except Exception as e:
             flash(f'Error processing file: {e}')
+        finally:
+            html = """
+            <!DOCTYPE html>
+            <html>
+            <head>
+                <link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
+            </head>
+            <body>
+            <h2>Upload Game Integrity Check</h2>
+            <body>
+                <h2>Upload Your Game Integrity Check (JSON)</h2>
+                <form action="/upload" method="post" enctype="multipart/form-data">
+                    <input type="file" name="file" accept=".json" required>
+                    <input type="submit" value="Upload">
+                </form>
+                <h2>Results</h2>
+                <h3>Matched Filesets</h3>
+                <ul>
+                {% for fileset_id, count in matched_map.items() %}
+                    <li>Fileset {{ fileset_id }}: {{ count }} matches</li>
+                {% endfor %}
+                </ul>
+                <h3>Missing Filesets</h3>
+                <ul>
+                {% for fileset_id, count in missing_map.items() %}
+                    <li>Fileset {{ fileset_id }}: {{ count }} missing</li>
+                {% endfor %}
+                </ul>
+                <h3>Extra Filesets</h3>
+                <ul>
+                {% for fileset_id, count in extra_map.items() %}
+                    <li>Fileset {{ fileset_id }}: {{ count }} extra</li>
+                {% endfor %}
+                </ul>
+            </body>
+            </html>
+            """
+        return render_template_string(html, matched_map=matched_map, missing_map=missing_map, extra_map=extra_map)
     
-    return redirect(url_for('upload_page'))
 
 if __name__ == '__main__':
-    app.run()
\ No newline at end of file
+    app.secret_key = secret_key
+    app.run(debug=True)
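
flash() stores its messages in the signed session cookie, so Flask refuses to flash anything without a secret_key; the commit generates one with os.urandom, which also means sessions reset on every restart. A self-contained illustration of the pattern:

    import os
    from flask import Flask, flash, get_flashed_messages

    app = Flask(__name__)
    app.secret_key = os.urandom(24)  # fresh key per process; old sessions invalidate

    @app.route('/demo')
    def demo():
        flash('File successfully uploaded and processed')
        # normally consumed in a template via get_flashed_messages()
        return ', '.join(get_flashed_messages())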


Commit: 2e5388dfc3690b3ba43b9e082dff5818343ff725
    https://github.com/scummvm/scummvm-sites/commit/2e5388dfc3690b3ba43b9e082dff5818343ff725
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-16T20:29:21+08:00

Commit Message:
INTEGRITY: Improve the user_integrity_check func

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 2a82125..13ae627 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -755,7 +755,6 @@ def find_user_match_filesets(fileset, conn):
     return matched_map
 
 def user_integrity_check(data):
-    print(data)
     src = "user"
     source_status = src
     try:
@@ -778,7 +777,7 @@ def user_integrity_check(data):
 
             create_log(escape_string(category_text), user, escape_string(log_text), conn)
             
-            matched_map= find_user_match_filesets(data, conn)
+            matched_map = find_user_match_filesets(data, conn)
             
             # show matched, missing, extra
             extra_map = defaultdict(list)
@@ -788,14 +787,30 @@ def user_integrity_check(data):
                 cursor.execute(f"SELECT * FROM file WHERE fileset = {fileset_id}")
                 target_files = cursor.fetchall()
                 target_files_dict = {}
-                print(f"Target files: {target_files}")
                 for target_file in target_files:
                     cursor.execute(f"SELECT * FROM filechecksum WHERE file = {target_file['id']}")
                     target_checksums = cursor.fetchall()
                     for checksum in target_checksums:
                         target_files_dict[checksum['checksum']] = target_file
-                        target_files_dict[target_file['id']] = f"{checksum['checktype']}-{checksum['checksize']}"
+                        # target_files_dict[target_file['id']] = f"{checksum['checktype']}-{checksum['checksize']}"
+                
+                # Collect all the checksums from data['files']
+                data_files_set = set()
                 for file in data["files"]:
+                    for checksum_info in file["checksums"]:
+                        checksum = checksum_info["checksum"]
+                        checktype = checksum_info["type"]
+                        checksize, checktype, checksum = get_checksum_props(checktype, checksum)
+                        data_files_set.add(checksum)
+                
+                # Identify missing files
+                for checksum, target_file in target_files_dict.items():
+                    if checksum not in data_files_set:
+                        
+                        missing_map[fileset_id].append(target_file)
+
+                # Identify extra files
+                for file in data['files']:
                     file_exists = False
                     for checksum_info in file["checksums"]:
                         checksum = checksum_info["checksum"]
@@ -803,20 +818,8 @@ def user_integrity_check(data):
                         checksize, checktype, checksum = get_checksum_props(checktype, checksum)
                         if checksum in target_files_dict and not file_exists:
                             file_exists = True
-                            target_id = target_files_dict[checksum]['id']
                     if not file_exists:
-                        missing_map[fileset_id].append(file)
-            
-            for file in data['files']:
-                file_exists = False
-                for checksum_info in file["checksums"]:
-                    checksum = checksum_info["checksum"]
-                    checktype = checksum_info["type"]
-                    checksize, checktype, checksum = get_checksum_props(checktype, checksum)
-                    if checksum in target_files_dict and not file_exists:
-                        file_exists = True
-                if not file_exists:
-                    extra_map[fileset_id].append(file)
+                        extra_map[fileset_id].append(file)
     
     except Exception as e:
         conn.rollback()
@@ -826,8 +829,6 @@ def user_integrity_check(data):
         log_text = f"Completed loading file, State {source_status}. Transaction: {transaction_id}"
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
         conn.close()
-        
+
     return matched_map, missing_map, extra_map
-                    
-        
-            
\ No newline at end of file
+
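
Conceptually the rewrite turns the nested comparison into set arithmetic: checksums known to the database but absent from the upload are "missing", and uploaded files none of whose checksums are known are "extra". The same idea in condensed form (illustrative data, not the repo's structures):

    db_checksums = {"aaa", "bbb", "ccc"}              # checksums in the DB fileset
    upload = [{"name": "x.dat", "checksums": ["aaa"]},
              {"name": "y.dat", "checksums": ["zzz"]}]

    uploaded = {c for f in upload for c in f["checksums"]}
    missing = db_checksums - uploaded                 # {'bbb', 'ccc'}
    extra = [f for f in upload
             if not any(c in db_checksums for c in f["checksums"])]
    print(missing, [f["name"] for f in extra])        # {'bbb', 'ccc'} ['y.dat']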


Commit: ee7bbb374a686a056733b69ebe2f7bbd549f9de4
    https://github.com/scummvm/scummvm-sites/commit/ee7bbb374a686a056733b69ebe2f7bbd549f9de4
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-16T20:30:53+08:00

Commit Message:
INTEGRITY: Enhance the user integrity check page rendering

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 0469104..2673c74 100644
--- a/fileset.py
+++ b/fileset.py
@@ -818,6 +818,9 @@ def upload_file():
         return redirect(request.url)
     
     if file and file.filename.endswith('.json'):
+        matched_map = {}
+        missing_map = {}
+        extra_map = {}
         try:
             data = json.load(file)
             matched_map, missing_map, extra_map = user_integrity_check(data)
@@ -832,37 +835,82 @@ def upload_file():
                 <link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
             </head>
             <body>
-            <h2>Upload Game Integrity Check</h2>
-            <body>
-                <h2>Upload Your Game Integrity Check (JSON)</h2>
+                <h2>Upload Game Integrity Check</h2>
                 <form action="/upload" method="post" enctype="multipart/form-data">
                     <input type="file" name="file" accept=".json" required>
                     <input type="submit" value="Upload">
                 </form>
                 <h2>Results</h2>
                 <h3>Matched Filesets</h3>
-                <ul>
-                {% for fileset_id, count in matched_map.items() %}
-                    <li>Fileset {{ fileset_id }}: {{ count }} matches</li>
-                {% endfor %}
-                </ul>
+                <table>
+                <thead>
+                    <tr>
+                        <th>Fileset ID</th>
+                        <th>Match Count</th>
+                    </tr>
+                </thead>
+                <tbody>
+            """
+
+            for fileset_id, count in matched_map.items():
+                html += f"""
+                <tr>
+                    <td><a href='fileset?id={fileset_id}'>{fileset_id}</a></td>
+                    <td>{count}</td>
+                </tr>
+                """
+            
+            html += """
+                </tbody>
+                </table>
                 <h3>Missing Filesets</h3>
-                <ul>
-                {% for fileset_id, count in missing_map.items() %}
-                    <li>Fileset {{ fileset_id }}: {{ count }} missing</li>
-                {% endfor %}
-                </ul>
+                <table>
+                <thead>
+                    <tr>
+                        <th>Fileset ID</th>
+                        <th>Missing Count</th>
+                    </tr>
+                </thead>
+                <tbody>
+            """
+            
+            for fileset_id, count in missing_map.items():
+                html += f"""
+                <tr>
+                    <td><a href='fileset?id={fileset_id}'>{fileset_id}</a></td>
+                    <td>{len(count)}</td>
+                </tr>
+                """
+            
+            html += """
+                </tbody>
+                </table>
                 <h3>Extra Filesets</h3>
-                <ul>
-                {% for fileset_id, count in extra_map.items() %}
-                    <li>Fileset {{ fileset_id }}: {{ count }} extra</li>
-                {% endfor %}
-                </ul>
+                <table>
+                <thead>
+                    <tr>
+                        <th>Fileset ID</th>
+                        <th>Extra Count</th>
+                    </tr>
+                </thead>
+                <tbody>
+            """
+            
+            for fileset_id, count in extra_map.items():
+                html += f"""
+                <tr>
+                    <td><a href='fileset?id={fileset_id}'>{fileset_id}</a></td>
+                    <td>{len(count)}</td>
+                </tr>
+                """
+            
+            html += """
+                </tbody>
+                </table>
             </body>
             </html>
             """
-        return render_template_string(html, matched_map=matched_map, missing_map=missing_map, extra_map=extra_map)
-    
+        return render_template_string(html)
 
 if __name__ == '__main__':
     app.secret_key = secret_key


Commit: cf57605913284a4452e919d3eaa4bca2422d8100
    https://github.com/scummvm/scummvm-sites/commit/cf57605913284a4452e919d3eaa4bca2422d8100
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-17T20:11:18+08:00

Commit Message:
INTEGRITY: Add timestamp and user_count columns

Changed paths:
    schema.py


diff --git a/schema.py b/schema.py
index ce09200..3333bb5 100644
--- a/schema.py
+++ b/schema.py
@@ -154,6 +154,18 @@ except:
    # if already exists, change the length of the column
     cursor.execute("ALTER TABLE file MODIFY COLUMN detection_type VARCHAR(20);")
 
+try:
+    cursor.execute("ALTER TABLE file ADD COLUMN `timestamp` TIMESTAMP NOT NULL;")
+except:
+    # if already exists, change the length of the column
+    cursor.execute("ALTER TABLE file MODIFY COLUMN `timestamp` TIMESTAMP NOT NULL;")
+
+try:
+    cursor.execute("ALTER TABLE fileset ADD COLUMN `user_count` INT;")
+except:
+    # if already exists, change the length of the column
+    cursor.execute("ALTER TABLE fileset MODIFY COLUMN `user_count` INT;")
+
 for index, definition in indices.items():
     try:
         cursor.execute(definition)
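
The try/except ADD-then-MODIFY dance treats any failure of ADD COLUMN as "the column already exists". A more explicit alternative, offered here only as a sketch (it assumes a pymysql DictCursor, as used elsewhere in the repo), is to consult information_schema first:

    def ensure_column(cursor, table, column, definition):
        # Hypothetical helper, not part of the repo.
        cursor.execute(
            "SELECT COUNT(*) AS n FROM information_schema.COLUMNS "
            "WHERE TABLE_SCHEMA = DATABASE() AND TABLE_NAME = %s "
            "AND COLUMN_NAME = %s", (table, column))
        if cursor.fetchone()['n'] == 0:
            cursor.execute(f"ALTER TABLE {table} ADD COLUMN {column} {definition}")

    # ensure_column(cursor, 'fileset', 'user_count', 'INT')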


Commit: d3bb6ae594e3e8b1332ccdba78bb20561d555419
    https://github.com/scummvm/scummvm-sites/commit/d3bb6ae594e3e8b1332ccdba78bb20561d555419
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-17T20:12:35+08:00

Commit Message:
INTEGRITY: Insert the current time into the file table

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 2673c74..6f1b918 100644
--- a/fileset.py
+++ b/fileset.py
@@ -151,7 +151,7 @@ def fileset():
             # Table
             html += "<table>\n"
 
-            cursor.execute(f"SELECT file.id, name, size, checksum, detection, detection_type FROM file WHERE fileset = {id}")
+            cursor.execute(f"SELECT file.id, name, size, checksum, detection, detection_type, `timestamp` FROM file WHERE fileset = {id}")
             result = cursor.fetchall()
 
             all_columns = list(result[0].keys()) if result else []
@@ -300,7 +300,7 @@ def match_fileset_route(id):
                 for value in checksum_dict.values()
             ]
 
-            matched_map = find_matching_filesets(fileset, connection)
+            matched_map = find_matching_filesets(fileset, connection, fileset['status'])
 
             html = f"""
             <!DOCTYPE html>
@@ -581,8 +581,8 @@ def execute_merge(id, source=None, target=None):
 
                 for file in source_files:
                     cursor.execute(f"""
-                    INSERT INTO file (name, size, checksum, fileset, detection)
-                    VALUES ('{escape_string(file['name']).lower()}', '{file['size']}', '{file['checksum']}', {target_id}, {file['detection']})
+                    INSERT INTO file (name, size, checksum, fileset, detection, `timestamp`)
+                    VALUES ('{escape_string(file['name']).lower()}', '{file['size']}', '{file['checksum']}', {target_id}, {file['detection']}, NOW())
                     """)
 
                     cursor.execute("SELECT LAST_INSERT_ID() as file_id")
@@ -632,8 +632,8 @@ def execute_merge(id, source=None, target=None):
                             file_exists = True
                             break
                     print(file_exists)
-                    cursor.execute("INSERT INTO file (name, size, checksum, fileset, detection) VALUES (%s, %s, %s, %s, %s)",
-                                   (source_file['name'], source_file['size'], source_file['checksum'], target_id, source_file['detection']))
+                    cursor.execute(f"""INSERT INTO file (name, size, checksum, fileset, detection, `timestamp`) VALUES (
+                        '{source_file['name']}', '{source_file['size']}', '{source_file['checksum']}', {target_id}, {source_file['detection']}, NOW())""")
                     new_file_id = cursor.lastrowid
                     for checksum in source_checksums:
                         # TODO: Handle the string
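
Worth noting: the second hunk trades a parameterized execute() for an f-string, interpolating values straight into the SQL. A parameterized form that still records the new `timestamp` column would look roughly like this (a sketch, not the committed code):

    cursor.execute(
        "INSERT INTO file (name, size, checksum, fileset, detection, `timestamp`) "
        "VALUES (%s, %s, %s, %s, %s, NOW())",
        (source_file['name'], source_file['size'], source_file['checksum'],
         target_id, source_file['detection']))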


Commit: e6c285484f892ca8d309efcfc271fa12b9d12bea
    https://github.com/scummvm/scummvm-sites/commit/e6c285484f892ca8d309efcfc271fa12b9d12bea
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-17T20:14:16+08:00

Commit Message:
INTEGRITY: Handle different scenarios of user uploads

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 13ae627..12f815b 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -146,7 +146,7 @@ def insert_file(file, detection, src, conn):
 
     if not detection:
         checktype = "None"
-    query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{checktype}-{checksize}')"
+    query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{checktype}-{checksize}', NOW())"
     with conn.cursor() as cursor:
         cursor.execute(query)
 
@@ -561,7 +561,7 @@ def process_fileset(fileset, resources, detection, src, conn, transaction_id, fi
     megakey = calc_megakey(fileset) if detection else ""
     log_text = f"size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}."
     if src != "dat":
-        matched_map = find_matching_filesets(fileset, conn)
+        matched_map = find_matching_filesets(fileset, conn, src)
     else:
         matched_map = matching_set(fileset, conn)
 
@@ -580,8 +580,12 @@ def insert_game_data(fileset, conn):
     lang = fileset["language"]
     insert_game(engine_name, engineid, title, gameid, extra, platform, lang, conn)
 
-def find_matching_filesets(fileset, conn):
+def find_matching_filesets(fileset, conn, status):
     matched_map = defaultdict(int)
+    if status != "user":
+        state = """'detection', 'dat', 'scan', 'partial', 'full', 'obsolete'"""
+    else:
+        state = """'user', 'partial', 'full'"""
     with conn.cursor() as cursor:
         for file in fileset["rom"]:
             matched_set = set()
@@ -595,7 +599,7 @@ def find_matching_filesets(fileset, conn):
                                 JOIN file f ON fs.id = f.fileset
                                 JOIN filechecksum fc ON f.id = fc.file
                                 WHERE fc.checksum = '{checksum}' AND fc.checktype = '{checktype}'
-                                AND fs.status IN ('detection', 'dat', 'scan', 'partial', 'full', 'obsolete')"""
+                                AND fs.status IN ({state})"""
                     cursor.execute(query)
                     records = cursor.fetchall()
                     if records:
@@ -647,7 +651,7 @@ def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, meg
             if status in ['detection', 'obsolete'] and count == matched_count:
                 is_full_matched = True
                 update_fileset_status(cursor, matched_fileset_id, 'full' if src != "dat" else "partial")
-                insert_files(fileset, matched_fileset_id, conn, detection)
+                populate_file(fileset, matched_fileset_id, conn, detection)
                 log_matched_fileset(src, matched_fileset_id, 'full' if src != "dat" else "partial", user, conn)
             elif status == 'full' and len(fileset['rom']) == count:
                 is_full_matched == True
@@ -655,7 +659,7 @@ def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, meg
                 return
             elif status == 'partial' and count == matched_count:
                 update_fileset_status(cursor, matched_fileset_id, 'full')
-                insert_files(fileset, matched_fileset_id, conn, detection)
+                populate_file(fileset, matched_fileset_id, conn, detection)
                 log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
             elif status == 'scan' and count == matched_count:
                 log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
@@ -673,7 +677,7 @@ def update_fileset_status(cursor, fileset_id, status):
         WHERE id = {fileset_id}
     """)
 
-def insert_files(fileset, fileset_id, conn, detection):
+def populate_file(fileset, fileset_id, conn, detection):
     with conn.cursor() as cursor:
         cursor.execute(f"SELECT * FROM file WHERE fileset = {fileset_id}")
         target_files = cursor.fetchall()
@@ -686,7 +690,7 @@ def insert_files(fileset, fileset_id, conn, detection):
                 target_files_dict[target_file['id']] = f"{checksum['checktype']}-{checksum['checksize']}"
         for file in fileset['rom']:
             file_exists = False
-            cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['md5']}', {fileset_id}, {0})")
+            cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['md5']}', {fileset_id}, {0}, NOW())")
             cursor.execute("SET @file_last = LAST_INSERT_ID()")
             cursor.execute("SELECT @file_last AS file_id")
             file_id = cursor.fetchone()['file_id']
@@ -728,35 +732,25 @@ def finalize_fileset_insertion(conn, transaction_id, src, filepath, author, vers
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
     conn.close()
 
-def find_user_match_filesets(fileset, conn):
-    matched_map = defaultdict(int)
-    with conn.cursor() as cursor:
-        for file in fileset["files"]:
-            matched_set = set()
-            for checksum_info in file["checksums"]:
-                checksum = checksum_info["checksum"]
-                checktype = checksum_info["type"]
-                checksize, checktype, checksum = get_checksum_props(checktype, checksum)
-                query = f"""SELECT DISTINCT fs.id AS fileset_id
-                                FROM fileset fs
-                                JOIN file f ON fs.id = f.fileset
-                                JOIN filechecksum fc ON f.id = fc.file
-                                WHERE fc.checksum = '{checksum}' AND fc.checktype = '{checktype}'
-                                AND fs.status IN ('detection', 'dat', 'scan', 'partial', 'full', 'obsolete')"""
-                cursor.execute(query)
-                records = cursor.fetchall()
-                if records:
-                    for record in records:
-                        matched_set.add(record['fileset_id'])
-            for id in matched_set:
-                matched_map[id] += 1
-                        
-    print(matched_map)
-    return matched_map
-
 def user_integrity_check(data):
     src = "user"
     source_status = src
+    new_files = []
+
+    for file in data["files"]:
+        new_file = {
+            "name": file["name"],
+            "size": file["size"]
+        }
+        for checksum in file["checksums"]:
+            checksum_type = checksum["type"]
+            checksum_value = checksum["checksum"]
+            new_file[checksum_type] = checksum_value
+
+        new_files.append(new_file)
+
+    data["rom"] = new_files
+    key = calc_key(data)
     try:
         conn = db_connect()
     except Exception as e:
@@ -777,7 +771,7 @@ def user_integrity_check(data):
 
             create_log(escape_string(category_text), user, escape_string(log_text), conn)
             
-            matched_map = find_user_match_filesets(data, conn)
+            matched_map = find_matching_filesets(data, conn, src)
             
             # show matched, missing, extra
             extra_map = defaultdict(list)
@@ -820,7 +814,26 @@ def user_integrity_check(data):
                             file_exists = True
                     if not file_exists:
                         extra_map[fileset_id].append(file)
-    
+            
+            # handle different scenarios
+            matched_list = sorted(matched_map.items(), key=lambda x: x[1], reverse=True)
+            most_matched = matched_list[0] 
+            matched_fileset_id, matched_count = most_matched[0], most_matched[1]   
+            cursor.execute(f"SELECT status FROM fileset WHERE id = {matched_fileset_id}")
+            status = cursor.fetchone()['status']
+
+            cursor.execute(f"SELECT COUNT(file.id) FROM file WHERE fileset = {matched_fileset_id}")
+            count = cursor.fetchone()['COUNT(file.id)']
+            if status == "full" and count == matched_count:
+                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+            elif status == "partial" and count == matched_count:
+                populate_file(data, matched_fileset_id, conn, None, src)
+                log_matched_fileset(src, matched_fileset_id, 'partial', user, conn)
+            elif status == "user" and count == matched_count:
+                pass
+            else:
+                insert_new_fileset(data, conn, None, src, key, None, transaction_id, log_text, user)
+            finalize_fileset_insertion(conn, transaction_id, src, None, user, 0, source_status, user)
     except Exception as e:
         conn.rollback()
         print(f"Error processing user data: {e}")
@@ -829,6 +842,4 @@ def user_integrity_check(data):
         log_text = f"Completed loading file, State {source_status}. Transaction: {transaction_id}"
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
         conn.close()
-
-    return matched_map, missing_map, extra_map
-
+    return matched_map, missing_map, extra_map
\ No newline at end of file
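
The new tail of user_integrity_check takes the fileset with the most checksum hits and branches on its status. Stripped of the database plumbing, the decision logic amounts to something like the following sketch, with stubbed return values standing in for the real side effects:

    def dispatch(status, count, matched_count):
        if count != matched_count:
            return "insert_new_fileset"    # no fileset matched every file
        if status == "full":
            return "log_matched_fileset"   # already complete; log only
        if status == "partial":
            return "populate_file"         # fill in the user-supplied files
        if status == "user":
            return "duplicate_user_upload" # another user sent the same set
        return "insert_new_fileset"

    print(dispatch("partial", 10, 10))     # -> populate_file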


Commit: 1b4b65e1b2552f68df3d277fbd7ac2172924d005
    https://github.com/scummvm/scummvm-sites/commit/1b4b65e1b2552f68df3d277fbd7ac2172924d005
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-18T20:02:22+08:00

Commit Message:
INTEGRITY: Add user_count when matching with 'user'

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 12f815b..0e7ef9b 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -830,7 +830,8 @@ def user_integrity_check(data):
                 populate_file(data, matched_fileset_id, conn, None, src)
                 log_matched_fileset(src, matched_fileset_id, 'partial', user, conn)
             elif status == "user" and count == matched_count:
-                pass
+                add_usercount(matched_fileset_id, conn)
+                log_matched_fileset(src, matched_fileset_id, 'user', user, conn)
             else:
                 insert_new_fileset(data, conn, None, src, key, None, transaction_id, log_text, user)
             finalize_fileset_insertion(conn, transaction_id, src, None, user, 0, source_status, user)
@@ -842,4 +843,8 @@ def user_integrity_check(data):
         log_text = f"Completed loading file, State {source_status}. Transaction: {transaction_id}"
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
         conn.close()
-    return matched_map, missing_map, extra_map
\ No newline at end of file
+    return matched_map, missing_map, extra_map
+
+def add_usercount(fileset, conn):
+    with conn.cursor() as cursor:
+        cursor.execute(f"UPDATE fileset SET user_count = COALESCE(user_count, 0) + 1 WHERE id = {fileset}")
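
COALESCE(user_count, 0) + 1 makes the increment NULL-safe, so the first duplicate upload sets the new column to 1 rather than leaving it NULL. A parameterized version of the same statement (a sketch; the commit interpolates the id directly):

    def add_usercount(fileset_id, conn):
        # NULL-safe increment: COALESCE turns an unset counter into 0 first.
        with conn.cursor() as cursor:
            cursor.execute(
                "UPDATE fileset SET user_count = COALESCE(user_count, 0) + 1 "
                "WHERE id = %s", (fileset_id,))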


Commit: 641a78cc705bdafbd7c6d3d94c95d7cca3f4e817
    https://github.com/scummvm/scummvm-sites/commit/641a78cc705bdafbd7c6d3d94c95d7cca3f4e817
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-22T19:30:26+08:00

Commit Message:
INTEGRITY: Implementation of validate page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 6f1b918..233e541 100644
--- a/fileset.py
+++ b/fileset.py
@@ -688,7 +688,83 @@ def validate():
 
         fileset_id = user_insert_fileset(json_object['files'], ip, conn)
         json_response['fileset'] = fileset_id
-        # TODO: handle database operations
+
+        return jsonify(json_response)
+
+    conn = db_connect()
+
+    query = f"""
+        SELECT game.id FROM game
+        JOIN engine ON game.engine = engine.id
+        WHERE gameid = '{game_metadata['gameid']}'
+        AND engineid = '{game_metadata['engineid']}'
+        AND platform = '{game_metadata['platform']}'
+        AND language = '{game_metadata['language']}'
+    """
+    games = conn.execute(query).fetchall()  
+
+    if not games:
+        json_response['error'] = error_codes['unknown']
+        del json_response['files']
+        json_response['status'] = 'unknown_variant'
+
+        fileset_id = user_insert_fileset(json_object['files'], ip, conn)
+        json_response['fileset'] = fileset_id
+
+        return jsonify(json_response)
+
+    for game in games:
+        fileset_query = f"""
+            SELECT file.id, name, size FROM file
+            JOIN fileset ON fileset.id = file.fileset
+            WHERE fileset.game = {game['id']} AND
+            (status = 'fullmatch' OR status = 'partialmatch' OR status = 'detection')
+        """
+        fileset = conn.execute(fileset_query).fetchall()  
+
+        if not fileset:
+            continue
+
+        fileset = [dict(row) for row in fileset]
+
+        file_object = json_object['files']
+        file_object.sort(key=lambda x: x['name'].lower())
+        fileset.sort(key=lambda x: x['name'].lower())
+
+        for i in range(min(len(fileset), len(file_object))):
+            status = 'ok'
+            db_file = fileset[i]
+            user_file = file_object[i]
+            filename = user_file['name'].lower()
+
+            if db_file['name'].lower() != filename:
+                if db_file['name'].lower() > filename:
+                    status = 'unknown_file'
+                else:
+                    status = 'missing'
+                    i -= 1  
+
+            elif db_file['size'] != user_file['size'] and status == 'ok':
+                status = 'size_mismatch'
+
+            if status == 'ok':
+                for checksum_data in user_file['checksums']:
+                    user_checkcode = checksum_data['type']
+                    
+                    if user_checkcode in db_file:
+                        user_checksum = checksum_data['checksum']
+                        if db_file[user_checkcode] != user_checksum:
+                            status = 'checksum_mismatch'
+                            break
+
+            if status != 'ok':
+                json_response['error'] = 1
+                fileset_id = user_insert_fileset(json_object['files'], ip, conn)
+                json_response['fileset'] = fileset_id
+
+            json_response['files'].append({'status': status, 'name': filename})
+
+        break
 
         return jsonify(json_response)
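
Taken together, the endpoint walks the sorted database and user file lists in lockstep and labels each user file. A response for a fileset with one mismatched file might look roughly like this (illustrative shape only; the error code and ids are placeholders):

    json_response = {
        "error": 1,
        "fileset": 42,
        "files": [
            {"status": "ok",                "name": "data.dat"},
            {"status": "checksum_mismatch", "name": "intro.bin"},
        ],
    }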
     


Commit: c2423eb613abab2fe7d5af9ad907b9e581c1b308
    https://github.com/scummvm/scummvm-sites/commit/c2423eb613abab2fe7d5af9ad907b9e581c1b308
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-23T20:23:36+08:00

Commit Message:
INTEGRITY: Change the relative path of mysql config

Changed paths:
    clear.py
    db_functions.py
    fileset.py
    pagination.py
    schema.py


diff --git a/clear.py b/clear.py
index 34b5c6f..707914c 100644
--- a/clear.py
+++ b/clear.py
@@ -5,6 +5,7 @@ Using it when testing the data insertion.
 
 import pymysql
 import json
+import os
 
 def truncate_all_tables(conn):
     tables = ["filechecksum", "queue", "history", "transactions", "file", "fileset", "game", "engine", "log"]
@@ -24,7 +25,9 @@ def truncate_all_tables(conn):
     cursor.execute("SET FOREIGN_KEY_CHECKS = 1")
 
 if __name__ == "__main__":
-    with open(__file__ + '/../mysql_config.json') as f:
+    base_dir = os.path.dirname(os.path.abspath(__file__))
+    config_path = os.path.join(base_dir, 'mysql_config.json')
+    with open(config_path) as f:
         mysql_cred = json.load(f)
 
     servername = mysql_cred["servername"]
diff --git a/db_functions.py b/db_functions.py
index 0e7ef9b..cfcecea 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -10,7 +10,9 @@ from collections import defaultdict
 import re
 
 def db_connect():
-    with open('mysql_config.json') as f:
+    base_dir = os.path.dirname(os.path.abspath(__file__))
+    config_path = os.path.join(base_dir, 'mysql_config.json')
+    with open(config_path) as f:
         mysql_cred = json.load(f)
     
     conn = pymysql.connect(
diff --git a/fileset.py b/fileset.py
index 233e541..bdc90ae 100644
--- a/fileset.py
+++ b/fileset.py
@@ -7,14 +7,16 @@ from user_fileset_functions import user_calc_key, file_json_to_array, user_inser
 from pagination import create_page
 import difflib
 from pymysql.converters import escape_string
-from db_functions import find_matching_filesets, get_all_related_filesets, convert_log_text_to_links, user_integrity_check
+from db_functions import find_matching_filesets, get_all_related_filesets, convert_log_text_to_links, user_integrity_check, db_connect
 from collections import defaultdict
 
 app = Flask(__name__)
 
 secret_key = os.urandom(24)
 
-with open('mysql_config.json') as f:
+base_dir = os.path.dirname(os.path.abspath(__file__))
+config_path = os.path.join(base_dir, 'mysql_config.json')
+with open(config_path) as f:
     mysql_cred = json.load(f)
 
 conn = pymysql.connect(
@@ -58,7 +60,9 @@ def fileset():
     id = request.args.get('id', default=1, type=int)
     widetable = request.args.get('widetable', default='false', type=str)
     # Load MySQL credentials from a JSON file
-    with open('mysql_config.json') as f:
+    base_dir = os.path.dirname(os.path.abspath(__file__))
+    config_path = os.path.join(base_dir, 'mysql_config.json')
+    with open(config_path) as f:
         mysql_cred = json.load(f)
 
     # Create a connection to the MySQL server
@@ -255,7 +259,9 @@ def fileset():
 
 @app.route('/fileset/<int:id>/match', methods=['GET'])
 def match_fileset_route(id):
-    with open('mysql_config.json') as f:
+    base_dir = os.path.dirname(os.path.abspath(__file__))
+    config_path = os.path.join(base_dir, 'mysql_config.json')
+    with open(config_path) as f:
         mysql_cred = json.load(f)
 
     connection = pymysql.connect(host=mysql_cred["servername"],
@@ -351,7 +357,9 @@ def merge_fileset(id):
     if request.method == 'POST':
         search_query = request.form['search']
         
-        with open('mysql_config.json') as f:
+        base_dir = os.path.dirname(os.path.abspath(__file__))
+        config_path = os.path.join(base_dir, 'mysql_config.json')
+        with open(config_path) as f:
             mysql_cred = json.load(f)
 
         connection = pymysql.connect(
@@ -436,7 +444,9 @@ def merge_fileset(id):
 def confirm_merge(id):
     target_id = request.args.get('target_id', type=int) if request.method == 'GET' else request.form.get('target_id')
 
-    with open('mysql_config.json') as f:
+    base_dir = os.path.dirname(os.path.abspath(__file__))
+    config_path = os.path.join(base_dir, 'mysql_config.json')
+    with open(config_path) as f:
         mysql_cred = json.load(f)
 
     connection = pymysql.connect(
@@ -544,7 +554,9 @@ def execute_merge(id, source=None, target=None):
     source_id = request.form['source_id'] if not source else source
     target_id = request.form['target_id'] if not target else target
 
-    with open('mysql_config.json') as f:
+    base_dir = os.path.dirname(os.path.abspath(__file__))
+    config_path = os.path.join(base_dir, 'mysql_config.json')
+    with open(config_path) as f:
         mysql_cred = json.load(f)
 
     connection = pymysql.connect(
@@ -766,7 +778,7 @@ def validate():
 
         break
 
-        return jsonify(json_response)
+    return jsonify(json_response)
     
 @app.route('/user_games_list')
 def user_games_list():
diff --git a/pagination.py b/pagination.py
index c6b38d2..1310bbe 100644
--- a/pagination.py
+++ b/pagination.py
@@ -19,7 +19,9 @@ def get_join_columns(table1, table2, mapping):
     return "No primary-foreign key mapping provided. Filter is invalid"
 
 def create_page(filename, results_per_page, records_table, select_query, order, filters={}, mapping={}):
-    with open(os.path.join(os.path.dirname(__file__), 'mysql_config.json')) as f:
+    base_dir = os.path.dirname(os.path.abspath(__file__))
+    config_path = os.path.join(base_dir, 'mysql_config.json')
+    with open(config_path) as f:
         mysql_cred = json.load(f)
     
     conn = pymysql.connect(
diff --git a/schema.py b/schema.py
index 3333bb5..a94b81e 100644
--- a/schema.py
+++ b/schema.py
@@ -3,9 +3,12 @@ import pymysql
 import random
 import string
 from datetime import datetime
+import os
 
 # Load MySQL credentials
-with open(__file__ + '/../mysql_config.json') as f:
+base_dir = os.path.dirname(os.path.abspath(__file__))
+config_path = os.path.join(base_dir, 'mysql_config.json')
+with open(config_path) as f:
     mysql_cred = json.load(f)
 
 servername = mysql_cred["servername"]
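
Resolving mysql_config.json against the script's own directory makes the tools independent of the caller's working directory, which the old `open(__file__ + '/../mysql_config.json')` pattern was not. For reference, the equivalent pathlib spelling of the adopted idiom (an alternative sketch, not what the commit uses):

    import json
    from pathlib import Path

    # Same effect as the os.path version the commit adopts.
    config_path = Path(__file__).resolve().parent / 'mysql_config.json'
    mysql_cred = json.loads(config_path.read_text())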


Commit: 056a04bd72ede498ae5f5250c4df500329b9c5e7
    https://github.com/scummvm/scummvm-sites/commit/056a04bd72ede498ae5f5250c4df500329b9c5e7
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-23T20:25:34+08:00

Commit Message:
INTEGRITY: Change the parameter passing to user_insert_fileset

Changed paths:
    db_functions.py
    fileset.py
    user_fileset_functions.py


diff --git a/db_functions.py b/db_functions.py
index cfcecea..d89e57c 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -251,9 +251,14 @@ def calc_key(fileset):
 
 def calc_megakey(fileset):
     key_string = f":{fileset['platform']}:{fileset['language']}"
-    for file in fileset['rom']:
-        for key, value in file.items():
-            key_string += ':' + str(value)
+    if 'rom' in fileset.keys():
+        for file in fileset['rom']:
+            for key, value in file.items():
+                key_string += ':' + str(value)
+    elif 'files' in fileset.keys():
+        for file in fileset['files']:
+            for key, value in file.items():
+                key_string += ':' + str(value)
 
     key_string = key_string.strip(':')
     return hashlib.md5(key_string.encode()).hexdigest()
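
After this change calc_megakey accepts both the DAT-style 'rom' layout and the user-upload 'files' layout. Condensed to its essentials (and collapsing the two branches into one get() chain), the derivation is:

    import hashlib

    def calc_megakey(fileset):
        key_string = f":{fileset['platform']}:{fileset['language']}"
        for file in fileset.get('rom', fileset.get('files', [])):
            for value in file.values():        # same order as file.items()
                key_string += ':' + str(value)
        return hashlib.md5(key_string.strip(':').encode()).hexdigest()

    print(calc_megakey({'platform': 'dos', 'language': 'en',
                        'files': [{'name': 'data.dat', 'size': 10}]}))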
diff --git a/fileset.py b/fileset.py
index bdc90ae..1ba2d1e 100644
--- a/fileset.py
+++ b/fileset.py
@@ -702,8 +702,12 @@ def validate():
         json_response['fileset'] = fileset_id
 
         return jsonify(json_response)
-
-    conn = db_connect()
+    try:
+        conn = db_connect()
+        print("Database connection successful")
+    except Exception as e:
+        print(f"Error connecting to database: {e}")
+        return jsonify({'error': 'Database connection failed'}), 500
 
     query = f"""
         SELECT game.id FROM game
@@ -713,71 +717,80 @@ def validate():
         AND platform = '{game_metadata['platform']}'
         AND language = '{game_metadata['language']}'
     """
-    games = conn.execute(query).fetchall()  
-
-    if not games:
-        json_response['error'] = error_codes['unknown']
-        del json_response['files']
-        json_response['status'] = 'unknown_variant'
-
-        fileset_id = user_insert_fileset(json_object['files'], ip, conn)
-        json_response['fileset'] = fileset_id
-
-        return jsonify(json_response)
-
-    for game in games:
-        fileset_query = f"""
-            SELECT file.id, name, size FROM file
-            JOIN fileset ON fileset.id = file.fileset
-            WHERE fileset.game = {game['id']} AND
-            (status = 'fullmatch' OR status = 'partialmatch' OR status = 'detection')
-        """
-        fileset = conn.execute(fileset_query).fetchall()  
-
-        if not fileset:
-            continue
-
-        fileset = [dict(row) for row in fileset]
-
-        file_object = json_object['files']
-        file_object.sort(key=lambda x: x['name'].lower())
-        fileset.sort(key=lambda x: x['name'].lower())
-
-        for i in range(min(len(fileset), len(file_object))):
-            status = 'ok'
-            db_file = fileset[i]
-            user_file = file_object[i]
-            filename = user_file['name'].lower()
-
-            if db_file['name'].lower() != filename:
-                if db_file['name'].lower() > filename:
-                    status = 'unknown_file'
-                else:
-                    status = 'missing'
-                    i -= 1  
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute(query)
+            games = cursor.fetchall()  
 
-            elif db_file['size'] != user_file['size'] and status == 'ok':
-                status = 'size_mismatch'
+            if not games:
+                json_response['error'] = error_codes['unknown']
+                del json_response['files']
+                json_response['status'] = 'unknown_variant'
 
-            if status == 'ok':
-                for checksum_data in user_file['checksums']:
-                    user_checkcode = checksum_data['type']
-                    
-                    if user_checkcode in db_file:
-                        user_checksum = checksum_data['checksum']
-                        if db_file[user_checkcode] != user_checksum:
-                            status = 'checksum_mismatch'
-                            break
-
-            if status != 'ok':
-                json_response['error'] = 1
                 fileset_id = user_insert_fileset(json_object['files'], ip, conn)
                 json_response['fileset'] = fileset_id
 
-            json_response['files'].append({'status': status, 'name': filename})
+                return jsonify(json_response)
+            # print(games)
+            for game in games:
+                fileset_query = f"""
+                    SELECT file.id, name, size FROM file
+                    JOIN fileset ON fileset.id = file.fileset
+                    WHERE fileset.game = {game['id']} AND
+                    (status = 'full' OR status = 'detection' OR status = 'partial')
+                """
+                cursor.execute(fileset_query)
+                fileset = cursor.fetchall()  
 
-        break
+                if not fileset:
+                    continue
 
+                fileset = [dict(row) for row in fileset]
+
+                file_object = json_object['files']
+                file_object.sort(key=lambda x: x['name'].lower())
+                fileset.sort(key=lambda x: x['name'].lower())
+                # print(file_object)
+                for i in range(min(len(fileset), len(file_object))):
+                    status = 'ok'
+                    db_file = fileset[i]
+                    user_file = file_object[i]
+                    filename = user_file['name'].lower()
+
+                    if db_file['name'].lower() != filename:
+                        if db_file['name'].lower() > filename:
+                            status = 'unknown_file'
+                        else:
+                            status = 'missing'
+                            i -= 1  
+
+                    elif db_file['size'] != user_file['size'] and status == 'ok':
+                        status = 'size_mismatch'
+
+                    if status == 'ok':
+                        for checksum_data in user_file['checksums']:
+                            user_checkcode = checksum_data['type']
+                            
+                            if user_checkcode in db_file:
+                                user_checksum = checksum_data['checksum']
+                                if db_file[user_checkcode] != user_checksum:
+                                    status = 'checksum_mismatch'
+                                    break
+
+                    if status != 'ok':
+                        json_response['error'] = 1
+                        fileset_id = user_insert_fileset(json_object['files'], ip, conn)
+                        json_response['fileset'] = fileset_id
+
+                    json_response['files'].append({'status': status, 'name': filename})
+
+                break
+    except Exception as e:
+        print(f"Error executing query: {e}")
+        return jsonify({'error': 'Query execution failed'}), 500
+    finally:
+        conn.close()
+    # print(json_response)
     return jsonify(json_response)
     
 @app.route('/user_games_list')
@@ -1002,4 +1015,4 @@ def upload_file():
 
 if __name__ == '__main__':
     app.secret_key = secret_key
-    app.run(debug=True)
+    app.run(debug=True, host='0.0.0.0')
diff --git a/user_fileset_functions.py b/user_fileset_functions.py
index 871c6f5..2d11d43 100644
--- a/user_fileset_functions.py
+++ b/user_fileset_functions.py
@@ -44,7 +44,7 @@ def user_insert_fileset(user_fileset, ip, conn):
         log_text = "from user submitted files"
         cursor.execute("SET @fileset_time_last = %s", (int(time.time()),))
         if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, ip):
-            for file in user_fileset:
+            for file in user_fileset['files']:
                 file = file_json_to_array(file)
                 insert_file(file, detection, src, conn)
                 for key, value in file.items():


Commit: a0528229163f604a007f48f532201e8bcff12757
    https://github.com/scummvm/scummvm-sites/commit/a0528229163f604a007f48f532201e8bcff12757
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-24T19:41:40+08:00

Commit Message:
INTEGRITY: Add user_count when handling duplicate inserts

Changed paths:
    db_functions.py
    fileset.py
    user_fileset_functions.py


diff --git a/db_functions.py b/db_functions.py
index d89e57c..6b42fad 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -98,7 +98,10 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
             cursor.execute(f"DELETE FROM file WHERE fileset = {existing_entry}")
             cursor.execute(f"UPDATE fileset SET `timestamp` = FROM_UNIXTIME(@fileset_time_last) WHERE id = {existing_entry}")
             cursor.execute(f"UPDATE fileset SET status = 'detection' WHERE id = {existing_entry} AND status = 'obsolete'")
-
+            cursor.execute(f"SELECT status FROM fileset WHERE id = {existing_entry}")
+            status = cursor.fetchone()['status']
+        if status == 'user':
+            add_usercount(existing_entry, conn)
         category_text = f"Updated Fileset:{existing_entry}"
         log_text = f"Updated Fileset:{existing_entry}, {log_text}"
         user = f'cli:{getpass.getuser()}' if username is None else username
diff --git a/fileset.py b/fileset.py
index 1ba2d1e..650da8a 100644
--- a/fileset.py
+++ b/fileset.py
@@ -783,7 +783,7 @@ def validate():
                         json_response['fileset'] = fileset_id
 
                     json_response['files'].append({'status': status, 'name': filename})
-
+                    
                 break
     except Exception as e:
         print(f"Error executing query: {e}")
diff --git a/user_fileset_functions.py b/user_fileset_functions.py
index 2d11d43..a7ef1de 100644
--- a/user_fileset_functions.py
+++ b/user_fileset_functions.py
@@ -35,7 +35,7 @@ def user_insert_queue(user_fileset, conn):
 
 def user_insert_fileset(user_fileset, ip, conn):
     src = 'user'
-    detection = False
+    detection = True
     key = ''
     megakey = calc_megakey(user_fileset)
     with conn.cursor() as cursor:
@@ -83,7 +83,7 @@ def match_and_merge_user_filesets(id):
 
         matched_game = matching_games[0]
 
-        status = 'fullmatch'
+        status = 'full'
 
         # Convert NULL values to string with value NULL for printing
         matched_game = {k: 'NULL' if v is None else v for k, v in matched_game.items()}
@@ -138,7 +138,7 @@ def match_and_merge_user_filesets(id):
         if len(matching_games) != 1:
             continue
         matched_game = matching_games[0]
-        status = 'fullmatch'
+        status = 'full'
         matched_game = {k: ("NULL" if v is None else v) for k, v in matched_game.items()}
         category_text = f"Matched from {fileset[0]['src']}"
         log_text = f"Matched game {matched_game['engineid']}: {matched_game['gameid']}-{matched_game['platform']}-{matched_game['language']} variant {matched_game['key']}. State {status}. Fileset:{fileset[0]['id']}."


Commit: ee42693336afbc19520367694edd50fc4a1b5ef7
    https://github.com/scummvm/scummvm-sites/commit/ee42693336afbc19520367694edd50fc4a1b5ef7
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-26T16:52:10+08:00

Commit Message:
INTEGRITY: Refactor the logic for user integrity check

Changed paths:
    db_functions.py
    fileset.py
    user_fileset_functions.py


diff --git a/db_functions.py b/db_functions.py
index 6b42fad..d711b3d 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -141,6 +141,7 @@ def insert_file(file, detection, src, conn):
     # Find full md5, or else use first checksum value
     checksum = ""
     checksize = 5000
+    checktype = "None"
     if "md5" in file:
         checksum = file["md5"]
     else:
@@ -667,7 +668,7 @@ def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, meg
                 is_full_matched == True
                 log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
                 return
-            elif status == 'partial' and count == matched_count:
+            elif (status == 'partial' or status == 'dat') and count == matched_count:
                 update_fileset_status(cursor, matched_fileset_id, 'full')
                 populate_file(fileset, matched_fileset_id, conn, detection)
                 log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
diff --git a/fileset.py b/fileset.py
index 650da8a..5979f2e 100644
--- a/fileset.py
+++ b/fileset.py
@@ -123,8 +123,8 @@ def fileset():
                 cursor.execute(f"SELECT game.name as 'game name', engineid, gameid, extra, platform, language FROM fileset JOIN game ON game.id = fileset.game JOIN engine ON engine.id = game.engine WHERE fileset.id = {id}")
                 result = {**result, **cursor.fetchone()}
             else:
-                result.pop('key', None)
-                result.pop('status', None)
+                # result.pop('key', None)
+                # result.pop('status', None)
                 result.pop('delete', None)
 
             for column in result.keys():
@@ -697,100 +697,40 @@ def validate():
         json_response['error'] = error_codes['no_metadata']
         del json_response['files']
         json_response['status'] = 'no_metadata'
-
-        fileset_id = user_insert_fileset(json_object['files'], ip, conn)
+        
+        fileset_id = user_insert_fileset(json_object, ip, conn)
         json_response['fileset'] = fileset_id
+        print(f"Response: {json_response}")
+        return jsonify(json_response)
+
+    matched_map = {}
+    missing_map = {}
+    extra_map = {}
 
+    file_object = json_object['files']
+    if not file_object:
+        json_response['error'] = error_codes['empty']
+        json_response['status'] = 'empty_fileset'
         return jsonify(json_response)
+
     try:
-        conn = db_connect()
-        print("Database connection successful")
+        matched_map, missing_map, extra_map = user_integrity_check(json_object)
     except Exception as e:
-        print(f"Error connecting to database: {e}")
-        return jsonify({'error': 'Database connection failed'}), 500
-
-    query = f"""
-        SELECT game.id FROM game
-        JOIN engine ON game.engine = engine.id
-        WHERE gameid = '{game_metadata['gameid']}'
-        AND engineid = '{game_metadata['engineid']}'
-        AND platform = '{game_metadata['platform']}'
-        AND language = '{game_metadata['language']}'
-    """
-    try:
-        with conn.cursor() as cursor:
-            cursor.execute(query)
-            games = cursor.fetchall()  
-
-            if not games:
-                json_response['error'] = error_codes['unknown']
-                del json_response['files']
-                json_response['status'] = 'unknown_variant'
-
-                fileset_id = user_insert_fileset(json_object['files'], ip, conn)
-                json_response['fileset'] = fileset_id
-
-                return jsonify(json_response)
-            # print(games)
-            for game in games:
-                fileset_query = f"""
-                    SELECT file.id, name, size FROM file
-                    JOIN fileset ON fileset.id = file.fileset
-                    WHERE fileset.game = {game['id']} AND
-                    (status = 'full' OR status = 'detection' OR status = 'partial')
-                """
-                cursor.execute(fileset_query)
-                fileset = cursor.fetchall()  
+        json_response['error'] = 1
+        json_response['status'] = 'processing_error'
+        json_response['message'] = str(e)
+        print(f"Response: {json_response}")
+        return jsonify(json_response)
 
-                if not fileset:
-                    continue
+    for fileset_id, count in matched_map.items():
+        json_response['files'].append({'status': 'ok', 'fileset_id': fileset_id, 'count': count})
+        # TODO: Handle the exact file names and checksums
+    for fileset_id, count in missing_map.items():
+        json_response['files'].append({'status': 'missing', 'fileset_id': fileset_id, 'count': len(count)})
 
-                fileset = [dict(row) for row in fileset]
-
-                file_object = json_object['files']
-                file_object.sort(key=lambda x: x['name'].lower())
-                fileset.sort(key=lambda x: x['name'].lower())
-                # print(file_object)
-                for i in range(min(len(fileset), len(file_object))):
-                    status = 'ok'
-                    db_file = fileset[i]
-                    user_file = file_object[i]
-                    filename = user_file['name'].lower()
-
-                    if db_file['name'].lower() != filename:
-                        if db_file['name'].lower() > filename:
-                            status = 'unknown_file'
-                        else:
-                            status = 'missing'
-                            i -= 1  
-
-                    elif db_file['size'] != user_file['size'] and status == 'ok':
-                        status = 'size_mismatch'
-
-                    if status == 'ok':
-                        for checksum_data in user_file['checksums']:
-                            user_checkcode = checksum_data['type']
-                            
-                            if user_checkcode in db_file:
-                                user_checksum = checksum_data['checksum']
-                                if db_file[user_checkcode] != user_checksum:
-                                    status = 'checksum_mismatch'
-                                    break
-
-                    if status != 'ok':
-                        json_response['error'] = 1
-                        fileset_id = user_insert_fileset(json_object['files'], ip, conn)
-                        json_response['fileset'] = fileset_id
-
-                    json_response['files'].append({'status': status, 'name': filename})
-                    
-                break
-    except Exception as e:
-        print(f"Error executing query: {e}")
-        return jsonify({'error': 'Query execution failed'}), 500
-    finally:
-        conn.close()
-    # print(json_response)
+    for fileset_id, count in extra_map.items():
+        json_response['files'].append({'status': 'unknown_file', 'fileset_id': fileset_id, 'count': len(count)})
+    print(f"Response: {json_response}")
     return jsonify(json_response)
     
 @app.route('/user_games_list')
diff --git a/user_fileset_functions.py b/user_fileset_functions.py
index a7ef1de..80e0c1a 100644
--- a/user_fileset_functions.py
+++ b/user_fileset_functions.py
@@ -35,14 +35,14 @@ def user_insert_queue(user_fileset, conn):
 
 def user_insert_fileset(user_fileset, ip, conn):
     src = 'user'
-    detection = True
+    detection = False
     key = ''
     megakey = calc_megakey(user_fileset)
     with conn.cursor() as cursor:
         cursor.execute("SELECT MAX(`transaction`) FROM transactions")
         transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
         log_text = "from user submitted files"
-        cursor.execute("SET @fileset_time_last = %s", (int(time.time()),))
+        cursor.execute("SET @fileset_time_last = %s", (int(time.time())))
         if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, ip):
             for file in user_fileset['files']:
                 file = file_json_to_array(file)
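
A rough usage sketch for the reworked /validate endpoint. The payload shape (game metadata plus a 'files' list of name/size/checksums entries) is inferred from the checks above; the host and all field values are placeholders, and the endpoint is assumed to read the request body as JSON:

    import requests

    payload = {
        "gameid": "monkey1", "engineid": "scumm",      # placeholder metadata
        "platform": "DOS", "language": "en",
        "files": [
            {"name": "MONKEY.000", "size": 8357,
             "checksums": [{"type": "md5",
                            "checksum": "d41d8cd98f00b204e9800998ecf8427e"}]},
        ],
    }
    r = requests.post("http://localhost:5000/validate", json=payload)
    print(r.json())    # {'error': ..., 'status': ..., 'files': [...]}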


Commit: b6657c289762a891ab6e59369b2bcef8a5cc8233
    https://github.com/scummvm/scummvm-sites/commit/b6657c289762a891ab6e59369b2bcef8a5cc8233
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-30T08:24:37+08:00

Commit Message:
INTEGRITY: Improve the matching of the user's JSON

Changed paths:
    db_functions.py
    fileset.py


diff --git a/db_functions.py b/db_functions.py
index d711b3d..014e35c 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -152,6 +152,7 @@ def insert_file(file, detection, src, conn):
 
     if not detection:
         checktype = "None"
+        detection = 0
     query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{checktype}-{checksize}', NOW())"
     with conn.cursor() as cursor:
         cursor.execute(query)
@@ -592,11 +593,11 @@ def insert_game_data(fileset, conn):
     insert_game(engine_name, engineid, title, gameid, extra, platform, lang, conn)
 
 def find_matching_filesets(fileset, conn, status):
-    matched_map = defaultdict(int)
+    matched_map = defaultdict(list)
     if status != "user":
         state = """'detection', 'dat', 'scan', 'partial', 'full', 'obsolete'"""
     else:
-        state = """'user', 'partial', 'full'"""
+        state = """'partial', 'full', 'dat'"""
     with conn.cursor() as cursor:
         for file in fileset["rom"]:
             matched_set = set()
@@ -618,9 +619,8 @@ def find_matching_filesets(fileset, conn, status):
                             matched_set.add(record['fileset_id'])
 
             for id in matched_set:
-                matched_map[id] += 1
+                matched_map[id].append(file)
                         
-    print(matched_map)
     return matched_map
 
 def matching_set(fileset, conn):
@@ -648,7 +648,7 @@ def matching_set(fileset, conn):
     return matched_map
 
 def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user):
-    matched_list = sorted(matched_map.items(), key=lambda x: x[1], reverse=True)
+    matched_list = sorted(matched_map.items(), key=lambda x: len(x[1]), reverse=True)
     is_full_matched = False
     with conn.cursor() as cursor:
         for matched_fileset_id, matched_count in matched_list:
@@ -659,7 +659,7 @@ def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, meg
             cursor.execute(f"SELECT COUNT(file.id) FROM file WHERE fileset = {matched_fileset_id}")
             count = cursor.fetchone()['COUNT(file.id)']
 
-            if status in ['detection', 'obsolete'] and count == matched_count:
+            if status in ['detection', 'obsolete'] and count == len(matched_count):
                 is_full_matched = True
                 update_fileset_status(cursor, matched_fileset_id, 'full' if src != "dat" else "partial")
                 populate_file(fileset, matched_fileset_id, conn, detection)
@@ -668,11 +668,11 @@ def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, meg
                 is_full_matched == True
                 log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
                 return
-            elif (status == 'partial' or status == 'dat') and count == matched_count:
+            elif (status == 'partial' or status == 'dat') and count == len(matched_count):
                 update_fileset_status(cursor, matched_fileset_id, 'full')
                 populate_file(fileset, matched_fileset_id, conn, detection)
                 log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
-            elif status == 'scan' and count == matched_count:
+            elif status == 'scan' and count == len(matched_count):
                 log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
                 return
             elif src == 'dat':
@@ -739,8 +739,9 @@ def finalize_fileset_insertion(conn, transaction_id, src, filepath, author, vers
         cursor.execute(f"SELECT COUNT(fileset) from transactions WHERE `transaction` = {transaction_id}")
         fileset_insertion_count = cursor.fetchone()['COUNT(fileset)']
         category_text = f"Uploaded from {src}"
-        log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
-        create_log(escape_string(category_text), user, escape_string(log_text), conn)
+        if src != 'user':
+            log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
+            create_log(escape_string(category_text), user, escape_string(log_text), conn)
     conn.close()
 
 def user_integrity_check(data):
@@ -787,6 +788,8 @@ def user_integrity_check(data):
             # show matched, missing, extra
             extra_map = defaultdict(list)
             missing_map = defaultdict(list)
+            extra_set = set()
+            missing_set = set()
             
             for fileset_id in matched_map.keys():
                 cursor.execute(f"SELECT * FROM file WHERE fileset = {fileset_id}")
@@ -809,10 +812,18 @@ def user_integrity_check(data):
                         data_files_set.add(checksum)
                 
                 # Identify missing files
+                matched_names = set()
                 for checksum, target_file in target_files_dict.items():
                     if checksum not in data_files_set:
-                        
-                        missing_map[fileset_id].append(target_file)
+                        if target_file['name'] not in matched_names:
+                            missing_set.add(target_file['name'])
+                        else:
+                            missing_set.remove(target_file['name'])
+                    else:
+                        matched_names.add(target_file['name'])  
+                
+                for tar in missing_set:
+                    missing_map[fileset_id].append({'name': tar})
 
                 # Identify extra files
                 for file in data['files']:
@@ -824,10 +835,13 @@ def user_integrity_check(data):
                         if checksum in target_files_dict and not file_exists:
                             file_exists = True
                     if not file_exists:
-                        extra_map[fileset_id].append(file)
+                        extra_set.add(file['name'])
+                
+                for extra in extra_set:
+                    extra_map[fileset_id].append({'name': extra})
             
             # handle different scenarios
-            matched_list = sorted(matched_map.items(), key=lambda x: x[1], reverse=True)
+            matched_list = sorted(matched_map.items(), key=lambda x: len(x[1]), reverse=True)
             most_matched = matched_list[0] 
             matched_fileset_id, matched_count = most_matched[0], most_matched[1]   
             cursor.execute(f"SELECT status FROM fileset WHERE id = {matched_fileset_id}")
diff --git a/fileset.py b/fileset.py
index 5979f2e..3e96cca 100644
--- a/fileset.py
+++ b/fileset.py
@@ -330,7 +330,7 @@ def match_fileset_route(id):
                 html += f"""
                 <tr>
                     <td>{fileset_id}</td>
-                    <td>{match_count}</td>
+                    <td>{len(match_count)}</td>
                     <td><a href="/fileset?id={fileset_id}">View Details</a></td>
                     <td>
                         <form method="POST" action="/fileset/{id}/merge/confirm">
@@ -667,7 +667,6 @@ def execute_merge(id, source=None, target=None):
 
 @app.route('/validate', methods=['POST'])
 def validate():
-
     error_codes = {
         "unknown": -1,
         "success": 0,
@@ -721,18 +720,39 @@ def validate():
         json_response['message'] = str(e)
         print(f"Response: {json_response}")
         return jsonify(json_response)
-
-    for fileset_id, count in matched_map.items():
-        json_response['files'].append({'status': 'ok', 'fileset_id': fileset_id, 'count': count})
-        # TODO: Handle the exact file names and checksums
+    print(f"Matched: {matched_map}")
+    matched_map = list(sorted(matched_map.items(), key=lambda x: len(x[1]), reverse=True))[0]
+    matched_id = matched_map[0]
+    # find the same id in the missing_map and extra_map
     for fileset_id, count in missing_map.items():
-        json_response['files'].append({'status': 'missing', 'fileset_id': fileset_id, 'count': len(count)})
-
+        if fileset_id == matched_id:
+            missing_map = (fileset_id, count)
+            break
+    
     for fileset_id, count in extra_map.items():
-        json_response['files'].append({'status': 'unknown_file', 'fileset_id': fileset_id, 'count': len(count)})
+        if fileset_id == matched_id:
+            extra_map = (fileset_id, count)
+            break
+    
+    for file in matched_map[1]:
+        for key, value in file.items():
+            if key == "name":
+                json_response['files'].append({'status': 'ok', 'fileset_id':matched_id, 'name': value})
+                break
+    for file in missing_map[1]:
+        for key, value in file.items():
+            if key == "name":
+                json_response['files'].append({'status': 'missing', 'fileset_id':matched_id, 'name': value})
+                break
+    for file in extra_map[1]:
+        for key, value in file.items():
+            if key == "name":
+                json_response['files'].append({'status': 'unknown_file', 'fileset_id':matched_id, 'name': value})
+                break
     print(f"Response: {json_response}")
     return jsonify(json_response)
     
+    
 @app.route('/user_games_list')
 def user_games_list():
     filename = "user_games_list"


Commit: ba18ccd1d57bdc72f3438aca860e8b28d4e476de
    https://github.com/scummvm/scummvm-sites/commit/ba18ccd1d57bdc72f3438aca860e8b28d4e476de
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-30T17:59:59+08:00

Commit Message:
INTEGRITY: Create a new fileset before matching

Changed paths:
    db_functions.py
    fileset.py


diff --git a/db_functions.py b/db_functions.py
index 014e35c..9a5122f 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -131,7 +131,7 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
         log_last = create_log(escape_string(category_text), user, escape_string(log_text), conn)
         update_history(fileset_last, fileset_last, conn, log_last)
     else:
-        update_history(fileset_last, fileset_last, conn)
+        update_history(0, fileset_last, conn)
     with conn.cursor() as cursor:
         cursor.execute(f"INSERT INTO transactions (`transaction`, fileset) VALUES ({transaction}, {fileset_last})")
 
@@ -153,7 +153,8 @@ def insert_file(file, detection, src, conn):
     if not detection:
         checktype = "None"
         detection = 0
-    query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{checktype}-{checksize}', NOW())"
+    detection_type = f"{checktype}-{checksize}" if checktype != "None" else f"{checktype}"
+    query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{detection_type}', NOW())"
     with conn.cursor() as cursor:
         cursor.execute(query)
 
@@ -214,7 +215,7 @@ def get_all_related_filesets(fileset_id, conn, visited=None):
     if visited is None:
         visited = set()
 
-    if fileset_id in visited:
+    if fileset_id in visited or fileset_id == 0:
         return []
     
     visited.add(fileset_id)
@@ -577,10 +578,14 @@ def process_fileset(fileset, resources, detection, src, conn, transaction_id, fi
     else:
         matched_map = matching_set(fileset, conn)
 
+    
+    insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
+    with conn.cursor() as cursor:
+        cursor.execute("SET @fileset_last = LAST_INSERT_ID()")
+        cursor.execute("SELECT LAST_INSERT_ID()")
+        fileset_last = cursor.fetchone()['LAST_INSERT_ID()']
     if matched_map:
-        handle_matched_filesets(matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
-    else:
-        insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
+        handle_matched_filesets(fileset_last, matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
 
 def insert_game_data(fileset, conn):
     engine_name = fileset["engine"]
@@ -647,7 +652,7 @@ def matching_set(fileset, conn):
                     break
     return matched_map
 
-def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user):
+def handle_matched_filesets(fileset_last, matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user):
     matched_list = sorted(matched_map.items(), key=lambda x: len(x[1]), reverse=True)
     is_full_matched = False
     with conn.cursor() as cursor:
@@ -663,20 +668,20 @@ def handle_matched_filesets(matched_map, fileset, conn, detection, src, key, meg
                 is_full_matched = True
                 update_fileset_status(cursor, matched_fileset_id, 'full' if src != "dat" else "partial")
                 populate_file(fileset, matched_fileset_id, conn, detection)
-                log_matched_fileset(src, matched_fileset_id, 'full' if src != "dat" else "partial", user, conn)
+                log_matched_fileset(src, fileset_last, matched_fileset_id, 'full' if src != "dat" else "partial", user, conn)
             elif status == 'full' and len(fileset['rom']) == count:
                 is_full_matched == True
-                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+                log_matched_fileset(src, fileset_last, matched_fileset_id, 'full', user, conn)
                 return
             elif (status == 'partial' or status == 'dat') and count == len(matched_count):
                 update_fileset_status(cursor, matched_fileset_id, 'full')
                 populate_file(fileset, matched_fileset_id, conn, detection)
-                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+                log_matched_fileset(src, fileset_last, matched_fileset_id, 'full', user, conn)
             elif status == 'scan' and count == len(matched_count):
-                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+                log_matched_fileset(src, fileset_last, matched_fileset_id, 'full', user, conn)
                 return
             elif src == 'dat':
-                log_matched_fileset(src, matched_fileset_id, 'partial matched', user, conn)
+                log_matched_fileset(src, fileset_last, matched_fileset_id, 'partial matched', user, conn)
             else:
                 insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
 
@@ -728,11 +733,11 @@ def insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_
                 if key not in ["name", "size"]:
                     insert_filechecksum(file, key, conn)
 
-def log_matched_fileset(src, fileset_id, state, user, conn):
+def log_matched_fileset(src, fileset_last, fileset_id, state, user, conn):
     category_text = f"Matched from {src}"
     log_text = f"Matched Fileset:{fileset_id}. State {state}."
     log_last = create_log(escape_string(category_text), user, escape_string(log_text), conn)
-    update_history(fileset_id, fileset_id, conn, log_last)
+    update_history(fileset_last, fileset_id, conn, log_last)
 
 def finalize_fileset_insertion(conn, transaction_id, src, filepath, author, version, source_status, user):
     with conn.cursor() as cursor:
@@ -850,13 +855,13 @@ def user_integrity_check(data):
             cursor.execute(f"SELECT COUNT(file.id) FROM file WHERE fileset = {matched_fileset_id}")
             count = cursor.fetchone()['COUNT(file.id)']
             if status == "full" and count == matched_count:
-                log_matched_fileset(src, matched_fileset_id, 'full', user, conn)
+                log_matched_fileset(src, matched_fileset_id, matched_fileset_id, 'full', user, conn)
             elif status == "partial" and count == matched_count:
                 populate_file(data, matched_fileset_id, conn, None, src)
-                log_matched_fileset(src, matched_fileset_id, 'partial', user, conn)
+                log_matched_fileset(src, matched_fileset_id, matched_fileset_id, 'partial', user, conn)
             elif status == "user" and count == matched_count:
                 add_usercount(matched_fileset_id, conn)
-                log_matched_fileset(src, matched_fileset_id, 'user', user, conn)
+                log_matched_fileset(src, matched_fileset_id, matched_fileset_id, 'user', user, conn)
             else:
                 insert_new_fileset(data, conn, None, src, key, None, transaction_id, log_text, user)
             finalize_fileset_insertion(conn, transaction_id, src, None, user, 0, source_status, user)
diff --git a/fileset.py b/fileset.py
index 3e96cca..f976f9d 100644
--- a/fileset.py
+++ b/fileset.py
@@ -238,6 +238,25 @@ def fileset():
                 cursor.execute(f"SELECT `timestamp`, category, `text`, id FROM log WHERE `text` LIKE 'Fileset:{h['oldfileset']}' ORDER BY `timestamp` DESC, id DESC")
                 logs = cursor.fetchall()
                 print(f"Logs: {logs}")
+                if h['fileset'] == h['oldfileset']:
+                    continue
+
+                if h['oldfileset'] == 0:
+                    html += "<tr>\n"
+                    html += f"<td>{h['timestamp']}</td>\n"
+                    html += f"<td>create</td>\n"
+                    html += f"<td>Created fileset <a href='fileset?id={h['fileset']}'>Fileset {h['fileset']}</a></td>\n"
+                    # html += f"<td><a href='logs?id={h['log']}'>Log {h['log']}</a></td>\n"
+                    if h['log']:
+                        cursor.execute(f"SELECT `text` FROM log WHERE id = {h['log']}")
+                        log_text = cursor.fetchone()['text']
+                        log_text = convert_log_text_to_links(log_text)
+                        html += f"<td><a href='logs?id={h['log']}'>Log {h['log']}</a>: {log_text}</td>\n"
+                    else:
+                        html += "<td>No log available</td>\n"
+                    html += "</tr>\n"
+                    continue
+
                 html += "<tr>\n"
                 html += f"<td>{h['timestamp']}</td>\n"
                 html += f"<td>merge</td>\n"
@@ -327,10 +346,12 @@ def match_fileset_route(id):
             for fileset_id, match_count in matched_map.items():
                 if fileset_id == id:
                     continue
+                cursor.execute(f"SELECT COUNT(file.id) FROM file WHERE fileset = {fileset_id}")
+                count = cursor.fetchone()['COUNT(file.id)']
                 html += f"""
                 <tr>
                     <td>{fileset_id}</td>
-                    <td>{len(match_count)}</td>
+                    <td>{len(match_count)} / {count}</td>
                     <td><a href="/fileset?id={fileset_id}">View Details</a></td>
                     <td>
                         <form method="POST" action="/fileset/{id}/merge/confirm">


Commit: 44b91b5d4c4d05c9647bea22a53f1a30a2e67cb3
    https://github.com/scummvm/scummvm-sites/commit/44b91b5d4c4d05c9647bea22a53f1a30a2e67cb3
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-31T19:30:36+08:00

Commit Message:
INTEGRITY: Add hyperlinks to fileset table and game table

Changed paths:
    db_functions.py
    pagination.py


diff --git a/db_functions.py b/db_functions.py
index 9a5122f..3c94b22 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -641,7 +641,7 @@ def matching_set(fileset, conn):
                     JOIN file f ON fs.id = f.fileset
                     JOIN filechecksum fc ON f.id = fc.file
                     WHERE fc.checksum = '{checksum}' AND fc.checktype = 'md5'
-                    AND f.size > {size}
+                    AND fc.checksize > {size}
                     AND fs.status = 'detection'
                 """
                 cursor.execute(query)
diff --git a/pagination.py b/pagination.py
index 1310bbe..b31a857 100644
--- a/pagination.py
+++ b/pagination.py
@@ -154,11 +154,14 @@ def create_page(filename, results_per_page, records_table, select_query, order,
                         # Filter textbox
                         filter_value = request.args.get(key, "")
 
-            if filename in ['games_list', 'user_games_list']:
-                html += f"<tr class='games_list' onclick='hyperlink(\"fileset?id={row['fileset']}\")'>\n"
+            if records_table != "log":
+                fileset_id = row['fileset']
+                html += f"<tr class='games_list' onclick='hyperlink(\"fileset?id={fileset_id}\")'>\n"
+                html += f"<td><a href='fileset?id={fileset_id}'>{counter}.</a></td>\n"
             else:
                 html += "<tr>\n"
-            html += f"<td>{counter}.</td>\n"
+                html += f"<td>{counter}.</td>\n"
+
             for key, value in row.items():
                 if key == 'fileset':
                     continue


Commit: 7c4d879a41512adf4638fb21e950c6c948d24a34
    https://github.com/scummvm/scummvm-sites/commit/7c4d879a41512adf4638fb21e950c6c948d24a34
Author: InariInDream (inariindream at 163.com)
Date: 2024-07-31T19:40:41+08:00

Commit Message:
INTEGRITY: Fix searching error in fileset search page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index f976f9d..0aee0f4 100644
--- a/fileset.py
+++ b/fileset.py
@@ -857,7 +857,7 @@ def fileset_search():
     """
     order = "ORDER BY fileset.id"
     filters = {
-        "fileset": "fileset",
+        "id": "fileset",
         "name": "game",
         "extra": "game",
         "platform": "game",


Commit: b61bc9935289be490703f8aeb97d71685f89707c
    https://github.com/scummvm/scummvm-sites/commit/b61bc9935289be490703f8aeb97d71685f89707c
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-01T18:24:57+08:00

Commit Message:
INTEGRITY: Change apache2 conf

Changed paths:
    apache2-config/gamesdb.sev.zone.conf


diff --git a/apache2-config/gamesdb.sev.zone.conf b/apache2-config/gamesdb.sev.zone.conf
index 9578a31..8b37f5b 100644
--- a/apache2-config/gamesdb.sev.zone.conf
+++ b/apache2-config/gamesdb.sev.zone.conf
@@ -2,22 +2,14 @@
     ServerName gamesdb.sev.zone
     ServerAlias www.gamesdb.sev.zone
     ServerAdmin webmaster@localhost
-    DocumentRoot /var/www/vhosts.d/gamesdb.sev.zone/htdocs/
-    ErrorLog /var/www/vhosts.d/gamesdb.sev.zone/logs/error.log
-    CustomLog /var/www/vhosts.d/gamesdb.sev.zone/logs/access.log combined
-    <Directory /var/www/vhosts.d/gamesdb.sev.zone/htdocs>
-        php_admin_value open_basedir "/var/www/vhosts.d/gamesdb.sev.zone/"
+    CustomLog ${APACHE_LOG_DIR}/integrity-access.log combined
+    ErrorLog ${APACHE_LOG_DIR}/integrity-error.log
+    DocumentRoot /home/ubuntu/projects/python/scummvm-sites
+    WSGIDaemonProcess scummvm-sites user=www-data group=www-data threads=5
+    WSGIScriptAlias / /home/ubuntu/projects/python/scummvm-sites/app.wsgi
+
+    <Directory /home/ubuntu/projects/python/scummvm-sites>
+        Require all granted
     </Directory>
-    <Files "mysql_config.json">
-        Order allow,deny
-        Deny from all
-    </Files>
-    <Files "schema.php">
-        Order allow,deny
-        Deny from all
-    </Files>
-    <Files "dat_parser.php">
-        Order allow,deny
-        Deny from all
-    </Files>
+
 </VirtualHost>
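
The rewritten vhost hands the Flask app to mod_wsgi through app.wsgi, which is not part of this diff. A typical entry point for this layout would look roughly like the following, assuming the Flask object is the `app` defined in fileset.py:

    import sys

    # make the project importable inside the WSGI daemon process
    sys.path.insert(0, '/home/ubuntu/projects/python/scummvm-sites')

    # mod_wsgi looks for a module-level callable named 'application'
    from fileset import app as application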


Commit: 48299ac8b41767ecaaafc03d86dcb6896e7ee809
    https://github.com/scummvm/scummvm-sites/commit/48299ac8b41767ecaaafc03d86dcb6896e7ee809
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-01T18:25:34+08:00

Commit Message:
INTEGRITY: Return Unknown when no file matches

Changed paths:
    db_functions.py
    fileset.py


diff --git a/db_functions.py b/db_functions.py
index 3c94b22..efd8e0f 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -461,7 +461,8 @@ def merge_filesets(detection_id, dat_id):
         conn.rollback()
         print(f"Error merging filesets: {e}")
     finally:
-        conn.close()
+        # conn.close()
+        pass
 
     return history_last
 
@@ -725,8 +726,8 @@ def populate_file(fileset, fileset_id, conn, detection):
             else:
                 cursor.execute(f"UPDATE file SET detection_type = 'None' WHERE id = {file_id}")
 
-def insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user):
-    if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=user):
+def insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user, ip=''):
+    if insert_fileset(src, detection, key, megakey, transaction_id, log_text, conn, username=user, ip=ip):
         for file in fileset["rom"]:
             insert_file(file, detection, src, conn)
             for key, value in file.items():
@@ -747,9 +748,9 @@ def finalize_fileset_insertion(conn, transaction_id, src, filepath, author, vers
         if src != 'user':
             log_text = f"Completed loading DAT file, filename {filepath}, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Number of filesets: {fileset_insertion_count}. Transaction: {transaction_id}"
             create_log(escape_string(category_text), user, escape_string(log_text), conn)
-    conn.close()
+    # conn.close()
 
-def user_integrity_check(data):
+def user_integrity_check(data, ip):
     src = "user"
     source_status = src
     new_files = []
@@ -823,7 +824,7 @@ def user_integrity_check(data):
                         if target_file['name'] not in matched_names:
                             missing_set.add(target_file['name'])
                         else:
-                            missing_set.remove(target_file['name'])
+                            missing_set.discard(target_file['name'])
                     else:
                         matched_names.add(target_file['name'])  
                 
@@ -863,7 +864,7 @@ def user_integrity_check(data):
                 add_usercount(matched_fileset_id, conn)
                 log_matched_fileset(src, matched_fileset_id, matched_fileset_id, 'user', user, conn)
             else:
-                insert_new_fileset(data, conn, None, src, key, None, transaction_id, log_text, user)
+                insert_new_fileset(data, conn, None, src, key, None, transaction_id, log_text, user, ip)
             finalize_fileset_insertion(conn, transaction_id, src, None, user, 0, source_status, user)
     except Exception as e:
         conn.rollback()
@@ -872,9 +873,13 @@ def user_integrity_check(data):
         category_text = f"Uploaded from {src}"
         log_text = f"Completed loading file, State {source_status}. Transaction: {transaction_id}"
         create_log(escape_string(category_text), user, escape_string(log_text), conn)
-        conn.close()
+        # conn.close()
     return matched_map, missing_map, extra_map
 
 def add_usercount(fileset, conn):
     with conn.cursor() as cursor:
         cursor.execute(f"UPDATE fileset SET user_count = COALESCE(user_count, 0) + 1 WHERE id = {fileset}")
+        cursor.execute(f"SELECT user_count from fileset WHERE id = {fileset}")
+        count = cursor.fetchone()['user_count']
+        if count >= 3:
+            cursor.execute(f"UPDATE fileset SET status = 'ReadyForReview' WHERE id = {fileset}")
\ No newline at end of file
diff --git a/fileset.py b/fileset.py
index 0aee0f4..184bc65 100644
--- a/fileset.py
+++ b/fileset.py
@@ -734,14 +734,21 @@ def validate():
         return jsonify(json_response)
 
     try:
-        matched_map, missing_map, extra_map = user_integrity_check(json_object)
+        matched_map, missing_map, extra_map = user_integrity_check(json_object, ip)
     except Exception as e:
-        json_response['error'] = 1
+        json_response['error'] = -1
         json_response['status'] = 'processing_error'
+        json_response['fileset'] = 'unknown_fileset'
         json_response['message'] = str(e)
         print(f"Response: {json_response}")
         return jsonify(json_response)
     print(f"Matched: {matched_map}")
+    print(len(matched_map))
+    if (len(matched_map) == 0):
+        json_response['error'] = error_codes['unknown']
+        json_response['status'] = 'unknown_fileset'
+        json_response['fileset'] = 'unknown_fileset'
+        return jsonify(json_response)
     matched_map = list(sorted(matched_map.items(), key=lambda x: len(x[1]), reverse=True))[0]
     matched_id = matched_map[0]
     # find the same id in the missing_map and extra_map
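
The remove() -> discard() change in user_integrity_check matters because the same file name can show up more than once across checksum entries: set.remove() raises KeyError when the element is absent, while set.discard() is a silent no-op. In short:

    missing = {"a.dat"}
    missing.discard("b.dat")   # fine, nothing happens
    missing.remove("b.dat")    # KeyError: 'b.dat'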


Commit: 97e2371851edfb88b9326fe722c52c1b98562234
    https://github.com/scummvm/scummvm-sites/commit/97e2371851edfb88b9326fe722c52c1b98562234
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-03T16:37:42+08:00

Commit Message:
INTEGRITY: Delete redundant upload route

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 184bc65..9d502e9 100644
--- a/fileset.py
+++ b/fileset.py
@@ -874,132 +874,7 @@ def fileset_search():
     }
     return render_template_string(create_page(filename, 25, records_table, select_query, order, filters))
 
-@app.route('/upload', methods=['GET'])
-def upload_page():
-    html = """
-    <!DOCTYPE html>
-    <html>
-    <head>
-        <title>Upload Game Integrity Check</title>
-    </head>
-    <body>
-        <h2>Upload Your Game Integrity Check (JSON)</h2>
-        <form action="/upload" method="post" enctype="multipart/form-data">
-            <input type="file" name="file" accept=".json" required>
-            <input type="submit" value="Upload">
-        </form>
-        {{ get_flashed_messages() }}
-    </body>
-    </html>
-    """
-    return render_template_string(html)
-
-@app.route('/upload', methods=['POST'])
-def upload_file():
-    if 'file' not in request.files:
-        flash('No file part')
-        return redirect(request.url)
-    
-    file = request.files['file']
-    
-    if file.filename == '':
-        flash('No selected file')
-        return redirect(request.url)
-    
-    if file and file.filename.endswith('.json'):
-        matched_map = {}
-        missing_map = {}
-        extra_map = {}
-        try:
-            data = json.load(file)
-            matched_map, missing_map, extra_map = user_integrity_check(data)
-            flash('File successfully uploaded and processed')
-        except Exception as e:
-            flash(f'Error processing file: {e}')
-        finally:
-            html = """
-            <!DOCTYPE html>
-            <html>
-            <head>
-                <link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
-            </head>
-            <body>
-                <h2>Upload Game Integrity Check</h2>
-                <form action="/upload" method="post" enctype="multipart/form-data">
-                    <input type="file" name="file" accept=".json" required>
-                    <input type="submit" value="Upload">
-                </form>
-                <h2>Results</h2>
-                <h3>Matched Filesets</h3>
-                <table>
-                <thead>
-                    <tr>
-                        <th>Fileset ID</th>
-                        <th>Match Count</th>
-                    </tr>
-                </thead>
-                <tbody>
-            """
 
-            for fileset_id, count in matched_map.items():
-                html += f"""
-                <tr>
-                    <td><a href='fileset?id={fileset_id}'>{fileset_id}</a></td>
-                    <td>{count}</td>
-                </tr>
-                """
-            
-            html += """
-                </tbody>
-                </table>
-                <h3>Missing Filesets</h3>
-                <table>
-                <thead>
-                    <tr>
-                        <th>Fileset ID</th>
-                        <th>Missing Count</th>
-                    </tr>
-                </thead>
-                <tbody>
-            """
-            
-            for fileset_id, count in missing_map.items():
-                html += f"""
-                <tr>
-                    <td><a href='fileset?id={fileset_id}'>{fileset_id}</a></td>
-                    <td>{len(count)}</td>
-                </tr>
-                """
-            
-            html += """
-                </tbody>
-                </table>
-                <h3>Extra Filesets</h3>
-                <table>
-                <thead>
-                    <tr>
-                        <th>Fileset ID</th>
-                        <th>Extra Count</th>
-                    </tr>
-                </thead>
-                <tbody>
-            """
-            
-            for fileset_id, count in extra_map.items():
-                html += f"""
-                <tr>
-                    <td><a href='fileset?id={fileset_id}'>{fileset_id}</a></td>
-                    <td>{len(count)}</td>
-                </tr>
-                """
-            
-            html += """
-                </tbody>
-                </table>
-            </body>
-            </html>
-            """
-        return render_template_string(html)
 
 if __name__ == '__main__':
     app.secret_key = secret_key


Commit: efa299e308cd176f92a03f2ab2968623caf868aa
    https://github.com/scummvm/scummvm-sites/commit/efa299e308cd176f92a03f2ab2968623caf868aa
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-03T18:13:39+08:00

Commit Message:
INTEGRITY: Redirect user_games_list to fileset_search page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 9d502e9..3b50967 100644
--- a/fileset.py
+++ b/fileset.py
@@ -734,7 +734,7 @@ def validate():
         return jsonify(json_response)
 
     try:
-        matched_map, missing_map, extra_map = user_integrity_check(json_object, ip)
+        matched_map, missing_map, extra_map = user_integrity_check(json_object, ip, game_metadata)
     except Exception as e:
         json_response['error'] = -1
         json_response['status'] = 'processing_error'
@@ -783,32 +783,8 @@ def validate():
     
 @app.route('/user_games_list')
 def user_games_list():
-    filename = "user_games_list"
-    records_table = "fileset"
-    select_query = """
-    SELECT engineid, gameid, extra, platform, language, game.name,
-    status, fileset.id as fileset
-    FROM fileset
-    LEFT JOIN game ON game.id = fileset.game
-    LEFT JOIN engine ON engine.id = game.engine
-    WHERE status = 'user'
-    """
-    order = "ORDER BY gameid"
-    filters = {
-        "engineid": "engine",
-        "gameid": "game",
-        "extra": "game",
-        "platform": "game",
-        "language": "game",
-        "name": "game",
-        "status": "fileset"
-    }
-    mapping = {
-        'engine.id': 'game.engine',
-        'game.id': 'fileset.game',
-    }
-    return render_template_string(create_page(filename, 200, records_table, select_query, order, filters, mapping))
-
+    url = f"fileset_search?extra=&platform=&language=&megakey=&status=user"
+    return redirect(url)
 
 @app.route('/games_list')
 def games_list():
@@ -857,7 +833,7 @@ def fileset_search():
     filename = "fileset_search"
     records_table = "fileset"
     select_query = """
-    SELECT extra, platform, language, game.name, megakey,
+    SELECT extra, platform, language, game.gameid, megakey,
     status, fileset.id as fileset
     FROM fileset
     JOIN game ON game.id = fileset.game


Commit: 56e5cbcc8e44f40188dca196d20b2619fb5644eb
    https://github.com/scummvm/scummvm-sites/commit/56e5cbcc8e44f40188dca196d20b2619fb5644eb
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-03T19:57:56+08:00

Commit Message:
INTEGRITY: Insert metadata when inserting a user fileset

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index efd8e0f..2eeeaf7 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -78,6 +78,9 @@ def insert_fileset(src, detection, key, megakey, transaction, log_text, conn, ip
     if detection:
         status = "detection"
         game = "@game_last"
+        
+    if status == "user":
+        game = "@game_last"
 
     # Check if key/megakey already exists, if so, skip insertion (no quotes on purpose)
     if detection:
@@ -750,7 +753,7 @@ def finalize_fileset_insertion(conn, transaction_id, src, filepath, author, vers
             create_log(escape_string(category_text), user, escape_string(log_text), conn)
     # conn.close()
 
-def user_integrity_check(data, ip):
+def user_integrity_check(data, ip, game_metadata=None):
     src = "user"
     source_status = src
     new_files = []
@@ -845,8 +848,21 @@ def user_integrity_check(data, ip):
                 
                 for extra in extra_set:
                     extra_map[fileset_id].append({'name': extra})
-            
+            if game_metadata:
+                platform = game_metadata['platform']
+                lang = game_metadata['language']
+                gameid = game_metadata['gameid']
+                engineid = game_metadata['engineid']
+                extra_info = game_metadata['extra']
+                engine_name = " "
+                title = " "
+                insert_game(engine_name, engineid, title, gameid, extra_info, platform, lang, conn)
+                
             # handle different scenarios
+            if len(matched_map) == 0:   
+                insert_new_fileset(data, conn, None, src, key, None, transaction_id, log_text, user, ip)
+                return matched_map, missing_map, extra_map
+
             matched_list = sorted(matched_map.items(), key=lambda x: len(x[1]), reverse=True)
             most_matched = matched_list[0] 
             matched_fileset_id, matched_count = most_matched[0], most_matched[1]   
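
With this change user_integrity_check() can also record game metadata for brand-new submissions. The expected shape of the optional third argument, inferred from the keys read above (all values are placeholders):

    game_metadata = {
        "gameid": "monkey1",
        "engineid": "scumm",
        "platform": "DOS",
        "language": "en",
        "extra": "",
    }
    matched_map, missing_map, extra_map = user_integrity_check(data, ip, game_metadata)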


Commit: 97aa67970a309b2057711d497a97490df484c31d
    https://github.com/scummvm/scummvm-sites/commit/97aa67970a309b2057711d497a97490df484c31d
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-05T20:16:22+08:00

Commit Message:
INTEGRITY: Add ready_for_review page

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 3b50967..ac7b6c0 100644
--- a/fileset.py
+++ b/fileset.py
@@ -786,6 +786,11 @@ def user_games_list():
     url = f"fileset_search?extra=&platform=&language=&megakey=&status=user"
     return redirect(url)
 
+@app.route('/ready_for_review')
+def ready_for_review():
+    url = f"fileset_search?extra=&platform=&language=&megakey=&status=ReadyForReview"
+    return redirect(url)
+
 @app.route('/games_list')
 def games_list():
     filename = "games_list"


Commit: e37205a4d6469908578aeed1961504c1cfd7f338
    https://github.com/scummvm/scummvm-sites/commit/e37205a4d6469908578aeed1961504c1cfd7f338
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-05T20:17:11+08:00

Commit Message:
INTEGRITY: Add "mark as full" button at fileset page

Changed paths:
    fileset.py
    pagination.py


diff --git a/fileset.py b/fileset.py
index ac7b6c0..9e4a463 100644
--- a/fileset.py
+++ b/fileset.py
@@ -7,7 +7,7 @@ from user_fileset_functions import user_calc_key, file_json_to_array, user_inser
 from pagination import create_page
 import difflib
 from pymysql.converters import escape_string
-from db_functions import find_matching_filesets, get_all_related_filesets, convert_log_text_to_links, user_integrity_check, db_connect
+from db_functions import find_matching_filesets, get_all_related_filesets, convert_log_text_to_links, user_integrity_check, db_connect,create_log
 from collections import defaultdict
 
 app = Flask(__name__)
@@ -43,7 +43,7 @@ def index():
     <ul>
         <li><a href="{{ url_for('fileset') }}">Fileset</a></li>
         <li><a href="{{ url_for('user_games_list') }}">User Games List</a></li>
-        <li><a href="{{ url_for('games_list') }}">Games List</a></li>
+        <li><a href="{{ url_for('ready_for_review') }}">Ready for review</a></li>
         <li><a href="{{ url_for('fileset_search') }}">Fileset Search</a></li>
     </ul>
     <h2>Logs</h2>
@@ -111,9 +111,13 @@ def fileset():
             <h2><u>Fileset: {id}</u></h2>
             <table>
             """
-            html += f"<td><button onclick=\"location.href='/fileset/{id}/merge'\">Manual Merge</button></td>"
-            html += f"<td><button onclick=\"location.href='/fileset/{id}/match'\">Match and Merge</button></td>"
-
+            html += f"<button type='button' onclick=\"location.href='/fileset/{id}/merge'\">Manual Merge</button>"
+            html += f"<button type='button' onclick=\"location.href='/fileset/{id}/match'\">Match and Merge</button>"
+            html += f"""
+                    <form action="/fileset/{id}/mark_full" method="post" style="display:inline;">
+                        <button type='submit'>Mark as full</button>
+                    </form>
+                    """
             cursor.execute(f"SELECT * FROM fileset WHERE id = {id}")
             result = cursor.fetchone()
             print(result)
@@ -685,6 +689,23 @@ def execute_merge(id, source=None, target=None):
 
     finally:
         connection.close()
+        
+@app.route('/fileset/<int:id>/mark_full', methods=['POST'])
+def mark_as_full(id):
+    try:
+        conn = db_connect()
+        with conn.cursor() as cursor:
+            update_query = f"UPDATE fileset SET status = 'full' WHERE id = {id}"
+            cursor.execute(update_query)
+            create_log("Manual from Web", "Dev", f"Marked Fileset:{id} as full", conn)
+            conn.commit()
+    except Exception as e:
+        print(f"Error updating fileset status: {e}")
+        return jsonify({'error': 'Failed to mark fileset as full'}), 500
+    finally:
+        conn.close()
+
+    return redirect(f'/fileset?id={id}')
 
 @app.route('/validate', methods=['POST'])
 def validate():
diff --git a/pagination.py b/pagination.py
index b31a857..f6cf6b1 100644
--- a/pagination.py
+++ b/pagination.py
@@ -74,7 +74,7 @@ def create_page(filename, results_per_page, records_table, select_query, order,
         else:
             cursor.execute(f"SELECT COUNT(id) FROM {records_table}")
             num_of_results = cursor.fetchone()['COUNT(id)']
-            
+        # TODO: Recalculate num_of_results if filters are applied
         num_of_pages = (num_of_results + results_per_page - 1) // results_per_page
         print(f"Num of results: {num_of_results}, Num of pages: {num_of_pages}")
         if num_of_results == 0:
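
In the new mark_as_full route the <int:id> converter already guarantees an integer, so the interpolated UPDATE is not directly injectable, but the same statement in the more defensive parameterized form would be:

    cursor.execute("UPDATE fileset SET status = %s WHERE id = %s", ('full', id))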


Commit: d38458454073db229594ce9703015147c619e246
    https://github.com/scummvm/scummvm-sites/commit/d38458454073db229594ce9703015147c619e246
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-06T19:24:30+08:00

Commit Message:
INTEGRITY: Improve the matching between `set` and `detection`

Changed paths:
    db_functions.py
    fileset.py


diff --git a/db_functions.py b/db_functions.py
index 2eeeaf7..9fada2a 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -556,7 +556,8 @@ def match_fileset(data_arr, username=None):
 
     with conn.cursor() as cursor:
         cursor.execute("SELECT MAX(`transaction`) FROM transactions")
-        transaction_id = cursor.fetchone()['MAX(`transaction`)'] + 1
+        transaction_id = cursor.fetchone()['MAX(`transaction`)']
+        transaction_id = transaction_id + 1 if transaction_id else 1
 
     category_text = f"Uploaded from {src}"
     log_text = f"Started loading DAT file, size {os.path.getsize(filepath)}, author {author}, version {version}. State {source_status}. Transaction: {transaction_id}"
@@ -611,7 +612,7 @@ def find_matching_filesets(fileset, conn, status):
         for file in fileset["rom"]:
             matched_set = set()
             for key, value in file.items():
-                if key not in ["name", "size"]:
+                if key not in ["name", "size", "sha1", "crc"]:
                     checksum = file[key]
                     checktype = key
                     checksize, checktype, checksum = get_checksum_props(checktype, checksum)
@@ -734,7 +735,7 @@ def insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_
         for file in fileset["rom"]:
             insert_file(file, detection, src, conn)
             for key, value in file.items():
-                if key not in ["name", "size"]:
+                if key not in ["name", "size", "sha1", "crc"]:
                     insert_filechecksum(file, key, conn)
 
 def log_matched_fileset(src, fileset_last, fileset_id, state, user, conn):
diff --git a/fileset.py b/fileset.py
index 9e4a463..72bf59e 100644
--- a/fileset.py
+++ b/fileset.py
@@ -600,7 +600,7 @@ def execute_merge(id, source=None, target=None):
             cursor.execute(f"SELECT * FROM fileset WHERE id = {target_id}")
             target_fileset = cursor.fetchone()
 
-            if source_fileset['src'] == 'detection':
+            if source_fileset['status'] == 'detection':
                 cursor.execute(f"""
                 UPDATE fileset SET
                     game = '{source_fileset['game']}',
@@ -633,11 +633,10 @@ def execute_merge(id, source=None, target=None):
                         INSERT INTO filechecksum (file, checksize, checktype, checksum)
                         VALUES ({new_file_id}, '{checksum['checksize']}', '{checksum['checktype']}', '{checksum['checksum']}')
                         """)
-
-            elif source_fileset['src'] == 'scan':
+            elif source_fileset['status'] in ['scan', 'dat']:
                 cursor.execute(f"""
                 UPDATE fileset SET
-                    status = '{source_fileset['status']}',
+                    status = '{source_fileset['status'] if source_fileset['status'] != 'dat' else "partial"}',
                     `key` = '{source_fileset['key']}',
                     `timestamp` = '{source_fileset['timestamp']}'
                 WHERE id = {target_id}


Commit: 1eb60bce71c34f17b9d11c91f6308defc181104d
    https://github.com/scummvm/scummvm-sites/commit/1eb60bce71c34f17b9d11c91f6308defc181104d
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-07T19:49:30+08:00

Commit Message:
INTEGRITY: Fix the calculation of result page

Changed paths:
    pagination.py


diff --git a/pagination.py b/pagination.py
index f6cf6b1..cd4f9d9 100644
--- a/pagination.py
+++ b/pagination.py
@@ -64,17 +64,17 @@ def create_page(filename, results_per_page, records_table, select_query, order,
                         continue
                     from_query += f" JOIN {table} ON {get_join_columns(records_table, table, mapping)}"
 
-            cursor.execute(f"SELECT COUNT({records_table}.id) AS count FROM {records_table}")
+            cursor.execute(f"SELECT COUNT({records_table}.id) AS count FROM {records_table} {condition}")
             num_of_results = cursor.fetchone()['count']
             
         elif "JOIN" in records_table:
             first_table = records_table.split(" ")[0]
-            cursor.execute(f"SELECT COUNT({first_table}.id) FROM {records_table}")
+            cursor.execute(f"SELECT COUNT({first_table}.id) FROM {records_table} {condition}")
             num_of_results = cursor.fetchone()[f'COUNT({first_table}.id)']
         else:
-            cursor.execute(f"SELECT COUNT(id) FROM {records_table}")
+            cursor.execute(f"SELECT COUNT(id) FROM {records_table} {condition}")
             num_of_results = cursor.fetchone()['COUNT(id)']
-        # TODO: Recalculate num_of_results if filters are applied
+
         num_of_pages = (num_of_results + results_per_page - 1) // results_per_page
         print(f"Num of results: {num_of_results}, Num of pages: {num_of_pages}")
         if num_of_results == 0:
@@ -210,5 +210,4 @@ def create_page(filename, results_per_page, records_table, select_query, order,
         html += "<input type='submit' value='Submit'>"
         html += "</div></form>"
 
-    return html
-    
\ No newline at end of file
+    return html
\ No newline at end of file
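
Folding the filter condition into the COUNT query keeps the page count in step with the filtered result set; the page arithmetic itself is the usual integer ceiling division, equivalent to math.ceil(num_of_results / results_per_page):

    num_of_pages = (num_of_results + results_per_page - 1) // results_per_page
    # e.g. 51 results at 25 per page -> (51 + 24) // 25 == 3 pages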


Commit: 9dbb60ccd5907c5275f9b68bcc9f4b7e316513a3
    https://github.com/scummvm/scummvm-sites/commit/9dbb60ccd5907c5275f9b68bcc9f4b7e316513a3
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-12T19:42:34+08:00

Commit Message:
INTEGRITY: Fix the fileset_search page

Changed paths:
    fileset.py
    pagination.py


diff --git a/fileset.py b/fileset.py
index 72bf59e..cf46612 100644
--- a/fileset.py
+++ b/fileset.py
@@ -866,15 +866,17 @@ def fileset_search():
     order = "ORDER BY fileset.id"
     filters = {
         "id": "fileset",
-        "name": "game",
+        "gameid": "game",
         "extra": "game",
         "platform": "game",
         "language": "game",
         "megakey": "fileset",
         "status": "fileset"
     }
-    return render_template_string(create_page(filename, 25, records_table, select_query, order, filters))
-
+    mapping = {
+        'game.id': 'fileset.game',
+    }
+    return render_template_string(create_page(filename, 25, records_table, select_query, order, filters, mapping))
 
 
 if __name__ == '__main__':
diff --git a/pagination.py b/pagination.py
index cd4f9d9..57caead 100644
--- a/pagination.py
+++ b/pagination.py
@@ -63,16 +63,15 @@ def create_page(filename, results_per_page, records_table, select_query, order,
                     if table == records_table:
                         continue
                     from_query += f" JOIN {table} ON {get_join_columns(records_table, table, mapping)}"
-
-            cursor.execute(f"SELECT COUNT({records_table}.id) AS count FROM {records_table} {condition}")
+            cursor.execute(f"SELECT COUNT({records_table}.id) AS count FROM {from_query} {condition}")
             num_of_results = cursor.fetchone()['count']
             
         elif "JOIN" in records_table:
             first_table = records_table.split(" ")[0]
-            cursor.execute(f"SELECT COUNT({first_table}.id) FROM {records_table} {condition}")
+            cursor.execute(f"SELECT COUNT({first_table}.id) FROM {records_table}")
             num_of_results = cursor.fetchone()[f'COUNT({first_table}.id)']
         else:
-            cursor.execute(f"SELECT COUNT(id) FROM {records_table} {condition}")
+            cursor.execute(f"SELECT COUNT(id) FROM {records_table}")
             num_of_results = cursor.fetchone()['COUNT(id)']
 
         num_of_pages = (num_of_results + results_per_page - 1) // results_per_page
@@ -210,4 +209,4 @@ def create_page(filename, results_per_page, records_table, select_query, order,
         html += "<input type='submit' value='Submit'>"
         html += "</div></form>"
 
-    return html
\ No newline at end of file
+    return html
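
get_join_columns() is not shown in this log; judging from its call site in create_page and the mapping passed here ({'game.id': 'fileset.game'}), it presumably resolves a table pair to its ON clause, along the lines of this purely hypothetical sketch:

    def get_join_columns(records_table, table, mapping):
        # hypothetical reconstruction: find the mapping entry touching `table`
        # and render it as "left = right", e.g. "game.id = fileset.game"
        for left, right in mapping.items():
            if left.startswith(f"{table}.") or right.startswith(f"{table}."):
                return f"{left} = {right}"
        raise ValueError(f"no join mapping between {records_table} and {table}")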


Commit: 8db2c9195ca24b1649a3900241f0f0c3b6592e3b
    https://github.com/scummvm/scummvm-sites/commit/8db2c9195ca24b1649a3900241f0f0c3b6592e3b
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-13T20:44:21+08:00

Commit Message:
INTEGRITY: Fix bugs in matching

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 9fada2a..9e20bcd 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -634,9 +634,10 @@ def find_matching_filesets(fileset, conn, status):
     return matched_map
 
 def matching_set(fileset, conn):
-    matched_map = defaultdict(int)
+    matched_map = defaultdict(list)
     with conn.cursor() as cursor:
         for file in fileset["rom"]:
+            matched_set = set()
             if "md5" in file:
                 checksum = file["md5"]
                 size = file["size"]
@@ -653,8 +654,9 @@ def matching_set(fileset, conn):
                 records = cursor.fetchall()
                 if records:
                     for record in records:
-                        matched_map[record['fileset_id']] += 1
-                    break
+                        matched_set.add(record['fileset_id'])
+            for id in matched_set:
+                matched_map[id].append(file)
     return matched_map
 
 def handle_matched_filesets(fileset_last, matched_map, fileset, conn, detection, src, key, megakey, transaction_id, log_text, user):
@@ -711,7 +713,7 @@ def populate_file(fileset, fileset_id, conn, detection):
                 target_files_dict[target_file['id']] = f"{checksum['checktype']}-{checksum['checksize']}"
         for file in fileset['rom']:
             file_exists = False
-            cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['md5']}', {fileset_id}, {0}, NOW())")
+            cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['md5'] if file.get('md5') is not None else 'None'}', {fileset_id}, {0}, NOW())")
             cursor.execute("SET @file_last = LAST_INSERT_ID()")
             cursor.execute("SELECT @file_last AS file_id")
             file_id = cursor.fetchone()['file_id']
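
A toy illustration of the new matched_map shape: each candidate fileset id now maps to the list of files that hit it, so callers can compare per-fileset match counts instead of a bare tally. The data below is made up:

    from collections import defaultdict

    matched_map = defaultdict(list)
    hits = [(42, {"name": "a.dat"}), (42, {"name": "b.dat"}), (7, {"name": "a.dat"})]
    for fileset_id, file in hits:
        matched_map[fileset_id].append(file)

    assert len(matched_map[42]) == 2  # two files matched fileset 42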


Commit: a4a67f111d3119ce3ab373b0819b2c8cc4d2fadd
    https://github.com/scummvm/scummvm-sites/commit/a4a67f111d3119ce3ab373b0819b2c8cc4d2fadd
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-14T20:16:40+08:00

Commit Message:
INTEGRITY: Fix dups of log

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 9e20bcd..e9c816d 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -677,20 +677,18 @@ def handle_matched_filesets(fileset_last, matched_map, fileset, conn, detection,
                 populate_file(fileset, matched_fileset_id, conn, detection)
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'full' if src != "dat" else "partial", user, conn)
             elif status == 'full' and len(fileset['rom']) == count:
-                is_full_matched == True
+                is_full_matched = True
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'full', user, conn)
                 return
-            elif (status == 'partial' or status == 'dat') and count == len(matched_count):
+            elif (status == 'partial') and count == len(matched_count):
+                is_full_matched = True
                 update_fileset_status(cursor, matched_fileset_id, 'full')
                 populate_file(fileset, matched_fileset_id, conn, detection)
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'full', user, conn)
             elif status == 'scan' and count == len(matched_count):
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'full', user, conn)
-                return
             elif src == 'dat':
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'partial matched', user, conn)
-            else:
-                insert_new_fileset(fileset, conn, detection, src, key, megakey, transaction_id, log_text, user)
 
 def update_fileset_status(cursor, fileset_id, status):
     cursor.execute(f"""
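
The one-character bug fixed above is worth spelling out: '==' compares and discards the result, while '=' assigns. A bare comparison is a legal but useless statement in Python:

    is_full_matched = False
    is_full_matched == True   # evaluates to False, changes nothing
    assert is_full_matched is False

    is_full_matched = True    # the intended assignment
    assert is_full_matched is True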


Commit: cdb7c4c4805c85a167a55a499983d7e36e74b41d
    https://github.com/scummvm/scummvm-sites/commit/cdb7c4c4805c85a167a55a499983d7e36e74b41d
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-15T20:46:38+08:00

Commit Message:
INTEGRITY: Highlight the detection checksums

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index cf46612..a46e33b 100644
--- a/fileset.py
+++ b/fileset.py
@@ -200,7 +200,10 @@ def fileset():
                 for column in all_columns:
                     if column != 'id':
                         value = row.get(column, '')
-                        html += f"<td>{value}</td>\n"
+                        if column == row.get('detection_type') and row.get('detection') == 1:
+                            html += f"<td style='background-color: yellow;'>{value}</td>\n"
+                        else:
+                            html += f"<td>{value}</td>\n"
                 html += "</tr>\n"
                 counter += 1
             html += "</table>\n"
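
A self-contained sketch of the highlighting rule above: a cell is painted yellow only when its column name equals the row's detection_type and the row is flagged as a detection entry. The row data is illustrative:

    def render_cell(column, row):
        value = row.get(column, '')
        if column == row.get('detection_type') and row.get('detection') == 1:
            return f"<td style='background-color: yellow;'>{value}</td>\n"
        return f"<td>{value}</td>\n"

    row = {'md5-5000': 'abc123', 'detection_type': 'md5-5000', 'detection': 1}
    assert "yellow" in render_cell('md5-5000', row)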


Commit: 09d4d06c779f0969f14cfa3e652a3a48b9993dfc
    https://github.com/scummvm/scummvm-sites/commit/09d4d06c779f0969f14cfa3e652a3a48b9993dfc
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-15T20:47:34+08:00

Commit Message:
INTEGRITY: Add sorting to the fileset details

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index a46e33b..e53e381 100644
--- a/fileset.py
+++ b/fileset.py
@@ -159,7 +159,22 @@ def fileset():
             # Table
             html += "<table>\n"
 
-            cursor.execute(f"SELECT file.id, name, size, checksum, detection, detection_type, `timestamp` FROM file WHERE fileset = {id}")
+            sort = request.args.get('sort')
+            order = ''
+            md5_columns = ['md5-t-5000', 'md5-0', 'md5-5000', 'md5-1M']
+            share_columns = ['name', 'size', 'checksum', 'detection', 'detection_type', 'timestamp']
+
+            if sort:
+                column = sort.split('-')[0]
+                valid_columns = share_columns + md5_columns
+                print(column, valid_columns)
+                if column in valid_columns:
+                    order = f"ORDER BY {column}"
+                    if 'desc' in sort:
+                        order += " DESC"
+            columns_to_select = "file.id, name, size, checksum, detection, detection_type, `timestamp`"
+            columns_to_select += ", ".join(md5_columns)
+            cursor.execute(f"SELECT file.id, name, size, checksum, detection, detection_type, `timestamp` FROM file WHERE fileset = {id} {order}")
             result = cursor.fetchall()
 
             all_columns = list(result[0].keys()) if result else []
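
The sort handling above validates the user-supplied column against a whitelist before interpolating it into ORDER BY, which is the only way to "parameterize" an identifier in SQL. A condensed sketch of the same logic:

    def build_order_clause(sort, valid_columns):
        if not sort:
            return ''
        column = sort.split('-')[0]
        if column not in valid_columns:
            return ''          # unknown column: ignore rather than interpolate
        order = f"ORDER BY {column}"
        if 'desc' in sort:
            order += " DESC"
        return order

    assert build_order_clause('size-desc', ['name', 'size']) == "ORDER BY size DESC"
    assert build_order_clause('evil;drop', ['name', 'size']) == ''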


Commit: 337d92db26676e9fe688716801669998f84e35e6
    https://github.com/scummvm/scummvm-sites/commit/337d92db26676e9fe688716801669998f84e35e6
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-15T20:48:24+08:00

Commit Message:
INTEGRITY: Add checkbox next to each file

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index e53e381..5cead6f 100644
--- a/fileset.py
+++ b/fileset.py
@@ -144,10 +144,17 @@ def fileset():
 
             # Files in the fileset
             html += "<h3>Files in the fileset</h3>"
-            html += "<form>"
+            # delete button
+            html += "<form method='POST'>"
+            html += "<input type='hidden' name='delete' value='true' />"
+            html += "<input type='submit' value='Delete Selected Files' />"
+            html += "<table>\n"
+
+            # Hidden inputs for preserving other parameters
             for k, v in request.args.items():
                 if k != 'widetable':
                     html += f"<input type='hidden' name='{k}' value='{v}'>"
+
             if widetable == 'true':
                 html += "<input class='hidden' type='text' name='widetable' value='false' />"
                 html += "<input type='submit' value='Hide extra checksums' />"
@@ -203,15 +210,23 @@ def fileset():
             # Generate table header
             html += "<tr>\n"
             html += "<th/>"  # Numbering column
-            for column in all_columns:
-                if column != 'id':
-                    html += f"<th>{column}</th>\n"
+            html += "<th>Select</th>\n"  # New column for selecting files
+            sortable_columns = share_columns + list(temp_set)
+
+            for column in sortable_columns:
+                if column not in ['id']:
+                    vars = "&".join([f"{k}={v}" for k, v in request.args.items() if k != 'sort'])
+                    sort_link = f"{column}"
+                    if sort == column:
+                        sort_link += "-desc"
+                    html += f"<th><a href='/fileset?id={id}&{vars}&sort={sort_link}'>{column}</a></th>\n"
             html += "</tr>\n"
 
             # Generate table rows
             for row in result:
                 html += "<tr>\n"
                 html += f"<td>{counter}.</td>\n"
+                html += f"<td><input type='checkbox' name='file_ids' value='{row['id']}' /></td>\n"  # Checkbox for selecting file
                 for column in all_columns:
                     if column != 'id':
                         value = row.get(column, '')
@@ -221,7 +236,10 @@ def fileset():
                             html += f"<td>{value}</td>\n"
                 html += "</tr>\n"
                 counter += 1
+
             html += "</table>\n"
+            html += "<input type='submit' value='Delete Selected Files' />"
+            html += "</form>\n"
 
             # Generate the HTML for the developer actions
             html += "<h3>Developer Actions</h3>"
@@ -896,6 +914,20 @@ def fileset_search():
     }
     return render_template_string(create_page(filename, 25, records_table, select_query, order, filters, mapping))
 
+@app.route('/delete_files', methods=['POST'])
+def delete_files():
+    file_ids = request.form.getlist('file_ids')
+    if file_ids:
+        # Convert the list to comma-separated string for SQL
+        ids_to_delete = ",".join(file_ids)
+        connection = db_connect()
+        with connection.cursor() as cursor:
+            # SQL statements to delete related records
+            cursor.execute(f"DELETE FROM filechecksum WHERE file IN ({ids_to_delete})")
+            cursor.execute(f"DELETE FROM file WHERE id IN ({ids_to_delete})")
+
+            # Commit the deletions
+            connection.commit()
 
 if __name__ == '__main__':
     app.secret_key = secret_key
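
The deletion above interpolates raw form values into the IN (...) clause. A parameterized sketch of the same operation, assuming pymysql (the helper name is hypothetical):

    def delete_files_safely(connection, file_ids):
        if not file_ids:
            return
        ids = [int(i) for i in file_ids]  # reject non-numeric ids early
        placeholders = ",".join(["%s"] * len(ids))
        with connection.cursor() as cursor:
            cursor.execute(f"DELETE FROM filechecksum WHERE file IN ({placeholders})", ids)
            cursor.execute(f"DELETE FROM file WHERE id IN ({placeholders})", ids)
        connection.commit()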


Commit: 9e062fb2278829441c21f4ad464688954ce00c3f
    https://github.com/scummvm/scummvm-sites/commit/9e062fb2278829441c21f4ad464688954ce00c3f
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-15T20:57:34+08:00

Commit Message:
INTEGRITY: Add checkbox next to each file

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index e53e381..8c4de68 100644
--- a/fileset.py
+++ b/fileset.py
@@ -167,13 +167,14 @@ def fileset():
             if sort:
                 column = sort.split('-')[0]
                 valid_columns = share_columns + md5_columns
-                print(column, valid_columns)
                 if column in valid_columns:
                     order = f"ORDER BY {column}"
                     if 'desc' in sort:
                         order += " DESC"
+
             columns_to_select = "file.id, name, size, checksum, detection, detection_type, `timestamp`"
             columns_to_select += ", ".join(md5_columns)
+            print(f"SELECT file.id, name, size, checksum, detection, detection_type, `timestamp` FROM file WHERE fileset = {id} {order}")
             cursor.execute(f"SELECT file.id, name, size, checksum, detection, detection_type, `timestamp` FROM file WHERE fileset = {id} {order}")
             result = cursor.fetchall()
 
@@ -203,15 +204,23 @@ def fileset():
             # Generate table header
             html += "<tr>\n"
             html += "<th/>"  # Numbering column
-            for column in all_columns:
+            html += "<th>Select</th>"  # Checkbox column
+            sortable_columns = share_columns + list(temp_set)
+
+            for column in sortable_columns:
                 if column != 'id':
-                    html += f"<th>{column}</th>\n"
+                    vars = "&".join([f"{k}={v}" for k, v in request.args.items() if k != 'sort'])
+                    sort_link = column
+                    if sort == column:
+                        sort_link += "-desc"
+                    html += f"<th><a href='/fileset?id={id}&{vars}&sort={sort_link}'>{column}</a></th>\n"
             html += "</tr>\n"
 
             # Generate table rows
             for row in result:
                 html += "<tr>\n"
                 html += f"<td>{counter}.</td>\n"
+                html += f"<td><input type='checkbox' name='files_to_delete' value='{row['id']}' /></td>\n"  # Checkbox for deletion
                 for column in all_columns:
                     if column != 'id':
                         value = row.get(column, '')
@@ -896,6 +905,20 @@ def fileset_search():
     }
     return render_template_string(create_page(filename, 25, records_table, select_query, order, filters, mapping))
 
+@app.route('/delete_files', methods=['POST'])
+def delete_files():
+    file_ids = request.form.getlist('file_ids')
+    if file_ids:
+        # Convert the list to comma-separated string for SQL
+        ids_to_delete = ",".join(file_ids)
+        connection = db_connect()
+        with connection.cursor() as cursor:
+            # SQL statements to delete related records
+            cursor.execute(f"DELETE FROM filechecksum WHERE file IN ({ids_to_delete})")
+            cursor.execute(f"DELETE FROM file WHERE id IN ({ids_to_delete})")
+
+            # Commit the deletions
+            connection.commit()
 
 if __name__ == '__main__':
     app.secret_key = secret_key


Commit: 895b393b045b9123de9f29b26293c7d0f13d14b2
    https://github.com/scummvm/scummvm-sites/commit/895b393b045b9123de9f29b26293c7d0f13d14b2
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-15T21:01:17+08:00

Commit Message:
INTEGRITY: Fix bugs of widetable

Changed paths:
    fileset.py




Commit: 7b1ffb090c3357509fb187706a684846a1b0defa
    https://github.com/scummvm/scummvm-sites/commit/7b1ffb090c3357509fb187706a684846a1b0defa
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-15T21:06:50+08:00

Commit Message:
INTEGRITY: Remove the delete button

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 4bb546b..37a28b6 100644
--- a/fileset.py
+++ b/fileset.py
@@ -144,17 +144,10 @@ def fileset():
 
             # Files in the fileset
             html += "<h3>Files in the fileset</h3>"
-            # delete button
-            html += "<form method='POST'>"
-            html += "<input type='hidden' name='delete' value='true' />"
-            html += "<input type='submit' value='Delete Selected Files' />"
-            html += "<table>\n"
-
-            # Hidden inputs for preserving other parameters
+            html += "<form>"
             for k, v in request.args.items():
                 if k != 'widetable':
                     html += f"<input type='hidden' name='{k}' value='{v}'>"
-
             if widetable == 'true':
                 html += "<input class='hidden' type='text' name='widetable' value='false' />"
                 html += "<input type='submit' value='Hide extra checksums' />"
@@ -215,9 +208,9 @@ def fileset():
             sortable_columns = share_columns + list(temp_set)
 
             for column in sortable_columns:
-                if column != 'id':
+                if column not in ['id']:
                     vars = "&".join([f"{k}={v}" for k, v in request.args.items() if k != 'sort'])
-                    sort_link = column
+                    sort_link = f"{column}"
                     if sort == column:
                         sort_link += "-desc"
                     html += f"<th><a href='/fileset?id={id}&{vars}&sort={sort_link}'>{column}</a></th>\n"
@@ -227,7 +220,7 @@ def fileset():
             for row in result:
                 html += "<tr>\n"
                 html += f"<td>{counter}.</td>\n"
-                html += f"<td><input type='checkbox' name='files_to_delete' value='{row['id']}' /></td>\n"  # Checkbox for deletion
+                html += f"<td><input type='checkbox' name='file_ids' value='{row['id']}' /></td>\n"  # Checkbox for selecting file
                 for column in all_columns:
                     if column != 'id':
                         value = row.get(column, '')


Commit: ca0de3904e654f8dd37d5df9e569c9955fd9f1d5
    https://github.com/scummvm/scummvm-sites/commit/ca0de3904e654f8dd37d5df9e569c9955fd9f1d5
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-16T19:53:13+08:00

Commit Message:
INTEGRITY: Fix the delete func of fileset

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 37a28b6..9b33dfa 100644
--- a/fileset.py
+++ b/fileset.py
@@ -156,6 +156,7 @@ def fileset():
                 html += "<input type='submit' value='Expand Table' />"
             html += "</form>"
 
+            html += f"""<form method="POST" action="{url_for('delete_files', id=id)}">"""
             # Table
             html += "<table>\n"
 
@@ -908,8 +909,8 @@ def fileset_search():
     }
     return render_template_string(create_page(filename, 25, records_table, select_query, order, filters, mapping))
 
-@app.route('/delete_files', methods=['POST'])
-def delete_files():
+@app.route('/delete_files/<int:id>', methods=['POST'])
+def delete_files(id):
     file_ids = request.form.getlist('file_ids')
     if file_ids:
         # Convert the list to comma-separated string for SQL
@@ -922,6 +923,7 @@ def delete_files():
 
             # Commit the deletions
             connection.commit()
+    return redirect(url_for('fileset', id=id))
 
 if __name__ == '__main__':
     app.secret_key = secret_key
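
The redirect added here completes a post/redirect/get cycle: once the POST succeeds, the browser lands back on the fileset page, so a refresh re-issues a harmless GET instead of repeating the deletion. The shape, as a sketch with Flask imports assumed:

    from flask import redirect, url_for

    def finish_delete(id):
        # Mirrors the `return redirect(...)` added above; 'fileset' is the
        # existing view function for /fileset.
        return redirect(url_for('fileset', id=id))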


Commit: 0e5b7c458f0526d545374b97cec6f6c561bf91a9
    https://github.com/scummvm/scummvm-sites/commit/0e5b7c458f0526d545374b97cec6f6c561bf91a9
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-17T19:45:09+08:00

Commit Message:
INTEGRITY: Delete original fileset after merging

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index e9c816d..5723095 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -676,20 +676,28 @@ def handle_matched_filesets(fileset_last, matched_map, fileset, conn, detection,
                 update_fileset_status(cursor, matched_fileset_id, 'full' if src != "dat" else "partial")
                 populate_file(fileset, matched_fileset_id, conn, detection)
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'full' if src != "dat" else "partial", user, conn)
+                delete_original_fileset(fileset_last, conn)
             elif status == 'full' and len(fileset['rom']) == count:
                 is_full_matched = True
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'full', user, conn)
+                delete_original_fileset(fileset_last, conn)
                 return
             elif (status == 'partial') and count == len(matched_count):
                 is_full_matched = True
                 update_fileset_status(cursor, matched_fileset_id, 'full')
                 populate_file(fileset, matched_fileset_id, conn, detection)
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'full', user, conn)
+                delete_original_fileset(fileset_last, conn)
             elif status == 'scan' and count == len(matched_count):
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'full', user, conn)
             elif src == 'dat':
                 log_matched_fileset(src, fileset_last, matched_fileset_id, 'partial matched', user, conn)
 
+def delete_original_fileset(fileset_id, conn):
+    with conn.cursor() as cursor:
+        cursor.execute(f"DELETE FROM file WHERE fileset = {fileset_id}")
+        cursor.execute(f"DELETE FROM fileset WHERE id = {fileset_id}")
+        
 def update_fileset_status(cursor, fileset_id, status):
     cursor.execute(f"""
         UPDATE fileset SET 
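
Deletion order matters here: child rows must go before the parent. A slightly fuller sketch, assuming the filechecksum -> file -> fileset relationship used elsewhere in this codebase, would clear the checksum rows too:

    def delete_original_fileset_full(fileset_id, conn):
        with conn.cursor() as cursor:
            # checksums reference files, files reference the fileset
            cursor.execute(
                "DELETE FROM filechecksum WHERE file IN "
                "(SELECT id FROM file WHERE fileset = %s)", (fileset_id,))
            cursor.execute("DELETE FROM file WHERE fileset = %s", (fileset_id,))
            cursor.execute("DELETE FROM fileset WHERE id = %s", (fileset_id,))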


Commit: 572f8fd3d4ecc940c8252581fd3dc91ede22aa98
    https://github.com/scummvm/scummvm-sites/commit/572f8fd3d4ecc940c8252581fd3dc91ede22aa98
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-18T20:13:51+08:00

Commit Message:
INTEGRITY: Change the text of widetable

Changed paths:
    fileset.py


diff --git a/fileset.py b/fileset.py
index 9b33dfa..497a9b6 100644
--- a/fileset.py
+++ b/fileset.py
@@ -58,7 +58,7 @@ def index():
 @app.route('/fileset', methods=['GET', 'POST'])
 def fileset():
     id = request.args.get('id', default=1, type=int)
-    widetable = request.args.get('widetable', default='false', type=str)
+    widetable = request.args.get('widetable', default='partial', type=str)
     # Load MySQL credentials from a JSON file
     base_dir = os.path.dirname(os.path.abspath(__file__))
     config_path = os.path.join(base_dir, 'mysql_config.json')
@@ -148,12 +148,12 @@ def fileset():
             for k, v in request.args.items():
                 if k != 'widetable':
                     html += f"<input type='hidden' name='{k}' value='{v}'>"
-            if widetable == 'true':
-                html += "<input class='hidden' type='text' name='widetable' value='false' />"
-                html += "<input type='submit' value='Hide extra checksums' />"
-            else:
-                html += "<input class='hidden' type='text' name='widetable' value='true' />"
+            if widetable == 'partial':
+                html += "<input class='hidden' name='widetable' value='full' />"
                 html += "<input type='submit' value='Expand Table' />"
+            else:
+                html += "<input class='hidden' name='widetable' value='partial' />"
+                html += "<input type='submit' value='Hide extra checksums' />"
             html += "</form>"
 
             html += f"""<form method="POST" action="{url_for('delete_files', id=id)}">"""
@@ -182,7 +182,7 @@ def fileset():
             all_columns = list(result[0].keys()) if result else []
             temp_set = set()
 
-            if widetable == 'true':
+            if widetable == 'full':
                 file_ids = [file['id'] for file in result]
                 cursor.execute(f"SELECT file, checksum, checksize, checktype FROM filechecksum WHERE file IN ({','.join(map(str, file_ids))})")
                 checksums = cursor.fetchall()
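
When widetable == 'full', the extra checksums are pulled in a single IN (...) query keyed by the file ids already on the page. A parameterized sketch of that fetch (the helper name is illustrative):

    def fetch_extra_checksums(cursor, result):
        file_ids = [file['id'] for file in result]
        if not file_ids:
            return []
        placeholders = ",".join(["%s"] * len(file_ids))
        cursor.execute(
            "SELECT file, checksum, checksize, checktype "
            f"FROM filechecksum WHERE file IN ({placeholders})", file_ids)
        return cursor.fetchall()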


Commit: d2a7f99c749f750538c04c2e5cd45e5a11f0aebb
    https://github.com/scummvm/scummvm-sites/commit/d2a7f99c749f750538c04c2e5cd45e5a11f0aebb
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-19T19:00:16+08:00

Commit Message:
INTEGRITY: Improve the connection of history search

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 5723095..573003d 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -224,15 +224,25 @@ def get_all_related_filesets(fileset_id, conn, visited=None):
     visited.add(fileset_id)
 
     related_filesets = [fileset_id]
-    with conn.cursor() as cursor:
-        cursor.execute(f"SELECT fileset, oldfileset FROM history WHERE fileset = {fileset_id} OR oldfileset = {fileset_id}")
-        history_records = cursor.fetchall()
-
-    for record in history_records:
-        if record['fileset'] not in visited:
-            related_filesets.extend(get_all_related_filesets(record['fileset'], conn, visited))
-        if record['oldfileset'] not in visited:
-            related_filesets.extend(get_all_related_filesets(record['oldfileset'], conn, visited))
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute(f"SELECT fileset, oldfileset FROM history WHERE fileset = {fileset_id} OR oldfileset = {fileset_id}")
+            history_records = cursor.fetchall()
+
+        for record in history_records:
+            if record['fileset'] not in visited:
+                related_filesets.extend(get_all_related_filesets(record['fileset'], conn, visited))
+            if record['oldfileset'] not in visited:
+                related_filesets.extend(get_all_related_filesets(record['oldfileset'], conn, visited))
+    except pymysql.err.InterfaceError:
+            print("Connection lost, reconnecting...")
+            try:
+                conn = db_connect()  # Reconnect if the connection is lost
+            except Exception as e:
+                print(f"Failed to reconnect: {e}")
+                
+    except Exception as e:
+        print(f"Error fetching related filesets: {e}")
 
     return related_filesets
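
One caveat with the reconnect above: rebinding the local conn after an InterfaceError does not re-run the query that failed, so the current pass still returns no rows. A sketch with an explicit retry, assuming the module's db_connect() helper:

    import pymysql

    def fetch_history(conn, fileset_id, retries=1):
        for attempt in range(retries + 1):
            try:
                with conn.cursor() as cursor:
                    cursor.execute(
                        "SELECT fileset, oldfileset FROM history "
                        "WHERE fileset = %s OR oldfileset = %s",
                        (fileset_id, fileset_id))
                    return cursor.fetchall()
            except pymysql.err.InterfaceError:
                if attempt == retries:
                    raise
                conn = db_connect()  # reconnect, then retry the query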
 


Commit: 420c9cf6fa754083643046e56689389ed4da544b
    https://github.com/scummvm/scummvm-sites/commit/420c9cf6fa754083643046e56689389ed4da544b
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-22T18:03:45+08:00

Commit Message:
INTEGRITY: Update year in README

Changed paths:
    README.md


diff --git a/README.md b/README.md
index 3fd7df8..901fc1d 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
-# ScummVM File Integrity Check (GSoC 2023)
+# ScummVM File Integrity Check (GSoC 2024)
 
-This repository contains the server-side code for the upcoming file integrity check for game datafiles. This repository is part of the Google Summer of Code 2023 program.
+This repository contains the server-side code for the upcoming file integrity check for game datafiles. This repository is part of the Google Summer of Code 2024 program.
 
 This website needs a `mysql_config.json` in the root to run, in the form:
 


Commit: 3f0f18d533248fbae754573ee4bedfbb40e60de1
    https://github.com/scummvm/scummvm-sites/commit/3f0f18d533248fbae754573ee4bedfbb40e60de1
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-31T21:11:26+08:00

Commit Message:
INTEGRITY: Add punycode column

Changed paths:
    schema.py


diff --git a/schema.py b/schema.py
index a94b81e..8e75085 100644
--- a/schema.py
+++ b/schema.py
@@ -168,6 +168,16 @@ try:
 except:
     # if already exists, change the length of the column
     cursor.execute("ALTER TABLE fileset MODIFY COLUMN `user_count` INT;")
+    
+try:
+    cursor.execute("ALTER TABLE file ADD COLUMN punycode_name VARCHAR(200);")
+except:
+    cursor.execute("ALTER TABLE file MODIFY COLUMN punycode_name VARCHAR(200);")
+    
+try:
+    cursor.execute("ALTER TABLE file ADD COLUMN encoding_type VARCHAR(20) DEFAULT 'UTF-8';")
+except:
+    cursor.execute("ALTER TABLE file MODIFY COLUMN encoding_type VARCHAR(20) DEFAULT 'UTF-8';")
 
 for index, definition in indices.items():
     try:
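
The migration idiom used in schema.py, as a reusable sketch: try ADD COLUMN, and fall back to MODIFY when the column already exists. Catching pymysql's base error class is slightly stricter than the bare except used above; the helper itself is hypothetical:

    import pymysql

    def ensure_column(cursor, table, column, definition):
        try:
            cursor.execute(f"ALTER TABLE {table} ADD COLUMN {column} {definition};")
        except pymysql.err.MySQLError:
            # column already exists; make sure its definition matches
            cursor.execute(f"ALTER TABLE {table} MODIFY COLUMN {column} {definition};")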


Commit: 066e615be5450ac161f25ec2a0758c75d6f3a992
    https://github.com/scummvm/scummvm-sites/commit/066e615be5450ac161f25ec2a0758c75d6f3a992
Author: InariInDream (inariindream at 163.com)
Date: 2024-08-31T21:11:49+08:00

Commit Message:
INTEGRITY: Add punycode_need_encode func

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index 573003d..e5eaa7d 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -157,7 +157,10 @@ def insert_file(file, detection, src, conn):
         checktype = "None"
         detection = 0
     detection_type = f"{checktype}-{checksize}" if checktype != "None" else f"{checktype}"
-    query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{detection_type}', NOW())"
+    if punycode_need_encode(escape_string(file['name'])):
+        query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{encode_punycode(escape_string(file['name']))}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{detection_type}', NOW())"
+    else:
+        query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{detection_type}', NOW())"
     with conn.cursor() as cursor:
         cursor.execute(query)
 
@@ -182,7 +185,28 @@ def delete_filesets(conn):
     query = "DELETE FROM fileset WHERE `delete` = TRUE"
     with conn.cursor() as cursor:
         cursor.execute(query)
+        
+def encode_punycode(src):
+    pass
+
+def punycode_need_encode(src):
+    if not src:
+        return False
+
+    SPECIAL_SYMBOLS = "/\":*|\\?%<>\x7f"
+
+    for char in src:
+        if ord(char) >= 0x80:
+            return True
+        if ord(char) < 0x20:
+            return True
+        if char in SPECIAL_SYMBOLS:
+            return True
+
+    if src[-1] == ' ' or src[-1] == '.':
+        return True
 
+    return False
 
 def create_log(category, user, text, conn):
     query = f"INSERT INTO log (`timestamp`, category, user, `text`) VALUES (FROM_UNIXTIME({int(time.time())}), '{escape_string(category)}', '{escape_string(user)}', '{escape_string(text)}')"


Commit: 5a09b11723fdf49cd1584fa1d628029bb184fd1f
    https://github.com/scummvm/scummvm-sites/commit/5a09b11723fdf49cd1584fa1d628029bb184fd1f
Author: InariInDream (inariindream at 163.com)
Date: 2024-09-01T23:24:04+08:00

Commit Message:
INTEGRITY: Add encode_punycode func

Changed paths:
    db_functions.py


diff --git a/db_functions.py b/db_functions.py
index e5eaa7d..4cbe1e9 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -157,8 +157,9 @@ def insert_file(file, detection, src, conn):
         checktype = "None"
         detection = 0
     detection_type = f"{checktype}-{checksize}" if checktype != "None" else f"{checktype}"
-    if punycode_need_encode(escape_string(file['name'])):
-        query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{encode_punycode(escape_string(file['name']))}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{detection_type}', NOW())"
+    if punycode_need_encode(file['name']):
+        print(encode_punycode(file['name']))
+        query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{encode_punycode(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{detection_type}', NOW())"
     else:
         query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{detection_type}', NOW())"
     with conn.cursor() as cursor:
@@ -186,26 +187,55 @@ def delete_filesets(conn):
     with conn.cursor() as cursor:
         cursor.execute(query)
         
-def encode_punycode(src):
-    pass
-
-def punycode_need_encode(src):
-    if not src:
-        return False
-
-    SPECIAL_SYMBOLS = "/\":*|\\?%<>\x7f"
-
-    for char in src:
-        if ord(char) >= 0x80:
-            return True
-        if ord(char) < 0x20:
-            return True
-        if char in SPECIAL_SYMBOLS:
-            return True
+def my_escape_string(s: str) -> str:
+    """
+    Escape strings
+
+    Escape the following:
+    - escape char: \x81
+    - unallowed filename chars: https://en.wikipedia.org/wiki/Filename#Reserved_characters_and_words
+    - control chars < 0x20
+    """
+    new_name = ""
+    for char in s:
+        if char == "\x81":
+            new_name += "\x81\x79"
+        elif char in '/":*|\\?%<>\x7f' or ord(char) < 0x20:
+            new_name += "\x81" + chr(0x80 + ord(char))
+        else:
+            new_name += char
+    return new_name
 
-    if src[-1] == ' ' or src[-1] == '.':
+        
+def encode_punycode(orig):
+    """
+    Punyencode strings
+
+    - escape special characters and
+    - ensure filenames can't end in a space or dot
+    """
+    s = my_escape_string(orig)
+    encoded = s.encode("punycode").decode("ascii")
+    # punyencoding adds an '-' at the end when there are no special chars
+    # don't use it for comparing
+    compare = encoded
+    if encoded.endswith("-"):
+        compare = encoded[:-1]
+    if orig != compare or compare[-1] in " .":
+        return "xn--" + encoded
+    return orig
+
+def punycode_need_encode(orig):
+    """
+    A filename needs to be punyencoded when it:
+
+    - contains a char that should be escaped or
+    - ends with a dot or a space.
+    """
+    if orig != escape_string(orig):
+        return True
+    if orig[-1] in " .":
         return True
-
     return False
 
 def create_log(category, user, text, conn):
@@ -753,7 +783,26 @@ def populate_file(fileset, fileset_id, conn, detection):
                 target_files_dict[target_file['id']] = f"{checksum['checktype']}-{checksum['checksize']}"
         for file in fileset['rom']:
             file_exists = False
-            cursor.execute(f"INSERT INTO file (name, size, checksum, fileset, detection, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{file['md5'] if file.get('md5') is not None else 'None'}', {fileset_id}, {0}, NOW())")
+            checksum = ""
+            checksize = 5000
+            checktype = "None"
+            if "md5" in file:
+                checksum = file["md5"]
+            else:
+                for key, value in file.items():
+                    if "md5" in key:
+                        checksize, checktype, checksum = get_checksum_props(key, value)
+                        break
+            if not detection:
+                checktype = "None"
+                detection = 0
+            detection_type = f"{checktype}-{checksize}" if checktype != "None" else f"{checktype}"
+            if punycode_need_encode(file['name']):
+                print(encode_punycode(file['name']))
+                query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{encode_punycode(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{detection_type}', NOW())"
+            else:
+                query = f"INSERT INTO file (name, size, checksum, fileset, detection, detection_type, `timestamp`) VALUES ('{escape_string(file['name'])}', '{file['size']}', '{checksum}', @fileset_last, {detection}, '{detection_type}', NOW())"
+            cursor.execute(query)
             cursor.execute("SET @file_last = LAST_INSERT_ID()")
             cursor.execute("SELECT @file_last AS file_id")
             file_id = cursor.fetchone()['file_id']
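
A round-trip illustration of encode_punycode as defined above: clean names come back unchanged, while names with special characters or forbidden trailing characters gain the standard "xn--" prefix (examples, not tests from the repo):

    assert encode_punycode("plain.dat") == "plain.dat"
    assert encode_punycode("name.").startswith("xn--")        # trailing dot
    assert encode_punycode("fran\xe7ais").startswith("xn--")  # non-ASCII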


Commit: 6a4f0e68ff608569a2edbf5edd63526c3cf41416
    https://github.com/scummvm/scummvm-sites/commit/6a4f0e68ff608569a2edbf5edd63526c3cf41416
Author: InariInDream (inariindream at 163.com)
Date: 2024-09-02T00:41:07+08:00

Commit Message:
INTEGRITY: Improve the check of non-ASCII

Changed paths:
    compute_hash.py
    db_functions.py


diff --git a/compute_hash.py b/compute_hash.py
index db3b793..e5e723f 100644
--- a/compute_hash.py
+++ b/compute_hash.py
@@ -79,7 +79,7 @@ def escape_string(s: str) -> str:
     for char in s:
         if char == "\x81":
             new_name += "\x81\x79"
-        elif char in '/":*|\\?%<>\x7f' or ord(char) < 0x20:
+        elif char in '/":*|\\?%<>\x7f' or ord(char) < 0x20 or (ord(char) & 0x80):
             new_name += "\x81" + chr(0x80 + ord(char))
         else:
             new_name += char
diff --git a/db_functions.py b/db_functions.py
index 4cbe1e9..99245df 100644
--- a/db_functions.py
+++ b/db_functions.py
@@ -200,7 +200,7 @@ def my_escape_string(s: str) -> str:
     for char in s:
         if char == "\x81":
             new_name += "\x81\x79"
-        elif char in '/":*|\\?%<>\x7f' or ord(char) < 0x20:
+        elif char in '/":*|\\?%<>\x7f' or ord(char) < 0x20 or (ord(char) & 0x80):
             new_name += "\x81" + chr(0x80 + ord(char))
         else:
             new_name += char
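
A note on the new test: (ord(char) & 0x80) checks bit 7, which catches the 0x80-0xFF range but can miss higher codepoints whose bit 7 happens to be clear; a plain ord(char) >= 0x80 comparison would cover all non-ASCII characters:

    assert (ord('é') & 0x80) != 0       # U+00E9: caught
    assert (ord('\u0100') & 0x80) == 0  # U+0100: bit 7 clear, not caught
    assert ord('\u0100') >= 0x80        # the broader check still flags it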


Commit: 96f19b176bf8c29ee1e6df13e391e7bdf264726f
    https://github.com/scummvm/scummvm-sites/commit/96f19b176bf8c29ee1e6df13e391e7bdf264726f
Author: Eugene Sandulenko (sev at scummvm.org)
Date: 2024-11-07T13:38:33+01:00

Commit Message:
Merge branch 'InariInDream-integrity' into integrity

Changed paths:
  A clear.py
  A dat_parser.py
  A db_functions.py
  A fileset.py
  A megadata.py
  A pagination.py
  A schema.py
  A static/style.css
  A user_fileset_functions.py
  R bin/dat_parser.php
  R bin/schema.php
  R bin/seeds.php
  R endpoints/validate.php
  R fileset.php
  R games_list.php
  R include/db_functions.php
  R include/pagination.php
  R include/user_fileset_functions.php
  R index.php
  R logs.php
  R mod_actions.php
  R user_games_list.php
    README.md
    apache2-config/gamesdb.sev.zone.conf
    compute_hash.py





