Move export cache cleaning into cron jobs #8804
base: develop
Conversation
Walkthrough

The pull request introduces a comprehensive refactoring of the export and cache management system in the CVAT application. The changes span multiple files and focus on improving the modularity, error handling, and organization of export-related functionality. Key modifications include introducing new classes for file type management, restructuring export cache handling, adding periodic cleanup jobs for export caches, and enhancing the overall file management process across projects, tasks, and jobs.
b885dac to f20eadf
f20eadf to 92f9d0b
/check
❌ Some checks failed
14084b8 to e4c24ea
/check
✔️ All checks completed successfully
Quality Gate passed
Actionable comments posted: 2
🧹 Nitpick comments (10)
cvat/apps/engine/cron.py (3)
35-55: Validate parsed file existence before locking

The clear_export_cache function parses the file path and removes the file after acquiring a lock. Consider validating the path and checking that the file still exists before attempting to parse or lock it, in case of unexpected filesystem changes. That way the parsing and lock-acquisition overhead is skipped when the file has already been removed or is inaccessible.
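A minimal sketch of that ordering; parse_export_file_path, get_export_cache_lock, and is_outdated are stand-ins for whatever helpers this PR actually exposes:

import os
import os.path as osp

def clear_export_cache(file_path: str) -> None:
    # Cheap pre-check: skip parsing and lock acquisition when the file
    # has already been removed by another worker.
    if not osp.isfile(file_path):
        return

    parsed = parse_export_file_path(file_path)  # assumed helper

    with get_export_cache_lock(file_path, ttl=30, acquire_timeout=10):  # assumed helper
        # Re-check inside the lock: the file may have disappeared between
        # the pre-check and lock acquisition.
        if osp.isfile(file_path) and is_outdated(parsed):  # assumed helper
            os.remove(file_path)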
63-71: Optimize the model class derivation check

When calling cron_export_cache_cleanup, the code asserts that the model is one of (Project, Task, Job). If additional models need similar cleanup in the future, they must be added here as well. Consider registering models dynamically, or passing the model class directly instead of a string path, for better flexibility and maintainability.
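One possible direction, sketched with hypothetical names rather than the actual cron job signature:

from typing import Dict, Type

# Models opt in via a registry instead of being hard-coded in an assert.
_CLEANUP_MODELS: Dict[str, Type] = {}

def register_export_cache_cleanup(model_cls: Type) -> Type:
    _CLEANUP_MODELS[model_cls.__name__.lower()] = model_cls
    return model_cls

def resolve_cleanup_model(name: str) -> Type:
    try:
        return _CLEANUP_MODELS[name.lower()]
    except KeyError:
        raise ValueError(f"No export cache cleanup registered for {name!r}") from None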
79-99: Log suppressed exceptions to aid debugging

Within the for-loop, all exceptions raised by clear_export_cache are suppressed. While it is intentional to let the remaining files continue processing, consider logging the suppressed exceptions to help diagnose unforeseen cleanup errors (e.g., permission issues or concurrency conflicts). The logging can be at a lower severity (debug or warning) to avoid noise.
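For illustration, a sketch of the loop body with the suggested logging; clear_export_cache is replaced by a plain os.remove so the snippet stands alone:

import logging
import os
from typing import List

logger = logging.getLogger(__name__)

def cleanup_export_files(file_paths: List[str]) -> None:
    for file_path in file_paths:
        try:
            os.remove(file_path)  # stand-in for clear_export_cache(file_path)
        except FileNotFoundError:
            continue  # already gone, nothing worth logging
        except Exception:
            # Keep going, but leave a trace for permission or concurrency issues.
            logger.warning("Failed to clean up export cache file %s", file_path, exc_info=True)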
cvat/apps/dataset_manager/views.py (2)
Line range hint 74-100: Document the retry mechanism

retry_current_rq_job is an important function that re-enqueues tasks when locks are unavailable. For clarity, consider adding a docstring or inline comments explaining how and why the retry logic works, including any implications for concurrency or the potential for duplicate jobs.
190-190: Examine possible repetitive retries

When the lock is not available, the code retries after a fixed interval. In a busy environment where the lock is always occupied, tasks could be re-enqueued indefinitely. Consider adding a retry limit or more nuanced scheduling to prevent that.
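A sketch of one way to bound the retries, assuming the handler can reach the current rq job; the meta key and limit are illustrative:

MAX_LOCK_RETRIES = 5

def retry_or_give_up(rq_job) -> None:
    attempts = rq_job.meta.get("lock_retries", 0)
    if attempts >= MAX_LOCK_RETRIES:
        raise RuntimeError("Export cache lock stayed busy; giving up instead of re-enqueuing")

    rq_job.meta["lock_retries"] = attempts + 1
    rq_job.save_meta()
    # ...then re-enqueue with the existing mechanism (e.g. retry_current_rq_job)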
cvat/apps/dataset_manager/util.py (1)
160-175: Consolidate filename parsing logic if possible

The classes _ParsedExportFilename, ParsedDatasetFilename, and ParsedBackupFilename are largely similar and differ only in a few attributes. Consider unifying them or using composition to avoid repeating structural logic. This reduces maintenance overhead if more file types are introduced.
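A sketch of the consolidation idea using a single dataclass with an optional format field; the field names are illustrative, not the existing ones:

from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ParsedExportFilename:
    file_type: str                     # e.g. "dataset", "annotations", "backup"
    instance_timestamp: float
    file_ext: str
    format_name: Optional[str] = None  # backups carry no format

# Backups simply omit the dataset-specific field:
parsed = ParsedExportFilename(file_type="backup", instance_timestamp=1718000000.0, file_ext="zip")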
cvat/apps/engine/models.py (1)
438-457: Improve test coverage for filesystem operations

_FileSystemRelatedModel introduces directory handling, temporary directory management, and export cache directory creation. Ensure these are covered by both unit and integration tests to confirm their behavior across diverse filesystem states and concurrency conditions.
Would you like help generating tests that mock filesystem operations?
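As a starting point, a self-contained sketch of such a test; ensure_export_cache_dir is a stand-in for the real directory helper on the model:

import os
import unittest
from unittest import mock

def ensure_export_cache_dir(path: str) -> str:
    # Stand-in for the directory handling in _FileSystemRelatedModel.
    os.makedirs(path, exist_ok=True)
    return path

class ExportCacheDirTest(unittest.TestCase):
    @mock.patch("os.makedirs")
    def test_directory_creation_without_touching_the_filesystem(self, mocked_makedirs):
        result = ensure_export_cache_dir("/data/cache/export")
        mocked_makedirs.assert_called_once_with("/data/cache/export", exist_ok=True)
        self.assertEqual(result, "/data/cache/export")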
cvat/apps/dataset_manager/tests/test_rest_api_formats.py (1)
41-41: Testing with the new clear_export_cache function.

Incorporating clear_export_cache from cron ensures the tests can directly verify the new cleanup functionality. Consider adding edge cases (e.g., when the file is locked) to strengthen test coverage.

cvat/apps/engine/background.py (2)
616-616: Use consistent path handling methods

The code mixes os.path.exists and osp.exists. For consistency, stick to one style throughout the file.

-if not os.path.exists(file_path):
+if not osp.exists(file_path):

Also applies to: 636-636
613-632: Consider refactoring duplicate lock acquisition pattern

The code acquires the same lock twice in succession for different operations. Consider refactoring to acquire the lock once and handle both the download and lifetime-extension cases within the same lock context.
with dm.util.get_export_cache_lock(
    file_path, ttl=LOCK_TTL, acquire_timeout=LOCK_ACQUIRE_TIMEOUT
):
    if not osp.exists(file_path):
        return Response(
            "The backup file has been expired, please retry backing up",
            status=status.HTTP_404_NOT_FOUND,
        )
+    if action == "download":
+        filename = self.export_args.filename or build_backup_file_name(
+            class_name=self.resource,
+            identifier=self.db_instance.name,
+            timestamp=timestamp,
+            extension=os.path.splitext(file_path)[1],
+        )
+
+        rq_job.delete()
+        return sendfile(
+            self.request, file_path, attachment=True, attachment_filename=filename
+        )
+    elif not is_result_outdated():
+        extend_export_file_lifetime(file_path)
+        return Response(status=status.HTTP_201_CREATED)
-    filename = self.export_args.filename or build_backup_file_name(
-        class_name=self.resource,
-        identifier=self.db_instance.name,
-        timestamp=timestamp,
-        extension=os.path.splitext(file_path)[1],
-    )
-
-    rq_job.delete()
-    return sendfile(
-        self.request, file_path, attachment=True, attachment_filename=filename
-    )
-with dm.util.get_export_cache_lock(
-    file_path, ttl=LOCK_TTL, acquire_timeout=LOCK_ACQUIRE_TIMEOUT
-):
-    if osp.exists(file_path) and not is_result_outdated():
-        extend_export_file_lifetime(file_path)
-        return Response(status=status.HTTP_201_CREATED)

Also applies to: 633-641
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (13)
- cvat/apps/dataset_manager/tests/test_rest_api_formats.py (10 hunks)
- cvat/apps/dataset_manager/util.py (5 hunks)
- cvat/apps/dataset_manager/views.py (5 hunks)
- cvat/apps/engine/background.py (4 hunks)
- cvat/apps/engine/backup.py (8 hunks)
- cvat/apps/engine/cron.py (1 hunks)
- cvat/apps/engine/default_settings.py (1 hunks)
- cvat/apps/engine/management/commands/syncperiodicjobs.py (1 hunks)
- cvat/apps/engine/migrations/0087_job_last_export_date_project_last_export_date_and_more.py (1 hunks)
- cvat/apps/engine/models.py (8 hunks)
- cvat/apps/engine/tests/test_rest_api.py (1 hunks)
- cvat/settings/base.py (1 hunks)
- dev/format_python_code.sh (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- cvat/apps/engine/tests/test_rest_api.py
🔇 Additional comments (19)
cvat/apps/dataset_manager/views.py (1)
134-134: Consider handling None cases during export timestamp updates
If “db_instance” has never been exported, “touch_last_export_date” sets “last_export_date” for the first time. This is valid, but ensure there’s no scenario where “last_export_date” was intentionally left unset (for instance, in partial or failed exports). A quick check or comment clarifying usage might help with maintainability.
cvat/apps/dataset_manager/util.py (1)
154-157: Validate usage of ExportFileType across codebase
“ExportFileType” defines distinct file types. Ensure that references to “dataset” vs. “annotations” usage are consistent throughout the code and aligned with how files are actually generated. This helps avoid confusion in log outputs, debugging, or user-facing documentation.
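For reference, the kind of definition being discussed; the values below are illustrative and should be checked against the actual enum in util.py:

from enum import Enum

class ExportFileType(str, Enum):
    ANNOTATIONS = "annotations"
    BACKUP = "backup"
    DATASET = "dataset"

# The str mixin keeps log output and filename templates readable:
assert ExportFileType.DATASET.value == "dataset"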
cvat/apps/engine/models.py (1)
435-437: Revisit potential complexity with multiple metaclasses
The “ABCModelMeta” merges “ABCMeta” and “ModelBase”. While it’s a neat approach, keep in mind Django’s future updates and potential conflicts with ABC usage. Thorough testing is recommended, especially for migrations and introspection capabilities.
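For context, the pattern under discussion looks roughly like this (names are illustrative, and it assumes a configured Django environment):

from abc import ABCMeta, abstractmethod

from django.db import models
from django.db.models.base import ModelBase

class ABCModelMeta(ABCMeta, ModelBase):
    """Metaclass combining abstract-method enforcement with Django's model machinery."""

class FileSystemRelatedModel(models.Model, metaclass=ABCModelMeta):
    class Meta:
        abstract = True

    @abstractmethod
    def get_dirname(self) -> str: ...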
cvat/apps/engine/backup.py (7)
Line range hint 316-347: Good use of an abstract base class for exporters.
This design enforces a consistent interface and helps ensure that each subclass implements a standardized export_to method.
534-534: Appropriate override of the abstract method.
The overridden export_to method in TaskExporter properly adheres to the base class interface.
1029-1039: Placement of db_instance timestamp updates is correct.
The call to touch_last_export_date followed by refresh_from_db ensures that the updated timestamp is persisted and correct before the export operation continues. This ordering also prevents inconsistencies related to the instance’s last update data.
1044-1054: Concurrency handling by acquiring the export cache lock.
Acquiring the lock and checking if the file path already exists helps avoid redundant re-export operations. This design effectively reduces race conditions around reading or writing export files.
1055-1059: Efficient use of a temporary file for intermediate export.
Writing to a temporary location and then moving it into the final path is a good practice to avoid partial or corrupted files in case of unexpected interruptions.
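As a generic illustration of that write-then-replace pattern (not the exact backup.py code):

import os
import tempfile

def write_atomically(target_path: str, data: bytes) -> None:
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(target_path) or ".", suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as tmp_file:
            tmp_file.write(data)
        # os.replace is atomic on the same filesystem, so readers never see
        # a partially written export file.
        os.replace(tmp_path, target_path)
    except BaseException:
        os.unlink(tmp_path)
        raise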
1060-1071: Double-lock mechanism for the final file write.
Locking again (on the same file path) before the file replacement is a sound practice to ensure exclusively safe updates. The logging statement also provides a clear audit trail.
1074-1083: Graceful handling of a LockNotAvailableError.
Retrying via retry_current_rq_job is a succinct and effective way to manage concurrency collisions in export jobs.
cvat/apps/dataset_manager/tests/test_rest_api_formats.py (1)
38-38: Refined import of the “export” function.
Importing “export” explicitly from the dataset_manager module clarifies where the export functionality is sourced from.
cvat/apps/engine/migrations/0087_job_last_export_date_project_last_export_date_and_more.py (1)
1-28: Migration for new last_export_date fields.
These additions look standard and properly set null=True for backward compatibility. Ensure any newly introduced code does not inadvertently require a default value.
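For reference, such a migration is typically just a set of AddField operations; the dependency name below is a placeholder, not the real predecessor of 0087:

from django.db import migrations, models

class Migration(migrations.Migration):
    dependencies = [("engine", "0086_previous_migration")]  # placeholder

    operations = [
        migrations.AddField(
            model_name="project",
            name="last_export_date",
            field=models.DateTimeField(null=True),
        ),
        # ...and the same AddField for "task" and "job"
    ]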
dev/format_python_code.sh (1)
30-30: Including cron.py in formatting targets.
Adding the new cron.py file to the formatting script is crucial to maintain consistent code style across the repository.
cvat/apps/engine/management/commands/syncperiodicjobs.py (1)
74-75: LGTM: Safe and backward-compatible enhancement to job scheduling

The addition of optional args and kwargs parameters read via .get() is well implemented and maintains compatibility with existing jobs while enabling more flexible job configurations.
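The pattern being praised, sketched against rq-scheduler's cron() API; job_definition, scheduler, and the args value are placeholders for the command's actual variables:

job_definition = {
    "func": "cvat.apps.engine.cron.cron_export_cache_cleanup",
    "cron_string": "0 0 * * *",
    "args": ("cvat.apps.engine.models.Project",),
    # no "kwargs" key: older job definitions keep working
}

scheduler.cron(
    job_definition["cron_string"],
    func=job_definition["func"],
    args=job_definition.get("args"),      # None when the key is absent
    kwargs=job_definition.get("kwargs"),  # likewise for keyword arguments
)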
cvat/apps/engine/default_settings.py (1)
94-95: LGTM: Well-defined configuration constant

The addition of EXPORT_CACHE_DIR_NAME as a constant improves maintainability by centralizing the export cache directory configuration.
cvat/settings/base.py (1)
356-369: LGTM: Well-structured periodic jobs implementation
The implementation effectively distributes cache cleanup jobs across different times to spread the load. The use of a generator expression keeps the code DRY and maintainable.
Let's verify the existence of the cron function:
✅ Verification successful
Function exists and is properly implemented
The cron function cron_export_cache_cleanup is correctly implemented in cvat/apps/engine/cron.py and is already being used in tests, confirming its functionality. The periodic jobs implementation correctly references the existing function.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify the cron function exists in the codebase
# Search for the cron function definition
ast-grep --pattern 'def cron_export_cache_cleanup'
Length of output: 52
Script:
#!/bin/bash
# Let's try a broader search using ripgrep to find any mentions of the function
rg "cron_export_cache_cleanup" -A 3
Length of output: 1429
cvat/apps/engine/background.py (3)
26-26: LGTM: Clean import addition

The new import of extend_export_file_lifetime is properly placed and consistently used across the export managers.
334-334: LGTM: Proper implementation of file lifetime extension
The implementation correctly extends the file lifetime within a lock context, ensuring thread safety. The check for file existence and outdated status before extension prevents unnecessary operations.
550-551: LGTM: Clean helper function implementation

The is_result_outdated helper function is well implemented and improves code readability.
def make_dataset_file_path(
    cls,
    cache_dir: str,
    *,
    save_images: bool,
    instance_timestamp: float,
    format_name: str,
) -> str:
    from .formats.registry import EXPORT_FORMATS

    file_ext = EXPORT_FORMATS[format_name].EXT

    file_type = ExportFileType.DATASET if save_images else ExportFileType.ANNOTATIONS

    normalized_format_name = make_file_name(to_snake_case(format_name))
    filename = cls.BASE_FILE_NAME_TEMPLATE.format_map(
        {
            "file_type": file_type,
            "instance_timestamp": instance_timestamp,
            "optional_suffix": "-" + normalized_format_name,
            "file_ext": file_ext,
        }
    )

    return osp.join(cache_dir, filename)


def parse_export_file_path(file_path: os.PathLike[str]) -> ParsedExportFilename:
    file_path = osp.normpath(file_path)
    dirname, basename = osp.split(file_path)
Provide fallback or raise error when format_name is missing
“make_dataset_file_path” fetches the file extension from “EXPORT_FORMATS[format_name].EXT”. If “format_name” is missing or not found, this raises a KeyError. Consider adding a fallback to a default extension or raising a more descriptive error to guide the user in diagnosing invalid formats.
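One way to do that, assuming EXPORT_FORMATS is the dict-like registry referenced above:

def get_export_file_ext(format_name: str) -> str:
    try:
        return EXPORT_FORMATS[format_name].EXT
    except KeyError:
        raise ValueError(
            f"Unknown export format {format_name!r}; "
            f"available formats: {', '.join(sorted(EXPORT_FORMATS))}"
        ) from None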
class _Exportable(models.Model):
    class Meta:
        abstract = True

    last_export_date = models.DateTimeField(null=True)

    def touch_last_export_date(self):
        self.last_export_date = timezone.now()
        self.save(update_fields=["last_export_date"])
🛠️ Refactor suggestion
last_export_date logic and concurrency considerations
“_Exportable” manages “last_export_date”. If two exports occur concurrently on the same entity, they might race to set the timestamp. Consider whether transactional locking or a re-check is necessary in high-volume usage scenarios. Otherwise, the “last_export_date” might be out of sync with actual export times.
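If this ever becomes a problem in practice, a narrower update avoids the read-modify-write race on a stale instance; a rough sketch:

from django.utils import timezone

def touch_last_export_date(instance) -> None:
    # Single UPDATE on the row; concurrent exports simply overwrite each
    # other's timestamp instead of racing through save() on stale objects.
    type(instance).objects.filter(pk=instance.pk).update(last_export_date=timezone.now())
    instance.refresh_from_db(fields=["last_export_date"])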
'queue': CVAT_QUEUES.CLEANING.value,
'id': f'cron_{model.lower()}_export_cache_cleanup',
'func': 'cvat.apps.engine.cron.cron_export_cache_cleanup',
# Run once a day at midnight |
This is only applicable to projects. You've configured the other jobs to run at 6:00 and noon.
That said, what's the purpose of staggering the jobs like that? Why not just have one job that runs at midnight and cleans up everything?
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

@@ Coverage Diff @@
## develop #8804 +/- ##
===========================================
+ Coverage 73.88% 73.99% +0.10%
===========================================
Files 408 409 +1
Lines 44086 44170 +84
Branches 3986 3986
===========================================
+ Hits 32575 32685 +110
+ Misses 11511 11485 -26
Motivation and context
Depends on #8721
How has this been tested?
Checklist
- I submit my changes into the develop branch
- I have increased versions of npm packages if it is necessary (cvat-canvas, cvat-core, cvat-data and cvat-ui)
License
Feel free to contact the maintainers if that's a concern.
Summary by CodeRabbit
New Features
Bug Fixes
Refactor
Documentation
Chores