diff --git a/CHANGELOG.md b/CHANGELOG.md index e8da8854734..ec880b9c61e 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,11 +1,17 @@ # Changelog -## [3.12.2-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD) +## [3.12.2-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD) [Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.12.1...HEAD) +### 📖 Documentation + +- Update website with more studios [\#3554](https://github.com/pypeclub/OpenPype/pull/3554) +- Documentation: Update publishing dev docs [\#3549](https://github.com/pypeclub/OpenPype/pull/3549) + **🚀 Enhancements** +- Maya: add additional validators to Settings [\#3540](https://github.com/pypeclub/OpenPype/pull/3540) - General: Interactive console in cli [\#3526](https://github.com/pypeclub/OpenPype/pull/3526) - Ftrack: Automatic daily review session creation can define trigger hour [\#3516](https://github.com/pypeclub/OpenPype/pull/3516) - Ftrack: add source into Note [\#3509](https://github.com/pypeclub/OpenPype/pull/3509) @@ -20,8 +26,15 @@ **🐛 Bug fixes** +- Remove invalid submodules from `/vendor` [\#3557](https://github.com/pypeclub/OpenPype/pull/3557) +- General: Remove hosts filter on integrator plugins [\#3556](https://github.com/pypeclub/OpenPype/pull/3556) +- Settings: Clean default values of environments [\#3550](https://github.com/pypeclub/OpenPype/pull/3550) +- Module interfaces: Fix import error [\#3547](https://github.com/pypeclub/OpenPype/pull/3547) +- Workfiles tool: Show of tool and it's flags [\#3539](https://github.com/pypeclub/OpenPype/pull/3539) +- General: Create workfile documents works again [\#3538](https://github.com/pypeclub/OpenPype/pull/3538) - Additional fixes for powershell scripts [\#3525](https://github.com/pypeclub/OpenPype/pull/3525) - Maya: Added wrapper around cmds.setAttr [\#3523](https://github.com/pypeclub/OpenPype/pull/3523) +- Nuke: double slate [\#3521](https://github.com/pypeclub/OpenPype/pull/3521) - General: Fix hash of centos oiio archive [\#3519](https://github.com/pypeclub/OpenPype/pull/3519) - Maya: Renderman display output fix [\#3514](https://github.com/pypeclub/OpenPype/pull/3514) - TrayPublisher: Simple creation enhancements and fixes [\#3513](https://github.com/pypeclub/OpenPype/pull/3513) @@ -31,8 +44,12 @@ **🔀 Refactored code** +- Refactor Integrate Asset [\#3530](https://github.com/pypeclub/OpenPype/pull/3530) - General: Client docstrings cleanup [\#3529](https://github.com/pypeclub/OpenPype/pull/3529) +- General: Get current context document functions [\#3522](https://github.com/pypeclub/OpenPype/pull/3522) +- Kitsu: Use query function from client [\#3496](https://github.com/pypeclub/OpenPype/pull/3496) - TimersManager: Use query functions [\#3495](https://github.com/pypeclub/OpenPype/pull/3495) +- Deadline: Use query functions [\#3466](https://github.com/pypeclub/OpenPype/pull/3466) ## [3.12.1](https://github.com/pypeclub/OpenPype/tree/3.12.1) (2022-07-13) @@ -57,7 +74,6 @@ - Windows installer: Clean old files and add version subfolder [\#3445](https://github.com/pypeclub/OpenPype/pull/3445) - Blender: Bugfix - Set fps properly on open [\#3426](https://github.com/pypeclub/OpenPype/pull/3426) - Hiero: Add custom scripts menu [\#3425](https://github.com/pypeclub/OpenPype/pull/3425) -- Blender: pre pyside install for all platforms [\#3400](https://github.com/pypeclub/OpenPype/pull/3400) **🐛 Bug fixes** @@ -95,34 +111,19 @@ [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.0-nightly.3...3.12.0) -### 📖 Documentation - -- Fix typo 
in documentation: pyenv on mac [\#3417](https://github.com/pypeclub/OpenPype/pull/3417) -- Linux: update OIIO package [\#3401](https://github.com/pypeclub/OpenPype/pull/3401) - **🚀 Enhancements** - Webserver: Added CORS middleware [\#3422](https://github.com/pypeclub/OpenPype/pull/3422) -- Attribute Defs UI: Files widget show what is allowed to drop in [\#3411](https://github.com/pypeclub/OpenPype/pull/3411) **🐛 Bug fixes** - NewPublisher: Fix subset name change on change of creator plugin [\#3420](https://github.com/pypeclub/OpenPype/pull/3420) - Bug: fix invalid avalon import [\#3418](https://github.com/pypeclub/OpenPype/pull/3418) -- Nuke: Fix keyword argument in query function [\#3414](https://github.com/pypeclub/OpenPype/pull/3414) -- Houdini: fix loading and updating vbd/bgeo sequences [\#3408](https://github.com/pypeclub/OpenPype/pull/3408) -- Nuke: Collect representation files based on Write [\#3407](https://github.com/pypeclub/OpenPype/pull/3407) -- General: Filter representations before integration start [\#3398](https://github.com/pypeclub/OpenPype/pull/3398) -- Maya: look collector typo [\#3392](https://github.com/pypeclub/OpenPype/pull/3392) **🔀 Refactored code** - Unreal: Use client query functions [\#3421](https://github.com/pypeclub/OpenPype/pull/3421) - General: Move editorial lib to pipeline [\#3419](https://github.com/pypeclub/OpenPype/pull/3419) -- Kitsu: renaming to plural func sync\_all\_projects [\#3397](https://github.com/pypeclub/OpenPype/pull/3397) -- Houdini: Use client query functions [\#3395](https://github.com/pypeclub/OpenPype/pull/3395) -- Hiero: Use client query functions [\#3393](https://github.com/pypeclub/OpenPype/pull/3393) -- Nuke: Use client query functions [\#3391](https://github.com/pypeclub/OpenPype/pull/3391) ## [3.11.1](https://github.com/pypeclub/OpenPype/tree/3.11.1) (2022-06-20) diff --git a/igniter/bootstrap_repos.py b/igniter/bootstrap_repos.py index 08333885c0b..47f25259526 100644 --- a/igniter/bootstrap_repos.py +++ b/igniter/bootstrap_repos.py @@ -122,7 +122,7 @@ def __init__(self, *args, **kwargs): if self.staging: if kwargs.get("build"): if "staging" not in kwargs.get("build"): - kwargs["build"] = "{}-staging".format(kwargs.get("build")) + kwargs["build"] = f"{kwargs.get('build')}-staging" else: kwargs["build"] = "staging" @@ -136,8 +136,7 @@ def __eq__(self, other): return bool(result and self.staging == other.staging) def __repr__(self): - return "<{}: {} - path={}>".format( - self.__class__.__name__, str(self), self.path) + return f"<{self.__class__.__name__}: {str(self)} - path={self.path}>" def __lt__(self, other: OpenPypeVersion): result = super().__lt__(other) @@ -232,10 +231,7 @@ def parse(cls, version): return openpype_version def __hash__(self): - if self.path: - return hash(self.path) - else: - return hash(str(self)) + return hash(self.path) if self.path else hash(str(self)) @staticmethod def is_version_in_dir( @@ -384,7 +380,8 @@ def openpype_path_is_accessible(cls): @classmethod def get_local_versions( - cls, production: bool = None, staging: bool = None + cls, production: bool = None, + staging: bool = None, compatible_with: OpenPypeVersion = None ) -> List: """Get all versions available on this machine. @@ -394,6 +391,8 @@ def get_local_versions( Args: production (bool): Return production versions. staging (bool): Return staging versions. + compatible_with (OpenPypeVersion): Return only those compatible + with specified version. 
""" # Return all local versions if arguments are set to None if production is None and staging is None: @@ -410,10 +409,19 @@ def get_local_versions( if not production and not staging: return [] + # DEPRECATED: backwards compatible way to look for versions in root dir_to_search = Path(user_data_dir("openpype", "pypeclub")) versions = OpenPypeVersion.get_versions_from_directory( - dir_to_search + dir_to_search, compatible_with=compatible_with ) + if compatible_with: + dir_to_search = Path( + user_data_dir("openpype", "pypeclub")) / f"{compatible_with.major}.{compatible_with.minor}" # noqa + versions += OpenPypeVersion.get_versions_from_directory( + dir_to_search, compatible_with=compatible_with + ) + + filtered_versions = [] for version in versions: if version.is_staging(): @@ -425,7 +433,8 @@ def get_local_versions( @classmethod def get_remote_versions( - cls, production: bool = None, staging: bool = None + cls, production: bool = None, + staging: bool = None, compatible_with: OpenPypeVersion = None ) -> List: """Get all versions available in OpenPype Path. @@ -435,6 +444,8 @@ def get_remote_versions( Args: production (bool): Return production versions. staging (bool): Return staging versions. + compatible_with (OpenPypeVersion): Return only those compatible + with specified version. """ # Return all local versions if arguments are set to None if production is None and staging is None: @@ -468,7 +479,14 @@ def get_remote_versions( if not dir_to_search: return [] - versions = cls.get_versions_from_directory(dir_to_search) + # DEPRECATED: look for version in root directory + versions = cls.get_versions_from_directory( + dir_to_search, compatible_with=compatible_with) + if compatible_with: + dir_to_search = dir_to_search / f"{compatible_with.major}.{compatible_with.minor}" # noqa + versions += cls.get_versions_from_directory( + dir_to_search, compatible_with=compatible_with) + filtered_versions = [] for version in versions: if version.is_staging(): @@ -479,11 +497,15 @@ def get_remote_versions( return list(sorted(set(filtered_versions))) @staticmethod - def get_versions_from_directory(openpype_dir: Path) -> List: + def get_versions_from_directory( + openpype_dir: Path, + compatible_with: OpenPypeVersion = None) -> List: """Get all detected OpenPype versions in directory. Args: openpype_dir (Path): Directory to scan. + compatible_with (OpenPypeVersion): Return only versions compatible + with build version specified as OpenPypeVersion. Returns: list of OpenPypeVersion @@ -518,6 +540,10 @@ def get_versions_from_directory(openpype_dir: Path) -> List: )[0]: continue + if compatible_with and not detected_version.is_compatible( + compatible_with): + continue + detected_version.path = item _openpype_versions.append(detected_version) @@ -549,8 +575,9 @@ def get_installed_version(cls): def get_latest_version( staging: bool = False, local: bool = None, - remote: bool = None - ) -> OpenPypeVersion: + remote: bool = None, + compatible_with: OpenPypeVersion = None + ) -> Union[OpenPypeVersion, None]: """Get latest available version. The version does not contain information about path and source. @@ -568,6 +595,9 @@ def get_latest_version( staging (bool, optional): List staging versions if True. local (bool, optional): List local versions if True. remote (bool, optional): List remote versions if True. + compatible_with (OpenPypeVersion, optional) Return only version + compatible with compatible_with. 
+ """ if local is None and remote is None: local = True @@ -598,7 +628,12 @@ def get_latest_version( return None all_versions.sort() - return all_versions[-1] + latest_version: OpenPypeVersion + latest_version = all_versions[-1] + if compatible_with and not latest_version.is_compatible( + compatible_with): + return None + return latest_version @classmethod def get_expected_studio_version(cls, staging=False, global_settings=None): @@ -621,6 +656,21 @@ def get_expected_studio_version(cls, staging=False, global_settings=None): return None return OpenPypeVersion(version=result) + def is_compatible(self, version: OpenPypeVersion): + """Test build compatibility. + + This will simply compare major and minor versions (ignoring patch + and the rest). + + Args: + version (OpenPypeVersion): Version to check compatibility with. + + Returns: + bool: if the version is compatible + + """ + return self.major == version.major and self.minor == version.minor + class BootstrapRepos: """Class for bootstrapping local OpenPype installation. @@ -741,8 +791,9 @@ def create_version_from_live_code( return # create destination directory - if not self.data_dir.exists(): - self.data_dir.mkdir(parents=True) + destination = self.data_dir / f"{installed_version.major}.{installed_version.minor}" # noqa + if not destination.exists(): + destination.mkdir(parents=True) # create zip inside temporary directory. with tempfile.TemporaryDirectory() as temp_dir: @@ -770,7 +821,9 @@ def _move_zip_to_data_dir(self, zip_file) -> Union[None, Path]: Path to moved zip on success. """ - destination = self.data_dir / zip_file.name + version = OpenPypeVersion.version_in_str(zip_file.name) + destination_dir = self.data_dir / f"{version.major}.{version.minor}" + destination = destination_dir / zip_file.name if destination.exists(): self._print( @@ -782,7 +835,7 @@ def _move_zip_to_data_dir(self, zip_file) -> Union[None, Path]: self._print(str(e), LOG_ERROR, exc_info=True) return None try: - shutil.move(zip_file.as_posix(), self.data_dir.as_posix()) + shutil.move(zip_file.as_posix(), destination_dir.as_posix()) except shutil.Error as e: self._print(str(e), LOG_ERROR, exc_info=True) return None @@ -995,6 +1048,16 @@ def _validate_zip(path: Path) -> tuple: @staticmethod def _validate_dir(path: Path) -> tuple: + """Validate checksums in a given path. + + Args: + path (Path): path to folder to validate. + + Returns: + tuple(bool, str): returns status and reason as a bool + and str in a tuple. + + """ checksums_file = Path(path / "checksums") if not checksums_file.exists(): # FIXME: This should be set to False sometimes in the future @@ -1076,7 +1139,20 @@ def add_paths_from_directory(directory: Path) -> None: sys.path.insert(0, directory.as_posix()) @staticmethod - def find_openpype_version(version, staging): + def find_openpype_version( + version: Union[str, OpenPypeVersion], + staging: bool, + compatible_with: OpenPypeVersion = None + ) -> Union[OpenPypeVersion, None]: + """Find location of specified OpenPype version. + + Args: + version (Union[str, OpenPypeVersion): Version to find. + staging (bool): Filter staging versions. + compatible_with (OpenPypeVersion, optional): Find only + versions compatible with specified one. 
+ + """ if isinstance(version, str): version = OpenPypeVersion(version=version) @@ -1085,7 +1161,8 @@ def find_openpype_version(version, staging): return installed_version local_versions = OpenPypeVersion.get_local_versions( - staging=staging, production=not staging + staging=staging, production=not staging, + compatible_with=compatible_with ) zip_version = None for local_version in local_versions: @@ -1099,7 +1176,8 @@ def find_openpype_version(version, staging): return zip_version remote_versions = OpenPypeVersion.get_remote_versions( - staging=staging, production=not staging + staging=staging, production=not staging, + compatible_with=compatible_with ) for remote_version in remote_versions: if remote_version == version: @@ -1107,13 +1185,14 @@ def find_openpype_version(version, staging): return None @staticmethod - def find_latest_openpype_version(staging): + def find_latest_openpype_version( + staging, compatible_with: OpenPypeVersion = None): installed_version = OpenPypeVersion.get_installed_version() local_versions = OpenPypeVersion.get_local_versions( - staging=staging + staging=staging, compatible_with=compatible_with ) remote_versions = OpenPypeVersion.get_remote_versions( - staging=staging + staging=staging, compatible_with=compatible_with ) all_versions = local_versions + remote_versions if not staging: @@ -1138,7 +1217,9 @@ def find_openpype( self, openpype_path: Union[Path, str] = None, staging: bool = False, - include_zips: bool = False) -> Union[List[OpenPypeVersion], None]: + include_zips: bool = False, + compatible_with: OpenPypeVersion = None + ) -> Union[List[OpenPypeVersion], None]: """Get ordered dict of detected OpenPype version. Resolution order for OpenPype is following: @@ -1154,6 +1235,8 @@ def find_openpype( otherwise. include_zips (bool, optional): If set True it will try to find OpenPype in zip files in given directory. + compatible_with (OpenPypeVersion, optional): Find only those + versions compatible with the one specified. Returns: dict of Path: Dictionary of detected OpenPype version. @@ -1172,30 +1255,56 @@ def find_openpype( ("Finding OpenPype in non-filesystem locations is" " not implemented yet.")) - dir_to_search = self.data_dir - user_versions = self.get_openpype_versions(self.data_dir, staging) - # if we have openpype_path specified, search only there. + version_dir = "" + if compatible_with: + version_dir = f"{compatible_with.major}.{compatible_with.minor}" + + # if checks bellow for OPENPYPE_PATH and registry fails, use data_dir + # DEPRECATED: lookup in root of this folder is deprecated in favour + # of major.minor sub-folders. + dirs_to_search = [ + self.data_dir + ] + if compatible_with: + dirs_to_search.append(self.data_dir / version_dir) + if openpype_path: - dir_to_search = openpype_path + dirs_to_search = [openpype_path] + + if compatible_with: + dirs_to_search.append(openpype_path / version_dir) else: - if os.getenv("OPENPYPE_PATH"): - if Path(os.getenv("OPENPYPE_PATH")).exists(): - dir_to_search = Path(os.getenv("OPENPYPE_PATH")) + # first try OPENPYPE_PATH and if that is not available, + # try registry. 
+ if os.getenv("OPENPYPE_PATH") \ + and Path(os.getenv("OPENPYPE_PATH")).exists(): + dirs_to_search = [Path(os.getenv("OPENPYPE_PATH"))] + + if compatible_with: + dirs_to_search.append( + Path(os.getenv("OPENPYPE_PATH")) / version_dir) else: try: registry_dir = Path( str(self.registry.get_item("openPypePath"))) if registry_dir.exists(): - dir_to_search = registry_dir + dirs_to_search = [registry_dir] + if compatible_with: + dirs_to_search.append(registry_dir / version_dir) except ValueError: # nothing found in registry, we'll use data dir pass - openpype_versions = self.get_openpype_versions(dir_to_search, staging) - openpype_versions += user_versions + openpype_versions = [] + for dir_to_search in dirs_to_search: + try: + openpype_versions += self.get_openpype_versions( + dir_to_search, staging, compatible_with=compatible_with) + except ValueError: + # location is invalid, skip it + pass - # remove zip file version if needed. if not include_zips: openpype_versions = [ v for v in openpype_versions if v.path.suffix != ".zip" @@ -1308,9 +1417,8 @@ def extract_openpype(self, version: OpenPypeVersion) -> Union[Path, None]: raise ValueError( f"version {version} is not associated with any file") - destination = self.data_dir / version.path.stem - if destination.exists(): - assert destination.is_dir() + destination = self.data_dir / f"{version.major}.{version.minor}" / version.path.stem # noqa + if destination.exists() and destination.is_dir(): try: shutil.rmtree(destination) except OSError as e: @@ -1379,7 +1487,7 @@ def install_version(self, else: dir_name = openpype_version.path.stem - destination = self.data_dir / dir_name + destination = self.data_dir / f"{openpype_version.major}.{openpype_version.minor}" / dir_name # noqa # test if destination directory already exist, if so lets delete it. if destination.exists() and force: @@ -1557,14 +1665,18 @@ def _is_openpype_in_zip(self, return False return True - def get_openpype_versions(self, - openpype_dir: Path, - staging: bool = False) -> list: + def get_openpype_versions( + self, + openpype_dir: Path, + staging: bool = False, + compatible_with: OpenPypeVersion = None) -> list: """Get all detected OpenPype versions in directory. Args: openpype_dir (Path): Directory to scan. staging (bool, optional): Find staging versions if True. + compatible_with (OpenPypeVersion, optional): Get only versions + compatible with the one specified. Returns: list of OpenPypeVersion @@ -1574,7 +1686,7 @@ def get_openpype_versions(self, """ if not openpype_dir.exists() and not openpype_dir.is_dir(): - raise ValueError("specified directory is invalid") + raise ValueError(f"specified directory {openpype_dir} is invalid") _openpype_versions = [] # iterate over directory in first level and find all that might @@ -1599,6 +1711,10 @@ def get_openpype_versions(self, ): continue + if compatible_with and \ + not detected_version.is_compatible(compatible_with): + continue + detected_version.path = item if staging and detected_version.is_staging(): _openpype_versions.append(detected_version) diff --git a/igniter/tools.py b/igniter/tools.py index 57159b5e526..a9d592acf03 100644 --- a/igniter/tools.py +++ b/igniter/tools.py @@ -21,6 +21,11 @@ class OpenPypeVersionNotFound(Exception): pass +class OpenPypeVersionIncompatible(Exception): + """OpenPype version is not compatible with the installed one (build).""" + pass + + def should_add_certificate_path_to_mongo_url(mongo_url): """Check if should add ca certificate to mongo url. 
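The compatibility rule these igniter changes introduce reduces to a major.minor equality check: `is_compatible()` ignores the patch part (and anything after it), and every new `compatible_with` argument simply filters candidate versions through that test. A minimal standalone sketch of the rule follows; the `Version` namedtuple is a hypothetical stand-in for the real `OpenPypeVersion` class:

# Sketch of the major.minor compatibility rule from
# OpenPypeVersion.is_compatible(); "Version" is a hypothetical stand-in,
# not the real igniter class.
from collections import namedtuple

Version = namedtuple("Version", ["major", "minor", "patch"])


def is_compatible(this, other):
    # Only major and minor take part in the comparison; patch is ignored.
    return this.major == other.major and this.minor == other.minor


build = Version(3, 12, 2)
assert is_compatible(build, Version(3, 12, 0))      # same major.minor
assert not is_compatible(build, Version(3, 13, 0))  # different minor

This is also why the local and remote lookups now search a `major.minor` sub-folder first and keep the flat root directory only as a deprecated fallback.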
diff --git a/openpype/cli.py b/openpype/cli.py index 9a2dfaa141a..ffe288040e1 100644 --- a/openpype/cli.py +++ b/openpype/cli.py @@ -443,3 +443,26 @@ def interactive(): __version__, sys.version, sys.platform ) code.interact(banner) + + +@main.command() +@click.option("--build", help="Print only build version", + is_flag=True, default=False) +def version(build): + """Print OpenPype version.""" + + from openpype.version import __version__ + from igniter.bootstrap_repos import BootstrapRepos, OpenPypeVersion + from pathlib import Path + import os + + if getattr(sys, 'frozen', False): + local_version = BootstrapRepos.get_version( + Path(os.getenv("OPENPYPE_ROOT"))) + else: + local_version = OpenPypeVersion.get_installed_version_str() + + if build: + print(local_version) + return + print(f"{__version__} (booted: {local_version})") diff --git a/openpype/hosts/harmony/api/pipeline.py b/openpype/hosts/harmony/api/pipeline.py index 86b5753f7e1..94ca134205e 100644 --- a/openpype/hosts/harmony/api/pipeline.py +++ b/openpype/hosts/harmony/api/pipeline.py @@ -15,6 +15,7 @@ deregister_creator_plugin_path, AVALON_CONTAINER_ID, ) +from openpype.pipeline.context_tools import get_current_project_asset import openpype.hosts.harmony import openpype.hosts.harmony.api as harmony @@ -50,7 +51,9 @@ def get_asset_settings(): dict: Scene data. """ - asset_data = lib.get_asset()["data"] + + asset_doc = get_current_project_asset() + asset_data = asset_doc["data"] fps = asset_data.get("fps") frame_start = asset_data.get("frameStart") frame_end = asset_data.get("frameEnd") diff --git a/openpype/hosts/harmony/plugins/publish/validate_scene_settings.py b/openpype/hosts/harmony/plugins/publish/validate_scene_settings.py index 4c3a6c44653..936533abd6b 100644 --- a/openpype/hosts/harmony/plugins/publish/validate_scene_settings.py +++ b/openpype/hosts/harmony/plugins/publish/validate_scene_settings.py @@ -55,6 +55,10 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin): def process(self, instance): """Plugin entry point.""" + + # TODO 'get_asset_settings' could expect asset document as argument + # which is available on 'context.data["assetEntity"]' + # - the same approach can be used in 'ValidateSceneSettingsRepair' expected_settings = harmony.get_asset_settings() self.log.info("scene settings from DB:".format(expected_settings)) diff --git a/openpype/hosts/hiero/api/plugin.py b/openpype/hosts/hiero/api/plugin.py index add416d04e7..28a9dfb492d 100644 --- a/openpype/hosts/hiero/api/plugin.py +++ b/openpype/hosts/hiero/api/plugin.py @@ -10,6 +10,7 @@ import openpype.api as openpype from openpype.pipeline import LoaderPlugin, LegacyCreator +from openpype.pipeline.context_tools import get_current_project_asset from . 
import lib log = openpype.Logger().get_logger(__name__) @@ -484,7 +485,7 @@ def _get_asset_data(self): """ asset_name = self.context["representation"]["context"]["asset"] - asset_doc = openpype.get_asset(asset_name) + asset_doc = get_current_project_asset(asset_name) log.debug("__ asset_doc: {}".format(pformat(asset_doc))) self.data["assetData"] = asset_doc["data"] diff --git a/openpype/hosts/houdini/api/lib.py b/openpype/hosts/houdini/api/lib.py index dd8a5ba4732..c8a7f92bb99 100644 --- a/openpype/hosts/houdini/api/lib.py +++ b/openpype/hosts/houdini/api/lib.py @@ -5,8 +5,8 @@ import six from openpype.client import get_asset_by_name -from openpype.api import get_asset from openpype.pipeline import legacy_io +from openpype.pipeline.context_tools import get_current_project_asset import hou @@ -16,7 +16,7 @@ def get_asset_fps(): """Return current asset fps.""" - return get_asset()["data"].get("fps") + return get_current_project_asset()["data"].get("fps") def set_id(node, unique_id, overwrite=False): diff --git a/openpype/hosts/maya/api/lib.py b/openpype/hosts/maya/api/lib.py index e4221978c0f..58e160cb2f0 100644 --- a/openpype/hosts/maya/api/lib.py +++ b/openpype/hosts/maya/api/lib.py @@ -23,7 +23,6 @@ get_last_versions, get_representation_by_name ) -from openpype import lib from openpype.api import get_anatomy_settings from openpype.pipeline import ( legacy_io, @@ -33,6 +32,7 @@ load_container, registered_host, ) +from openpype.pipeline.context_tools import get_current_project_asset from .commands import reset_frame_range @@ -2174,7 +2174,7 @@ def reset_scene_resolution(): project_name = legacy_io.active_project() project_doc = get_project(project_name) project_data = project_doc["data"] - asset_data = lib.get_asset()["data"] + asset_data = get_current_project_asset()["data"] # Set project resolution width_key = "resolutionWidth" @@ -2208,7 +2208,8 @@ def set_context_settings(): project_name = legacy_io.active_project() project_doc = get_project(project_name) project_data = project_doc["data"] - asset_data = lib.get_asset()["data"] + asset_doc = get_current_project_asset(fields=["data.fps"]) + asset_data = asset_doc.get("data", {}) # Set project fps fps = asset_data.get("fps", project_data.get("fps", 25)) @@ -2233,7 +2234,7 @@ def validate_fps(): """ - fps = lib.get_asset()["data"]["fps"] + fps = get_current_project_asset(fields=["data.fps"])["data"]["fps"] # TODO(antirotor): This is hack as for framerates having multiple # decimal places. 
FTrack is ceiling decimal values on # fps to two decimal places but Maya 2019+ is reporting those fps @@ -3051,8 +3052,9 @@ def update_content_on_context_change(): This will update scene content to match new asset on context change """ scene_sets = cmds.listSets(allSets=True) - new_asset = legacy_io.Session["AVALON_ASSET"] - new_data = lib.get_asset()["data"] + asset_doc = get_current_project_asset() + new_asset = asset_doc["name"] + new_data = asset_doc["data"] for s in scene_sets: try: if cmds.getAttr("{}.id".format(s)) == "pyblish.avalon.instance": diff --git a/openpype/hosts/maya/plugins/create/create_render.py b/openpype/hosts/maya/plugins/create/create_render.py index 93ee6679e57..de07a0b23d8 100644 --- a/openpype/hosts/maya/plugins/create/create_render.py +++ b/openpype/hosts/maya/plugins/create/create_render.py @@ -15,13 +15,13 @@ from openpype.lib import requests_get from openpype.api import ( get_system_settings, - get_project_settings, - get_asset) + get_project_settings) from openpype.modules import ModulesManager from openpype.pipeline import ( CreatorError, legacy_io, ) +from openpype.pipeline.context_tools import get_current_project_asset class CreateRender(plugin.Creator): @@ -413,7 +413,7 @@ def _set_default_renderer_settings(self, renderer): prefix, type="string") - asset = get_asset() + asset = get_current_project_asset() if renderer == "arnold": # set format to exr diff --git a/openpype/hosts/maya/plugins/publish/validate_maya_units.py b/openpype/hosts/maya/plugins/publish/validate_maya_units.py index d5a8c350d5c..5f67adec76d 100644 --- a/openpype/hosts/maya/plugins/publish/validate_maya_units.py +++ b/openpype/hosts/maya/plugins/publish/validate_maya_units.py @@ -2,8 +2,8 @@ import pyblish.api import openpype.api -from openpype import lib import openpype.hosts.maya.api.lib as mayalib +from openpype.pipeline.context_tools import get_current_project_asset from math import ceil @@ -41,7 +41,9 @@ def process(self, context): # now flooring the value? fps = float_round(context.data.get('fps'), 2, ceil) - asset_fps = lib.get_asset()["data"]["fps"] + # TODO repace query with using 'context.data["assetEntity"]' + asset_doc = get_current_project_asset() + asset_fps = asset_doc["data"]["fps"] self.log.info('Units (linear): {0}'.format(linearunits)) self.log.info('Units (angular): {0}'.format(angularunits)) @@ -91,5 +93,7 @@ def repair(cls, context): cls.log.debug(current_linear) cls.log.info("Setting time unit to match project") - asset_fps = lib.get_asset()["data"]["fps"] + # TODO repace query with using 'context.data["assetEntity"]' + asset_doc = get_current_project_asset() + asset_fps = asset_doc["data"]["fps"] mayalib.set_scene_fps(asset_fps) diff --git a/openpype/hosts/nuke/api/lib.py b/openpype/hosts/nuke/api/lib.py index 9b24c9fb382..74db164ae5e 100644 --- a/openpype/hosts/nuke/api/lib.py +++ b/openpype/hosts/nuke/api/lib.py @@ -24,7 +24,6 @@ BuildWorkfile, get_version_from_path, get_workdir_data, - get_asset, get_current_project_settings, ) from openpype.tools.utils import host_tools @@ -40,6 +39,7 @@ legacy_io, Anatomy, ) +from openpype.pipeline.context_tools import get_current_project_asset from . 
import gizmo_menu

@@ -1766,7 +1766,7 @@ def __init__(self, root_node=None, nodes=None, **kwargs):
             kwargs.get("asset_name") or legacy_io.Session["AVALON_ASSET"]
         )

-        self._asset_entity = get_asset(self._asset)
+        self._asset_entity = get_current_project_asset(self._asset)
         self._root_node = root_node or nuke.root()
         self._nodes = self.get_nodes(nodes=nodes)

diff --git a/openpype/hosts/nuke/plugins/publish/validate_script.py b/openpype/hosts/nuke/plugins/publish/validate_script.py
index 9bda0da85e6..b8d7494b9d2 100644
--- a/openpype/hosts/nuke/plugins/publish/validate_script.py
+++ b/openpype/hosts/nuke/plugins/publish/validate_script.py
@@ -1,7 +1,6 @@
 import pyblish.api

-from openpype.client import get_project, get_asset_by_id
-from openpype import lib
+from openpype.client import get_project, get_asset_by_id, get_asset_by_name
 from openpype.pipeline import legacy_io


@@ -17,10 +16,11 @@ class ValidateScript(pyblish.api.InstancePlugin):

     def process(self, instance):
         ctx_data = instance.context.data
+        project_name = legacy_io.active_project()
         asset_name = ctx_data["asset"]
-        asset = lib.get_asset(asset_name)
+        # TODO replace query with using 'instance.data["assetEntity"]'
+        asset = get_asset_by_name(project_name, asset_name)
         asset_data = asset["data"]
-        project_name = legacy_io.active_project()

         # These attributes will be checked
         attributes = [
diff --git a/openpype/hosts/resolve/api/plugin.py b/openpype/hosts/resolve/api/plugin.py
index 49b478fb3b1..b03125d5026 100644
--- a/openpype/hosts/resolve/api/plugin.py
+++ b/openpype/hosts/resolve/api/plugin.py
@@ -4,11 +4,11 @@ import qargparse
 from Qt import QtWidgets, QtCore

-import openpype.api as pype
 from openpype.pipeline import (
     LegacyCreator,
     LoaderPlugin,
 )
+from openpype.pipeline.context_tools import get_current_project_asset
 from openpype.hosts import resolve
 from . import lib

@@ -375,7 +375,7 @@ def _get_asset_data(self):
         """
         asset_name = self.context["representation"]["context"]["asset"]
-        self.data["assetData"] = pype.get_asset(asset_name)["data"]
+        self.data["assetData"] = get_current_project_asset(asset_name)["data"]

     def load(self):
         # create project bin for the media to be imported into
diff --git a/openpype/hosts/standalonepublisher/plugins/publish/collect_editorial.py b/openpype/hosts/standalonepublisher/plugins/publish/collect_editorial.py
index 0a1d29ccdc0..8633d4bf9dd 100644
--- a/openpype/hosts/standalonepublisher/plugins/publish/collect_editorial.py
+++ b/openpype/hosts/standalonepublisher/plugins/publish/collect_editorial.py
@@ -19,6 +19,7 @@ import opentimelineio as otio
 import pyblish.api
 from openpype import lib as plib
+from openpype.pipeline.context_tools import get_current_project_asset


 class OTIO_View(pyblish.api.Action):
@@ -116,7 +117,7 @@ def process(self, instance):
         if extension == ".edl":
             # EDL has no frame rate embedded so needs explicit
             # frame rate, else 24 is assumed.
- kwargs["rate"] = plib.get_asset()["data"]["fps"] + kwargs["rate"] = get_current_project_asset()["data"]["fps"] instance.data["otio_timeline"] = otio.adapters.read_from_file( file_path, **kwargs) diff --git a/openpype/hosts/standalonepublisher/plugins/publish/collect_editorial_instances.py b/openpype/hosts/standalonepublisher/plugins/publish/collect_editorial_instances.py index d0d36bb717b..3237fbbe12e 100644 --- a/openpype/hosts/standalonepublisher/plugins/publish/collect_editorial_instances.py +++ b/openpype/hosts/standalonepublisher/plugins/publish/collect_editorial_instances.py @@ -1,8 +1,12 @@ import os +from copy import deepcopy + import opentimelineio as otio import pyblish.api + from openpype import lib as plib -from copy import deepcopy +from openpype.pipeline.context_tools import get_current_project_asset + class CollectInstances(pyblish.api.InstancePlugin): """Collect instances from editorial's OTIO sequence""" @@ -48,7 +52,7 @@ def process(self, instance): # get timeline otio data timeline = instance.data["otio_timeline"] - fps = plib.get_asset()["data"]["fps"] + fps = get_current_project_asset()["data"]["fps"] tracks = timeline.each_child( descended_from_type=otio.schema.Track diff --git a/openpype/hosts/standalonepublisher/plugins/publish/validate_frame_ranges.py b/openpype/hosts/standalonepublisher/plugins/publish/validate_frame_ranges.py index 005157af62b..ff7f60354eb 100644 --- a/openpype/hosts/standalonepublisher/plugins/publish/validate_frame_ranges.py +++ b/openpype/hosts/standalonepublisher/plugins/publish/validate_frame_ranges.py @@ -3,8 +3,8 @@ import pyblish.api import openpype.api -from openpype import lib from openpype.pipeline import PublishXmlValidationError +from openpype.pipeline.context_tools import get_current_project_asset class ValidateFrameRange(pyblish.api.InstancePlugin): @@ -27,7 +27,8 @@ def process(self, instance): for pattern in self.skip_timelines_check): self.log.info("Skipping for {} task".format(instance.data["task"])) - asset_data = lib.get_asset(instance.data["asset"])["data"] + # TODO repace query with using 'instance.data["assetEntity"]' + asset_data = get_current_project_asset(instance.data["asset"])["data"] frame_start = asset_data["frameStart"] frame_end = asset_data["frameEnd"] handle_start = asset_data["handleStart"] diff --git a/openpype/hosts/unreal/plugins/load/load_animation.py b/openpype/hosts/unreal/plugins/load/load_animation.py index da2830bc522..1fe0bef4623 100644 --- a/openpype/hosts/unreal/plugins/load/load_animation.py +++ b/openpype/hosts/unreal/plugins/load/load_animation.py @@ -8,13 +8,13 @@ from unreal import MovieSceneSkeletalAnimationTrack from unreal import MovieSceneSkeletalAnimationSection +from openpype.pipeline.context_tools import get_current_project_asset from openpype.pipeline import ( get_representation_path, AVALON_CONTAINER_ID ) from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline -from openpype.api import get_asset class AnimationFBXLoader(plugin.Loader): @@ -53,6 +53,8 @@ def _process(self, asset_dir, asset_name, instance_name): if not actor: return None + asset_doc = get_current_project_asset(fields=["data.fps"]) + task.set_editor_property('filename', self.fname) task.set_editor_property('destination_path', asset_dir) task.set_editor_property('destination_name', asset_name) @@ -80,7 +82,7 @@ def _process(self, asset_dir, asset_name, instance_name): task.options.anim_sequence_import_data.set_editor_property( 'use_default_sample_rate', False) 
task.options.anim_sequence_import_data.set_editor_property( - 'custom_sample_rate', get_asset()["data"].get("fps")) + 'custom_sample_rate', asset_doc.get("data", {}).get("fps")) task.options.anim_sequence_import_data.set_editor_property( 'import_custom_attribute', True) task.options.anim_sequence_import_data.set_editor_property( @@ -246,6 +248,7 @@ def load(self, context, name, namespace, options=None): def update(self, container, representation): name = container["asset_name"] source_path = get_representation_path(representation) + asset_doc = get_current_project_asset(fields=["data.fps"]) destination_path = container["namespace"] task = unreal.AssetImportTask() @@ -279,7 +282,7 @@ def update(self, container, representation): task.options.anim_sequence_import_data.set_editor_property( 'use_default_sample_rate', False) task.options.anim_sequence_import_data.set_editor_property( - 'custom_sample_rate', get_asset()["data"].get("fps")) + 'custom_sample_rate', asset_doc.get("data", {}).get("fps")) task.options.anim_sequence_import_data.set_editor_property( 'import_custom_attribute', True) task.options.anim_sequence_import_data.set_editor_property( diff --git a/openpype/hosts/unreal/plugins/load/load_layout.py b/openpype/hosts/unreal/plugins/load/load_layout.py index 3f16a68ead6..01d589c69b4 100644 --- a/openpype/hosts/unreal/plugins/load/load_layout.py +++ b/openpype/hosts/unreal/plugins/load/load_layout.py @@ -20,7 +20,7 @@ AVALON_CONTAINER_ID, legacy_io, ) -from openpype.api import get_asset +from openpype.pipeline.context_tools import get_current_project_asset from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline @@ -225,6 +225,7 @@ def _import_animation( anim_path = f"{asset_dir}/animations/{anim_file_name}" + asset_doc = get_current_project_asset() # Import animation task = unreal.AssetImportTask() task.options = unreal.FbxImportUI() @@ -259,7 +260,7 @@ def _import_animation( task.options.anim_sequence_import_data.set_editor_property( 'use_default_sample_rate', False) task.options.anim_sequence_import_data.set_editor_property( - 'custom_sample_rate', get_asset()["data"].get("fps")) + 'custom_sample_rate', asset_doc.get("data", {}).get("fps")) task.options.anim_sequence_import_data.set_editor_property( 'import_custom_attribute', True) task.options.anim_sequence_import_data.set_editor_property( diff --git a/openpype/lib/avalon_context.py b/openpype/lib/avalon_context.py index 2944b2506e0..2e177f195d9 100644 --- a/openpype/lib/avalon_context.py +++ b/openpype/lib/avalon_context.py @@ -236,7 +236,7 @@ def any_outdated(): return False -@with_pipeline_io +@deprecated("openpype.pipeline.context_tools.get_current_project_asset") def get_asset(asset_name=None): """ Returning asset document from database by its name. 
@@ -249,15 +249,9 @@ def get_asset(asset_name=None): (MongoDB document) """ - project_name = legacy_io.active_project() - if not asset_name: - asset_name = legacy_io.Session["AVALON_ASSET"] - - asset_document = get_asset_by_name(project_name, asset_name) - if not asset_document: - raise TypeError("Entity \"{}\" was not found in DB".format(asset_name)) + from openpype.pipeline.context_tools import get_current_project_asset - return asset_document + return get_current_project_asset(asset_name=asset_name) def get_system_general_anatomy_data(system_settings=None): diff --git a/openpype/lib/file_transaction.py b/openpype/lib/file_transaction.py new file mode 100644 index 00000000000..1626bec6b66 --- /dev/null +++ b/openpype/lib/file_transaction.py @@ -0,0 +1,171 @@ +import os +import logging +import sys +import errno +import six + +from openpype.lib import create_hard_link + +# this is needed until speedcopy for linux is fixed +if sys.platform == "win32": + from speedcopy import copyfile +else: + from shutil import copyfile + + +class FileTransaction(object): + """ + + The file transaction is a three step process. + + 1) Rename any existing files to a "temporary backup" during `process()` + 2) Copy the files to final destination during `process()` + 3) Remove any backed up files (*no rollback possible!) during `finalize()` + + Step 3 is done during `finalize()`. If not called the .bak files will + remain on disk. + + These steps try to ensure that we don't overwrite half of any existing + files e.g. if they are currently in use. + + Note: + A regular filesystem is *not* a transactional file system and even + though this implementation tries to produce a 'safe copy' with a + potential rollback do keep in mind that it's inherently unsafe due + to how filesystem works and a myriad of things could happen during + the transaction that break the logic. A file storage could go down, + permissions could be changed, other machines could be moving or writing + files. A lot can happen. + + Warning: + Any folders created during the transfer will not be removed. + + """ + + MODE_COPY = 0 + MODE_HARDLINK = 1 + + def __init__(self, log=None): + + if log is None: + log = logging.getLogger("FileTransaction") + + self.log = log + + # The transfer queue + # todo: make this an actual FIFO queue? 
+ self._transfers = {} + + # Destination file paths that a file was transferred to + self._transferred = [] + + # Backup file location mapping to original locations + self._backup_to_original = {} + + def add(self, src, dst, mode=MODE_COPY): + """Add a new file to transfer queue""" + opts = {"mode": mode} + + src = os.path.abspath(src) + dst = os.path.abspath(dst) + + if dst in self._transfers: + queued_src = self._transfers[dst][0] + if src == queued_src: + self.log.debug("File transfer was already " + "in queue: {} -> {}".format(src, dst)) + return + else: + self.log.warning("File transfer in queue replaced..") + self.log.debug("Removed from queue: " + "{} -> {}".format(queued_src, dst)) + self.log.debug("Added to queue: {} -> {}".format(src, dst)) + + self._transfers[dst] = (src, opts) + + def process(self): + + # Backup any existing files + for dst in self._transfers.keys(): + if os.path.exists(dst): + # Backup original file + # todo: add timestamp or uuid to ensure unique + backup = dst + ".bak" + self._backup_to_original[backup] = dst + self.log.debug("Backup existing file: " + "{} -> {}".format(dst, backup)) + os.rename(dst, backup) + + # Copy the files to transfer + for dst, (src, opts) in self._transfers.items(): + self._create_folder_for_file(dst) + + if opts["mode"] == self.MODE_COPY: + self.log.debug("Copying file ... {} -> {}".format(src, dst)) + copyfile(src, dst) + elif opts["mode"] == self.MODE_HARDLINK: + self.log.debug("Hardlinking file ... {} -> {}".format(src, + dst)) + create_hard_link(src, dst) + + self._transferred.append(dst) + + def finalize(self): + # Delete any backed up files + for backup in self._backup_to_original.keys(): + try: + os.remove(backup) + except OSError: + self.log.error("Failed to remove backup file: " + "{}".format(backup), + exc_info=True) + + def rollback(self): + + errors = 0 + + # Rollback any transferred files + for path in self._transferred: + try: + os.remove(path) + except OSError: + errors += 1 + self.log.error("Failed to rollback created file: " + "{}".format(path), + exc_info=True) + + # Rollback the backups + for backup, original in self._backup_to_original.items(): + try: + os.rename(backup, original) + except OSError: + errors += 1 + self.log.error("Failed to restore original file: " + "{} -> {}".format(backup, original), + exc_info=True) + + if errors: + self.log.error("{} errors occurred during " + "rollback.".format(errors), exc_info=True) + six.reraise(*sys.exc_info()) + + @property + def transferred(self): + """Return the processed transfers destination paths""" + return list(self._transferred) + + @property + def backups(self): + """Return the backup file paths""" + return list(self._backup_to_original.keys()) + + def _create_folder_for_file(self, path): + dirname = os.path.dirname(path) + try: + os.makedirs(dirname) + except OSError as e: + if e.errno == errno.EEXIST: + pass + else: + self.log.critical("An unexpected error occurred.") + six.reraise(*sys.exc_info()) diff --git a/openpype/modules/deadline/plugins/publish/submit_aftereffects_deadline.py b/openpype/modules/deadline/plugins/publish/submit_aftereffects_deadline.py index de8df3dd9ef..c55f85c8da5 100644 --- a/openpype/modules/deadline/plugins/publish/submit_aftereffects_deadline.py +++ b/openpype/modules/deadline/plugins/publish/submit_aftereffects_deadline.py @@ -80,7 +80,8 @@ def get_job_info(self): "AVALON_TASK", "AVALON_APP_NAME", "OPENPYPE_DEV", - "OPENPYPE_LOG_NO_COLORS" + "OPENPYPE_LOG_NO_COLORS", + "OPENPYPE_VERSION" ] # Add mongo url if it's enabled if 
self._instance.context.data.get("deadlinePassMongoUrl"): diff --git a/openpype/modules/deadline/plugins/publish/submit_harmony_deadline.py b/openpype/modules/deadline/plugins/publish/submit_harmony_deadline.py index a1ee5e09574..3f9c09b5928 100644 --- a/openpype/modules/deadline/plugins/publish/submit_harmony_deadline.py +++ b/openpype/modules/deadline/plugins/publish/submit_harmony_deadline.py @@ -274,7 +274,8 @@ def get_job_info(self): "AVALON_TASK", "AVALON_APP_NAME", "OPENPYPE_DEV", - "OPENPYPE_LOG_NO_COLORS" + "OPENPYPE_LOG_NO_COLORS", + "OPENPYPE_VERSION" ] # Add mongo url if it's enabled if self._instance.context.data.get("deadlinePassMongoUrl"): diff --git a/openpype/modules/deadline/plugins/publish/submit_houdini_remote_publish.py b/openpype/modules/deadline/plugins/publish/submit_houdini_remote_publish.py index fdf67b51bcd..95856137e21 100644 --- a/openpype/modules/deadline/plugins/publish/submit_houdini_remote_publish.py +++ b/openpype/modules/deadline/plugins/publish/submit_houdini_remote_publish.py @@ -130,6 +130,7 @@ def submit_job(self, context, payload, instances, deadline): # this application with so the Render Slave can build its own # similar environment using it, e.g. "houdini17.5;pluginx2.3" "AVALON_TOOLS", + "OPENPYPE_VERSION" ] # Add mongo url if it's enabled if context.data.get("deadlinePassMongoUrl"): diff --git a/openpype/modules/deadline/plugins/publish/submit_houdini_render_deadline.py b/openpype/modules/deadline/plugins/publish/submit_houdini_render_deadline.py index aca88c74407..beda7537235 100644 --- a/openpype/modules/deadline/plugins/publish/submit_houdini_render_deadline.py +++ b/openpype/modules/deadline/plugins/publish/submit_houdini_render_deadline.py @@ -101,6 +101,7 @@ def process(self, instance): # this application with so the Render Slave can build its own # similar environment using it, e.g. 
"maya2018;vray4.x;yeti3.1.9" "AVALON_TOOLS", + "OPENPYPE_VERSION" ] # Add mongo url if it's enabled if context.data.get("deadlinePassMongoUrl"): diff --git a/openpype/modules/deadline/plugins/publish/submit_maya_deadline.py b/openpype/modules/deadline/plugins/publish/submit_maya_deadline.py index 3707c5709fd..fca044e348b 100644 --- a/openpype/modules/deadline/plugins/publish/submit_maya_deadline.py +++ b/openpype/modules/deadline/plugins/publish/submit_maya_deadline.py @@ -524,7 +524,8 @@ def process(self, instance): "AVALON_TASK", "AVALON_APP_NAME", "OPENPYPE_DEV", - "OPENPYPE_LOG_NO_COLORS" + "OPENPYPE_LOG_NO_COLORS", + "OPENPYPE_VERSION" ] # Add mongo url if it's enabled if instance.context.data.get("deadlinePassMongoUrl"): diff --git a/openpype/modules/deadline/plugins/publish/submit_maya_remote_publish_deadline.py b/openpype/modules/deadline/plugins/publish/submit_maya_remote_publish_deadline.py index 57572fcb24e..9b1852392b6 100644 --- a/openpype/modules/deadline/plugins/publish/submit_maya_remote_publish_deadline.py +++ b/openpype/modules/deadline/plugins/publish/submit_maya_remote_publish_deadline.py @@ -102,7 +102,8 @@ def process(self, instance): keys = [ "FTRACK_API_USER", "FTRACK_API_KEY", - "FTRACK_SERVER" + "FTRACK_SERVER", + "OPENPYPE_VERSION" ] environment = dict({key: os.environ[key] for key in keys if key in os.environ}, **legacy_io.Session) diff --git a/openpype/modules/deadline/plugins/publish/submit_nuke_deadline.py b/openpype/modules/deadline/plugins/publish/submit_nuke_deadline.py index 93fb511a34f..a328c3633d3 100644 --- a/openpype/modules/deadline/plugins/publish/submit_nuke_deadline.py +++ b/openpype/modules/deadline/plugins/publish/submit_nuke_deadline.py @@ -261,7 +261,8 @@ def payload_submit( "PYBLISHPLUGINPATH", "NUKE_PATH", "TOOL_ENV", - "FOUNDRY_LICENSE" + "FOUNDRY_LICENSE", + "OPENPYPE_VERSION" ] # Add mongo url if it's enabled if instance.context.data.get("deadlinePassMongoUrl"): diff --git a/openpype/modules/deadline/plugins/publish/submit_publish_job.py b/openpype/modules/deadline/plugins/publish/submit_publish_job.py index b098eaba8e1..43c3f3943f4 100644 --- a/openpype/modules/deadline/plugins/publish/submit_publish_job.py +++ b/openpype/modules/deadline/plugins/publish/submit_publish_job.py @@ -139,7 +139,8 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin): "OPENPYPE_USERNAME", "OPENPYPE_RENDER_JOB", "OPENPYPE_PUBLISH_JOB", - "OPENPYPE_MONGO" + "OPENPYPE_MONGO", + "OPENPYPE_VERSION" ] # custom deadline attributes diff --git a/openpype/modules/deadline/repository/custom/plugins/GlobalJobPreLoad.py b/openpype/modules/deadline/repository/custom/plugins/GlobalJobPreLoad.py index bcd853f374b..a43c6c7733f 100644 --- a/openpype/modules/deadline/repository/custom/plugins/GlobalJobPreLoad.py +++ b/openpype/modules/deadline/repository/custom/plugins/GlobalJobPreLoad.py @@ -6,13 +6,29 @@ import json import platform import uuid -from Deadline.Scripting import RepositoryUtils, FileUtils +import re +from Deadline.Scripting import RepositoryUtils, FileUtils, DirectoryUtils + + +def get_openpype_version_from_path(path): + version_file = os.path.join(path, "openpype", "version.py") + if not os.path.isfile(version_file): + return None + version = {} + with open(version_file, "r") as vf: + exec(vf.read(), version) + + version_match = re.search(r"(\d+\.\d+.\d+).*", version["__version__"]) + return version_match[1] def get_openpype_executable(): """Return OpenPype Executable from Event Plug-in Settings""" config = RepositoryUtils.GetPluginConfig("OpenPype") - 
return config.GetConfigEntryWithDefault("OpenPypeExecutable", "")
+    exe_list = config.GetConfigEntryWithDefault("OpenPypeExecutable", "")
+    dir_list = config.GetConfigEntryWithDefault(
+        "OpenPypeInstallationDirs", "")
+    return exe_list, dir_list


 def inject_openpype_environment(deadlinePlugin):
@@ -25,16 +41,71 @@ def inject_openpype_environment(deadlinePlugin):
     print(">>> Injecting OpenPype environments ...")
     try:
         print(">>> Getting OpenPype executable ...")
-        exe_list = get_openpype_executable()
-        openpype_app = FileUtils.SearchFileList(exe_list)
-        if openpype_app == "":
+        exe_list, dir_list = get_openpype_executable()
+        openpype_versions = []
+        # if the job requires a specific OpenPype version,
+        # let's go over all available and find a compatible build.
+        requested_version = job.GetJobEnvironmentKeyValue("OPENPYPE_VERSION")
+        if requested_version:
+            print(("Scanning for compatible requested "
+                   f"version {requested_version}"))
+            install_dir = DirectoryUtils.SearchDirectoryList(dir_list)
+            if install_dir:
+                sub_dirs = [
+                    f.path for f in os.scandir(install_dir)
+                    if f.is_dir()
+                ]
+                for subdir in sub_dirs:
+                    version = get_openpype_version_from_path(subdir)
+                    if not version:
+                        continue
+                    openpype_versions.append((version, subdir))
+
+        exe = FileUtils.SearchFileList(exe_list)
+        if openpype_versions:
+            # if looking for requested compatible version,
+            # add the implicitly specified one to the list too.
+            version = get_openpype_version_from_path(
+                os.path.dirname(exe))
+            if version:
+                openpype_versions.append((version, os.path.dirname(exe)))
+
+        if requested_version:
+            # sort detected versions
+            if openpype_versions:
+                openpype_versions.sort(key=lambda ver: ver[0])
+            requested_major, requested_minor, _ = requested_version.split(".")[:3]  # noqa: E501
+            compatible_versions = []
+            for version in openpype_versions:
+                v = version[0].split(".")[:3]
+                if v[0] == requested_major and v[1] == requested_minor:
+                    compatible_versions.append(version)
+            if not compatible_versions:
+                raise RuntimeError(
+                    ("Cannot find compatible version available "
+                     "for version {} requested by the job. "
+                     "Please add it through plugin configuration "
+                     "in Deadline or install it to configured "
+                     "directory.").format(requested_version))
+            # sort compatible versions and pick the last one
+            compatible_versions.sort(key=lambda ver: ver[0])
+            # create a list of executables for different platforms and let
+            # Deadline decide.
+            exe_list = [
+                os.path.join(
+                    compatible_versions[-1][1], "openpype_console.exe"),
+                os.path.join(
+                    compatible_versions[-1][1], "openpype_console")
+            ]
+            exe = FileUtils.SearchFileList(";".join(exe_list))
+        if exe == "":
             raise RuntimeError(
                 "OpenPype executable was not found " +
                 "in the semicolon separated list \"" + exe_list + "\". 
" + "The path to the render executable can be configured " + "from the Plugin Configuration in the Deadline Monitor.") - print("--- OpenPype executable: {}".format(openpype_app)) + print("--- OpenPype executable: {}".format(exe)) # tempfile.TemporaryFile cannot be used because of locking temp_file_name = "{}_{}.json".format( @@ -45,7 +116,7 @@ def inject_openpype_environment(deadlinePlugin): print(">>> Temporary path: {}".format(export_url)) args = [ - openpype_app, + exe, "--headless", 'extractenvironments', export_url @@ -77,7 +148,7 @@ def inject_openpype_environment(deadlinePlugin): print(">>> Executing: {}".format(args)) std_output = subprocess.check_output(args, - cwd=os.path.dirname(openpype_app), + cwd=os.path.dirname(exe), env=env) print(">>> Process result {}".format(std_output)) diff --git a/openpype/modules/deadline/repository/custom/plugins/OpenPype/OpenPype.param b/openpype/modules/deadline/repository/custom/plugins/OpenPype/OpenPype.param index 8bd6dce12d0..b3ac18e20ca 100644 --- a/openpype/modules/deadline/repository/custom/plugins/OpenPype/OpenPype.param +++ b/openpype/modules/deadline/repository/custom/plugins/OpenPype/OpenPype.param @@ -7,11 +7,20 @@ Index=0 Default=OpenPype Plugin for Deadline Description=Not configurable +[OpenPypeInstallationDirs] +Type=multilinemultifolder +Label=Directories where OpenPype versions are installed +Category=OpenPype Installation Directories +CategoryOrder=0 +Index=0 +Default=C:\Program Files (x86)\OpenPype +Description=Path or paths to directories where multiple versions of OpenPype might be installed. Enter every such path on separate lines. + [OpenPypeExecutable] Type=multilinemultifilename Label=OpenPype Executable Category=OpenPype Executables -CategoryOrder=0 +CategoryOrder=1 Index=0 Default= Description=The path to the OpenPype executable. Enter alternative paths on separate lines. diff --git a/openpype/modules/deadline/repository/custom/plugins/OpenPype/OpenPype.py b/openpype/modules/deadline/repository/custom/plugins/OpenPype/OpenPype.py index 451d71fb630..b84560f1752 100644 --- a/openpype/modules/deadline/repository/custom/plugins/OpenPype/OpenPype.py +++ b/openpype/modules/deadline/repository/custom/plugins/OpenPype/OpenPype.py @@ -1,10 +1,18 @@ +#!/usr/bin/env python3 + from System.IO import Path from System.Text.RegularExpressions import Regex from Deadline.Plugins import PluginType, DeadlinePlugin -from Deadline.Scripting import StringUtils, FileUtils, RepositoryUtils +from Deadline.Scripting import ( + StringUtils, + FileUtils, + DirectoryUtils, + RepositoryUtils +) import re +import os ###################################################################### @@ -52,13 +60,83 @@ def InitializeProcess(self): self.AddStdoutHandlerCallback( ".*Progress: (\d+)%.*").HandleCallback += self.HandleProgress + @staticmethod + def get_openpype_version_from_path(path): + version_file = os.path.join(path, "openpype", "version.py") + if not os.path.isfile(version_file): + return None + version = {} + with open(version_file, "r") as vf: + exec(vf.read(), version) + + version_match = re.search(r"(\d+\.\d+.\d+).*", version["__version__"]) + return version_match[1] + def RenderExecutable(self): - exeList = self.GetConfigEntry("OpenPypeExecutable") - exe = FileUtils.SearchFileList(exeList) + job = self.GetJob() + openpype_versions = [] + # if the job requires specific OpenPype version, + # lets go over all available and find compatible build. 
+        requested_version = job.GetJobEnvironmentKeyValue("OPENPYPE_VERSION")
+        if requested_version:
+            self.LogInfo((
+                "Scanning for compatible requested "
+                f"version {requested_version}"))
+            dir_list = self.GetConfigEntry("OpenPypeInstallationDirs")
+            install_dir = DirectoryUtils.SearchDirectoryList(dir_list)
+            if install_dir:
+                sub_dirs = [
+                    f.path for f in os.scandir(install_dir)
+                    if f.is_dir()
+                ]
+                for subdir in sub_dirs:
+                    version = self.get_openpype_version_from_path(subdir)
+                    if not version:
+                        continue
+                    openpype_versions.append((version, subdir))
+
+        exe_list = self.GetConfigEntry("OpenPypeExecutable")
+        exe = FileUtils.SearchFileList(exe_list)
+        if openpype_versions:
+            # if looking for requested compatible version,
+            # add the implicitly specified one to the list too.
+            version = self.get_openpype_version_from_path(
+                os.path.dirname(exe))
+            if version:
+                openpype_versions.append((version, os.path.dirname(exe)))
+
+        if requested_version:
+            # sort detected versions
+            if openpype_versions:
+                openpype_versions.sort(key=lambda ver: ver[0])
+            requested_major, requested_minor, _ = requested_version.split(".")[:3]  # noqa: E501
+            compatible_versions = []
+            for version in openpype_versions:
+                v = version[0].split(".")[:3]
+                if v[0] == requested_major and v[1] == requested_minor:
+                    compatible_versions.append(version)
+            if not compatible_versions:
+                self.FailRender(("Cannot find compatible version available "
+                                 "for version {} requested by the job. "
+                                 "Please add it through plugin configuration "
+                                 "in Deadline or install it to configured "
+                                 "directory.").format(requested_version))
+            # sort compatible versions and pick the last one
+            compatible_versions.sort(key=lambda ver: ver[0])
+            # create a list of executables for different platforms and let
+            # Deadline decide.
+            exe_list = [
+                os.path.join(
+                    compatible_versions[-1][1], "openpype_console.exe"),
+                os.path.join(
+                    compatible_versions[-1][1], "openpype_console")
+            ]
+            exe = FileUtils.SearchFileList(";".join(exe_list))
+
         if exe == "":
             self.FailRender(
                 "OpenPype executable was not found " +
-                "in the semicolon separated list \"" + exeList + "\". " +
+                "in the semicolon separated list \"" + exe_list + "\". " +
                 "The path to the render executable can be configured " +
                 "from the Plugin Configuration in the Deadline Monitor.")
         return exe
diff --git a/openpype/pipeline/context_tools.py b/openpype/pipeline/context_tools.py
index e719e465144..80ad939ccd1 100644
--- a/openpype/pipeline/context_tools.py
+++ b/openpype/pipeline/context_tools.py
@@ -10,6 +10,11 @@ from pyblish.lib import MessageHandler
 import openpype
+from openpype.client import (
+    get_project,
+    get_asset_by_id,
+    get_asset_by_name,
+)
 from openpype.modules import load_modules, ModulesManager
 from openpype.settings import get_project_settings
 from openpype.lib import filter_pyblish_plugins
@@ -240,29 +245,7 @@ def registered_host():


 def deregister_host():
-    _registered_host["_"] = default_host()
-
-
-def default_host():
-    """A default host, in place of anything better
-
-    This may be considered as reference for the
-    interface a host must implement. It also ensures
-    that the system runs, even when nothing is there
-    to support it.
-
-    """
-
-    host = types.ModuleType("defaultHost")
-
-    def ls():
-        return list()
-
-    host.__dict__.update({
-        "ls": ls
-    })
-
-    return host
+    _registered_host["_"] = None


 def debug_host():
@@ -304,3 +287,50 @@ def ls():
     })

     return host
+
+
+def get_current_project(fields=None):
+    """Helper function to get project document based on global Session.
+ + This function should be called only in process where host is installed. + + Returns: + dict: Project document. + None: Project is not set. + """ + + project_name = legacy_io.active_project() + return get_project(project_name, fields=fields) + + +def get_current_project_asset(asset_name=None, asset_id=None, fields=None): + """Helper function to get asset document based on global Session. + + This function should be called only in process where host is installed. + + Asset is found out based on passed asset name or id (not both). Asset name + is not used for filtering if asset id is passed. When both asset name and + id are missing then asset name from current process is used. + + Args: + asset_name (str): Name of asset used for filter. + asset_id (Union[str, ObjectId]): Asset document id. If entered then + is used as only filter. + fields (Union[List[str], None]): Limit returned data of asset documents + to specific keys. + + Returns: + dict: Asset document. + None: Asset is not set or not exist. + """ + + project_name = legacy_io.active_project() + if asset_id: + return get_asset_by_id(project_name, asset_id, fields=fields) + + if not asset_name: + asset_name = legacy_io.Session.get("AVALON_ASSET") + # Skip if is not set even on context + if not asset_name: + return None + return get_asset_by_name(project_name, asset_name, fields=fields) diff --git a/openpype/plugins/publish/extract_review_slate.py b/openpype/plugins/publish/extract_review_slate.py index 28685c2e90c..69043ee2618 100644 --- a/openpype/plugins/publish/extract_review_slate.py +++ b/openpype/plugins/publish/extract_review_slate.py @@ -285,36 +285,34 @@ def process(self, instance): audio_channels, audio_sample_rate, audio_channel_layout, + input_frame_rate ) # replace slate with silent slate for concat slate_v_path = slate_silent_path - # create ffmpeg concat text file path - conc_text_file = input_file.replace(ext, "") + "_concat" + ".txt" - conc_text_path = os.path.join( - os.path.normpath(stagingdir), conc_text_file) - _remove_at_end.append(conc_text_path) - self.log.debug("__ conc_text_path: {}".format(conc_text_path)) - - new_line = "\n" - with open(conc_text_path, "w") as conc_text_f: - conc_text_f.writelines([ - "file {}".format( - slate_v_path.replace("\\", "/")), - new_line, - "file {}".format(input_path.replace("\\", "/")) - ]) - - # concat slate and videos together + # concat slate and videos together with concat filter + # this will reencode the output + if input_audio: + fmap = [ + "-filter_complex", + "[0:v] [0:a] [1:v] [1:a] concat=n=2:v=1:a=1 [v] [a]", + "-map", '[v]', + "-map", '[a]' + ] + else: + fmap = [ + "-filter_complex", + "[0:v] [1:v] concat=n=2:v=1:a=0 [v]", + "-map", '[v]' + ] concat_args = [ ffmpeg_path, "-y", - "-f", "concat", - "-safe", "0", - "-i", conc_text_path, - "-c", "copy", + "-i", slate_v_path, + "-i", input_path, ] + concat_args.extend(fmap) if offset_timecode: concat_args.extend(["-timecode", offset_timecode]) # NOTE: Added because of OP Atom demuxers @@ -322,12 +320,18 @@ def process(self, instance): # - keep format of output if format_args: concat_args.extend(format_args) + + if codec_args: + concat_args.extend(codec_args) + # Use arguments from ffmpeg preset source_ffmpeg_cmd = repre.get("ffmpeg_cmd") if source_ffmpeg_cmd: copy_args = ( "-metadata", "-metadata:s:v:0", + "-b:v", + "-b:a", ) args = source_ffmpeg_cmd.split(" ") for indx, arg in enumerate(args): @@ -335,12 +339,14 @@ def process(self, instance): concat_args.append(arg) # assumes arg has one parameter 
concat_args.append(args[indx + 1]) + # add final output path concat_args.append(output_path) # ffmpeg concat subprocess self.log.debug( - "Executing concat: {}".format(" ".join(concat_args)) + "Executing concat filter: {}".format + (" ".join(concat_args)) ) openpype.api.run_subprocess( concat_args, logger=self.log @@ -488,9 +494,10 @@ def _create_silent_slate( audio_channels, audio_sample_rate, audio_channel_layout, + input_frame_rate ): # Get duration of one frame in micro seconds - items = audio_sample_rate.split("/") + items = input_frame_rate.split("/") if len(items) == 1: one_frame_duration = 1.0 / float(items[0]) elif len(items) == 2: diff --git a/openpype/plugins/publish/integrate.py b/openpype/plugins/publish/integrate.py new file mode 100644 index 00000000000..8532691e613 --- /dev/null +++ b/openpype/plugins/publish/integrate.py @@ -0,0 +1,908 @@ +import os +import logging +import sys +import copy +import clique +import six + +from bson.objectid import ObjectId +from pymongo import DeleteMany, ReplaceOne, InsertOne, UpdateOne +import pyblish.api + +import openpype.api +from openpype.lib.profiles_filtering import filter_profiles +from openpype.lib.file_transaction import FileTransaction +from openpype.pipeline import legacy_io +from openpype.pipeline.publish import KnownPublishError + +log = logging.getLogger(__name__) + + +def assemble(files): + """Convenience `clique.assemble` wrapper for files of a single collection. + + Unlike `clique.assemble` this wrapper does not allow more than a single + Collection nor any remainder files. Errors will be raised when not only + a single collection is assembled. + + Returns: + clique.Collection: A single sequence Collection + + Raises: + ValueError: Error is raised when files do not result in a single + collected Collection. + + """ + # todo: move this to lib? + # Get the sequence as a collection. The files must be of a single + # sequence and have no remainder outside of the collections. + patterns = [clique.PATTERNS["frames"]] + collections, remainder = clique.assemble(files, + minimum_items=1, + patterns=patterns) + if not collections: + raise ValueError("No collections found in files: " + "{}".format(files)) + if remainder: + raise ValueError("Files found not detected as part" + " of a sequence: {}".format(remainder)) + if len(collections) > 1: + raise ValueError("Files in sequence are not part of a" + " single sequence collection: " + "{}".format(collections)) + return collections[0] + + +def get_instance_families(instance): + """Get all families of the instance""" + # todo: move this to lib? + family = instance.data.get("family") + families = [] + if family: + families.append(family) + + for _family in (instance.data.get("families") or []): + if _family not in families: + families.append(_family) + + return families + + +def get_frame_padded(frame, padding): + """Return frame number as string with `padding` amount of padded zeros""" + return "{frame:0{padding}d}".format(padding=padding, frame=frame) + + +def get_first_frame_padded(collection): + """Return first frame as padded number from `clique.Collection`""" + start_frame = next(iter(collection.indexes)) + return get_frame_padded(start_frame, padding=collection.padding) + + +class IntegrateAsset(pyblish.api.InstancePlugin): + """Register publish in the database and transfer files to destinations. 
+ + Steps: + 1) Register the subset and version + 2) Transfer the representation files to the destination + 3) Register the representation + + Requires: + instance.data['representations'] - must be a list and each member + must be a dictionary with following data: + 'files': list of filenames for sequence, string for single file. + Only the filename is allowed, without the folder path. + 'stagingDir': "path/to/folder/with/files" + 'name': representation name (usually the same as extension) + 'ext': file extension + optional data + "frameStart" + "frameEnd" + 'fps' + "data": additional metadata for each representation. + """ + + label = "Integrate Asset" + order = pyblish.api.IntegratorOrder + families = ["workfile", + "pointcache", + "camera", + "animation", + "model", + "mayaAscii", + "mayaScene", + "setdress", + "layout", + "ass", + "vdbcache", + "scene", + "vrayproxy", + "vrayscene_layer", + "render", + "prerender", + "imagesequence", + "review", + "rendersetup", + "rig", + "plate", + "look", + "audio", + "yetiRig", + "yeticache", + "nukenodes", + "gizmo", + "source", + "matchmove", + "image", + "assembly", + "fbx", + "textures", + "action", + "harmony.template", + "harmony.palette", + "editorial", + "background", + "camerarig", + "redshiftproxy", + "effect", + "xgen", + "hda", + "usd", + "staticMesh", + "skeletalMesh", + "mvLook", + "mvUsd", + "mvUsdComposition", + "mvUsdOverride", + "simpleUnrealTexture" + ] + exclude_families = ["clip", "render.farm"] + default_template_name = "publish" + + # Representation context keys that should always be written to + # the database even if not used by the destination template + db_representation_context_keys = [ + "project", "asset", "task", "subset", "version", "representation", + "family", "hierarchy", "username" + ] + skip_host_families = [] + + def process(self, instance): + if self._temp_skip_instance_by_settings(instance): + return + + # Mark instance as processed for legacy integrator + instance.data["processedWithNewIntegrator"] = True + + # Instance should be integrated on a farm + if instance.data.get("farm"): + self.log.info( + "Instance is marked to be processed on farm. Skipping") + return + + filtered_repres = self.filter_representations(instance) + # Skip instance if there are not representations to integrate + # all representations should not be integrated + if not filtered_repres: + self.log.warning(( + "Skipping, there are no representations" + " to integrate for instance {}" + ).format(instance.data["family"])) + return + + # Exclude instances that also contain families from exclude families + families = set(get_instance_families(instance)) + exclude = families & set(self.exclude_families) + if exclude: + self.log.debug("Instance not integrated due to exclude " + "families found: {}".format(", ".join(exclude))) + return + + file_transactions = FileTransaction(log=self.log) + try: + self.register(instance, file_transactions, filtered_repres) + except Exception: + # clean destination + # todo: preferably we'd also rollback *any* changes to the database + file_transactions.rollback() + self.log.critical("Error when registering", exc_info=True) + six.reraise(*sys.exc_info()) + + # Finalizing can't rollback safely so no use for moving it to + # the try, except. + file_transactions.finalize() + + def _temp_skip_instance_by_settings(self, instance): + """Decide if instance will be processed with new or legacy integrator. + + This is temporary solution until we test all usecases with new (this) + integrator plugin. 
+ """ + + host_name = instance.context.data["hostName"] + instance_family = instance.data["family"] + instance_families = set(instance.data.get("families") or []) + + skip = False + for item in self.skip_host_families: + if host_name not in item["host"]: + continue + + families = set(item["families"]) + if instance_family in families: + skip = True + break + + for family in instance_families: + if family in families: + skip = True + break + + if skip: + break + + if skip: + self.log.debug("Instance is marked to be skipped by settings.") + return skip + + def filter_representations(self, instance): + # Prepare repsentations that should be integrated + repres = instance.data.get("representations") + # Raise error if instance don't have any representations + if not repres: + raise KnownPublishError( + "Instance {} has no representations to integrate".format( + instance.data["family"] + ) + ) + + # Validate type of stored representations + if not isinstance(repres, (list, tuple)): + raise TypeError( + "Instance 'files' must be a list, got: {0} {1}".format( + str(type(repres)), str(repres) + ) + ) + + # Filter representations + filtered_repres = [] + for repre in repres: + if "delete" in repre.get("tags", []): + continue + filtered_repres.append(repre) + + return filtered_repres + + def register(self, instance, file_transactions, filtered_repres): + instance_stagingdir = instance.data.get("stagingDir") + if not instance_stagingdir: + self.log.info(( + "{0} is missing reference to staging directory." + " Will try to get it from representation." + ).format(instance)) + + else: + self.log.debug( + "Establishing staging directory " + "@ {0}".format(instance_stagingdir) + ) + + template_name = self.get_template_name(instance) + + subset, subset_writes = self.prepare_subset(instance) + version, version_writes = self.prepare_version(instance, subset) + instance.data["versionEntity"] = version + + # Get existing representations (if any) + existing_repres_by_name = { + repres["name"].lower(): repres for repres in legacy_io.find( + { + "parent": version["_id"], + "type": "representation" + }, + # Only care about id and name of existing representations + projection={"_id": True, "name": True} + ) + } + + # Prepare all representations + prepared_representations = [] + for repre in filtered_repres: + # todo: reduce/simplify what is returned from this function + prepared = self.prepare_representation( + repre, + template_name, + existing_repres_by_name, + version, + instance_stagingdir, + instance) + + for src, dst in prepared["transfers"]: + # todo: add support for hardlink transfers + file_transactions.add(src, dst) + + prepared_representations.append(prepared) + + # Each instance can also have pre-defined transfers not explicitly + # part of a representation - like texture resources used by a + # .ma representation. Those destination paths are pre-defined, etc. + # todo: should we move or simplify this logic? 
+        resource_destinations = set()
+        for src, dst in instance.data.get("transfers", []):
+            file_transactions.add(src, dst, mode=FileTransaction.MODE_COPY)
+            resource_destinations.add(os.path.abspath(dst))
+
+        for src, dst in instance.data.get("hardlinks", []):
+            file_transactions.add(src, dst, mode=FileTransaction.MODE_HARDLINK)
+            resource_destinations.add(os.path.abspath(dst))
+
+        # Bulk write to the database
+        # We write the subset and version to the database before the File
+        # Transaction to reduce the chances of another publish trying to
+        # publish to the same version number since that chance can greatly
+        # increase if the file transaction takes a long time.
+        legacy_io.bulk_write(subset_writes + version_writes)
+        self.log.info("Subset {subset[name]} and Version {version[name]} "
+                      "written to the database ...".format(subset=subset,
+                                                           version=version))
+
+        # Process all file transfers of all integrations now
+        self.log.debug("Integrating source files to destination ...")
+        file_transactions.process()
+        self.log.debug(
+            "Backed up existing files: {}".format(file_transactions.backups))
+        self.log.debug(
+            "Transferred files: {}".format(file_transactions.transferred))
+        self.log.debug("Retrieving Representation Site Sync information ...")
+
+        # Get the accessible sites for Site Sync
+        modules_by_name = instance.context.data["openPypeModules"]
+        sync_server_module = modules_by_name["sync_server"]
+        sites = sync_server_module.compute_resource_sync_sites(
+            project_name=instance.data["projectEntity"]["name"]
+        )
+        self.log.debug("Sync Server Sites: {}".format(sites))
+
+        # Compute the resource file infos once (files belonging to the
+        # version instance instead of an individual representation) so
+        # we can re-use those file infos per representation
+        anatomy = instance.context.data["anatomy"]
+        resource_file_infos = self.get_files_info(resource_destinations,
+                                                  sites=sites,
+                                                  anatomy=anatomy)
+
+        # Finalize the representations now the published files are integrated
+        # Get 'files' info for representations and their attached resources
+        representation_writes = []
+        new_repre_names_low = set()
+        for prepared in prepared_representations:
+            representation = prepared["representation"]
+            transfers = prepared["transfers"]
+            destinations = [dst for src, dst in transfers]
+            representation["files"] = self.get_files_info(
+                destinations, sites=sites, anatomy=anatomy
+            )
+
+            # Add the version resource file infos to each representation
+            representation["files"] += resource_file_infos
+
+            # Set up representation for writing to the database. Since
+            # we *might* be overwriting an existing entry if the version
+            # already existed, we'll use ReplaceOne with `upsert=True`
+            representation_writes.append(ReplaceOne(
+                filter={"_id": representation["_id"]},
+                replacement=representation,
+                upsert=True
+            ))
+
+            new_repre_names_low.add(representation["name"].lower())
+
+        # Delete any existing representations that didn't get any new data
+        # if the instance is not set to append mode
+        if not instance.data.get("append", False):
+            delete_names = set()
+            for name, existing_repres in existing_repres_by_name.items():
+                if name not in new_repre_names_low:
+                    # We add the exact representation name because `name` is
+                    # lowercase for name matching only and not in the database
+                    delete_names.add(existing_repres["name"])
+            if delete_names:
+                representation_writes.append(DeleteMany(
+                    filter={
+                        "parent": version["_id"],
+                        "name": {"$in": list(delete_names)}
+                    }
+                ))
+
+        # Write representations to the database
+        legacy_io.bulk_write(representation_writes)
+
+        # Backwards compatibility
+        # todo: can we avoid the need to store this?
+        instance.data["published_representations"] = {
+            p["representation"]["_id"]: p for p in prepared_representations
+        }
+
+        self.log.info("Registered {} representations".format(
+            len(prepared_representations)))
+
+    def prepare_subset(self, instance):
+        asset = instance.data.get("assetEntity")
+        subset_name = instance.data["subset"]
+        self.log.debug("Subset: {}".format(subset_name))
+
+        # Get existing subset if it exists
+        subset = legacy_io.find_one({
+            "type": "subset",
+            "parent": asset["_id"],
+            "name": subset_name
+        })
+
+        # Define subset data
+        data = {
+            "families": get_instance_families(instance)
+        }
+
+        subset_group = instance.data.get("subsetGroup")
+        if subset_group:
+            data["subsetGroup"] = subset_group
+
+        bulk_writes = []
+        if subset is None:
+            # Create a new subset
+            self.log.info("Subset '%s' not found, creating ..." % subset_name)
+            subset = {
+                "_id": ObjectId(),
+                "schema": "openpype:subset-3.0",
+                "type": "subset",
+                "name": subset_name,
+                "data": data,
+                "parent": asset["_id"]
+            }
+            bulk_writes.append(InsertOne(subset))
+
+        else:
+            # Update existing subset data with new data and set in database.
+            # We also change the found subset in-place so we don't need to
+            # re-query the subset afterwards
+            subset["data"].update(data)
+            bulk_writes.append(UpdateOne(
+                {"type": "subset", "_id": subset["_id"]},
+                {"$set": {
+                    "data": subset["data"]
+                }}
+            ))
+
+        self.log.info("Prepared subset: {}".format(subset_name))
+        return subset, bulk_writes
+
+    def prepare_version(self, instance, subset):
+
+        version_number = instance.data["version"]
+
+        version = {
+            "schema": "openpype:version-3.0",
+            "type": "version",
+            "parent": subset["_id"],
+            "name": version_number,
+            "data": self.create_version_data(instance)
+        }
+
+        existing_version = legacy_io.find_one({
+            'type': 'version',
+            'parent': subset["_id"],
+            'name': version_number
+        }, projection={"_id": True})
+
+        if existing_version:
+            self.log.debug("Updating existing version ...")
+            version["_id"] = existing_version["_id"]
+        else:
+            self.log.debug("Creating new version ...")
+            version["_id"] = ObjectId()
+
+        bulk_writes = [ReplaceOne(
+            filter={"_id": version["_id"]},
+            replacement=version,
+            upsert=True
+        )]
+
+        self.log.info("Prepared version: v{0:03d}".format(version["name"]))
+
+        return version, bulk_writes
+
+    def prepare_representation(self, repre,
+                               template_name,
+                               existing_repres_by_name,
+                               version,
+                               instance_stagingdir,
+                               instance):
+
+        # pre-flight validations
+        if repre["ext"].startswith("."):
+            raise ValueError("Extension must not start with a dot '.': "
+                             "{}".format(repre["ext"]))
+
+        if repre.get("transfers"):
+            raise ValueError("Representation is not allowed to have "
+                             "transfers data before integration. They are "
+                             "computed in the integrator. "
+                             "Got: {}".format(repre["transfers"]))
+
+        # create template data for Anatomy
+        template_data = copy.deepcopy(instance.data["anatomyData"])
+
+        # required representation keys
+        files = repre['files']
+        template_data["representation"] = repre["name"]
+        template_data["ext"] = repre["ext"]
+
+        # optionals
+        # retrieve additional anatomy data from representation if it exists
+        for key, anatomy_key in {
+            # Representation Key: Anatomy data key
+            "resolutionWidth": "resolution_width",
+            "resolutionHeight": "resolution_height",
+            "fps": "fps",
+            "outputName": "output",
+            "originalBasename": "originalBasename"
+        }.items():
+            # Allow to take value from representation;
+            # if not found, also consider instance.data
+            if key in repre:
+                value = repre[key]
+            elif key in instance.data:
+                value = instance.data[key]
+            else:
+                continue
+            template_data[anatomy_key] = value
+
+        if repre.get('stagingDir'):
+            stagingdir = repre['stagingDir']
+        else:
+            # Fall back to instance staging dir if not explicitly
+            # set for representation in the instance
+            self.log.debug("Representation uses instance staging dir: "
+                           "{}".format(instance_stagingdir))
+            stagingdir = instance_stagingdir
+        if not stagingdir:
+            raise ValueError("No staging directory set for representation: "
+                             "{}".format(repre))
+
+        self.log.debug("Anatomy template name: {}".format(template_name))
+        anatomy = instance.context.data['anatomy']
+        template = os.path.normpath(anatomy.templates[template_name]["path"])
+
+        is_udim = bool(repre.get("udim"))
+        is_sequence_representation = isinstance(files, (list, tuple))
+        if is_sequence_representation:
+            # Collection of files (sequence)
+            assert not any(os.path.isabs(fname) for fname in files), (
+                "Given file names contain full paths"
+            )
+
+            src_collection = assemble(files)
+
+            # If the representation has `frameStart` set it renumbers the
+            # frame indices of the published collection. It will start from
+            # that `frameStart` index instead. Thus if that frame start
+            # differs from the collection we want to shift the destination
+            # frame indices from the source collection.
+            destination_indexes = list(src_collection.indexes)
+            destination_padding = len(get_first_frame_padded(src_collection))
+            if repre.get("frameStart") is not None and not is_udim:
+                index_frame_start = int(repre.get("frameStart"))
+
+                render_template = anatomy.templates[template_name]
+                # todo: should we ALWAYS manage the frame padding even
+                # when not having `frameStart` set?
+                frame_start_padding = int(
+                    render_template.get(
+                        "frame_padding",
+                        render_template.get("padding")
+                    )
+                )
+
+                # Shift destination sequence to the start frame
+                src_start_frame = next(iter(src_collection.indexes))
+                shift = index_frame_start - src_start_frame
+                if shift:
+                    destination_indexes = [
+                        frame + shift for frame in destination_indexes
+                    ]
+                destination_padding = frame_start_padding
+
+            # To construct the destination template with anatomy we require
+            # a Frame or UDIM tile set for the template data. We use the first
+            # index of the destination for that because that could've shifted
+            # from the source indexes, etc.
+            first_index_padded = get_frame_padded(
+                frame=destination_indexes[0],
+                padding=destination_padding)
+            if is_udim:
+                # UDIM representations handle ranges in a different manner
+                template_data["udim"] = first_index_padded
+            else:
+                template_data["frame"] = first_index_padded
+
+            # Construct destination collection from template
+            anatomy_filled = anatomy.format(template_data)
+            template_filled = anatomy_filled[template_name]["path"]
+            repre_context = template_filled.used_values
+            self.log.debug("Template filled: {}".format(str(template_filled)))
+            dst_collection = assemble([os.path.normpath(template_filled)])
+
+            # Update the destination indexes and padding
+            dst_collection.indexes.clear()
+            dst_collection.indexes.update(set(destination_indexes))
+            dst_collection.padding = destination_padding
+            assert (
+                len(src_collection.indexes) == len(dst_collection.indexes)
+            ), "This is a bug"
+
+            # Multiple file transfers
+            transfers = []
+            for src_file_name, dst in zip(src_collection, dst_collection):
+                src = os.path.join(stagingdir, src_file_name)
+                transfers.append((src, dst))
+
+        else:
+            # Single file
+            fname = files
+            assert not os.path.isabs(fname), (
+                "Given file name is a full path"
+            )
+
+            # Manage anatomy template data
+            template_data.pop("frame", None)
+            if is_udim:
+                template_data["udim"] = repre["udim"][0]
+
+            # Construct destination filepath from template
+            anatomy_filled = anatomy.format(template_data)
+            template_filled = anatomy_filled[template_name]["path"]
+            repre_context = template_filled.used_values
+            dst = os.path.normpath(template_filled)
+
+            # Single file transfer
+            src = os.path.join(stagingdir, fname)
+            transfers = [(src, dst)]
+
+        # todo: Are we sure the assumption each representation
+        # ends up in the same folder is valid?
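+        # 'publishDir' is filled from the first resolved template's "folder"
+        # and is set only once per instance.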
+ if not instance.data.get("publishDir"): + instance.data["publishDir"] = ( + anatomy_filled + [template_name] + ["folder"] + ) + + for key in self.db_representation_context_keys: + # Also add these values to the context even if not used by the + # destination template + value = template_data.get(key) + if not value: + continue + repre_context[key] = template_data[key] + + # Explicitly store the full list even though template data might + # have a different value because it uses just a single udim tile + if repre.get("udim"): + repre_context["udim"] = repre.get("udim") # store list + + # Use previous representation's id if there is a name match + existing = existing_repres_by_name.get(repre["name"].lower()) + if existing: + repre_id = existing["_id"] + else: + repre_id = ObjectId() + + # Backwards compatibility: + # Store first transferred destination as published path data + # todo: can we remove this? + # todo: We shouldn't change data that makes its way back into + # instance.data[] until we know the publish actually succeeded + # otherwise `published_path` might not actually be valid? + published_path = transfers[0][1] + repre["published_path"] = published_path # Backwards compatibility + + # todo: `repre` is not the actual `representation` entity + # we should simplify/clarify difference between data above + # and the actual representation entity for the database + data = repre.get("data", {}) + data.update({'path': published_path, 'template': template}) + representation = { + "_id": repre_id, + "schema": "openpype:representation-2.0", + "type": "representation", + "parent": version["_id"], + "name": repre['name'], + "data": data, + + # Imprint shortcut to context for performance reasons. + "context": repre_context + } + + # todo: simplify/streamline which additional data makes its way into + # the representation context + if repre.get("outputName"): + representation["context"]["output"] = repre['outputName'] + + if is_sequence_representation and repre.get("frameStart") is not None: + representation['context']['frame'] = template_data["frame"] + + return { + "representation": representation, + "anatomy_data": template_data, + "transfers": transfers, + # todo: avoid the need for 'published_files' used by Integrate Hero + # backwards compatibility + "published_files": [transfer[1] for transfer in transfers] + } + + def create_version_data(self, instance): + """Create the data dictionary for the version + + Args: + instance: the current instance being published + + Returns: + dict: the required information for version["data"] + """ + + context = instance.context + + # create relative source path for DB + if "source" in instance.data: + source = instance.data["source"] + else: + source = context.data["currentFile"] + anatomy = instance.context.data["anatomy"] + source = self.get_rootless_path(anatomy, source) + self.log.debug("Source: {}".format(source)) + + version_data = { + "families": get_instance_families(instance), + "time": context.data["time"], + "author": context.data["user"], + "source": source, + "comment": context.data.get("comment"), + "machine": context.data.get("machine"), + "fps": instance.data.get("fps", context.data.get("fps")) + } + + # todo: preferably we wouldn't need this "if dict" etc. logic and + # instead be able to rely what the input value is if it's set. 
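+        # 'intent' may arrive either as a plain value or as a dict holding
+        # the value under a "value" key; reduce it before storing.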
+        intent_value = context.data.get("intent")
+        if intent_value and isinstance(intent_value, dict):
+            intent_value = intent_value.get("value")
+
+        if intent_value:
+            version_data["intent"] = intent_value
+
+        # Include optional data if present in instance.data
+        optionals = [
+            "frameStart", "frameEnd", "step", "handles",
+            "handleEnd", "handleStart", "sourceHashes"
+        ]
+        for key in optionals:
+            if key in instance.data:
+                version_data[key] = instance.data[key]
+
+        # Include instance.data[versionData] directly
+        version_data_instance = instance.data.get('versionData')
+        if version_data_instance:
+            version_data.update(version_data_instance)
+
+        return version_data
+
+    def get_template_name(self, instance):
+        """Return anatomy template name to use for integration"""
+        # Define publish template name from profiles
+        filter_criteria = self.get_profile_filter_criteria(instance)
+        template_name_profiles = self._get_template_name_profiles(instance)
+        profile = filter_profiles(
+            template_name_profiles,
+            filter_criteria,
+            logger=self.log
+        )
+
+        if profile:
+            return profile["template_name"]
+        return self.default_template_name
+
+    def _get_template_name_profiles(self, instance):
+        """Receive profiles for publish template keys.
+
+        Reuse template name profiles from the legacy integrator. The goal is
+        to move the profile settings out of plugin settings, but until that
+        happens we want to be able to set them in one place without breaking
+        backwards compatibility (more than once).
+        """
+
+        return (
+            instance.context.data["project_settings"]
+            ["global"]
+            ["publish"]
+            ["IntegrateAssetNew"]
+            ["template_name_profiles"]
+        )
+
+    def get_profile_filter_criteria(self, instance):
+        """Return filter criteria for `filter_profiles`"""
+        # Anatomy data is pre-filled by Collectors
+        anatomy_data = instance.data["anatomyData"]
+
+        # Task can be optional in anatomy data
+        task = anatomy_data.get("task", {})
+
+        # Return filter criteria
+        return {
+            "families": anatomy_data["family"],
+            "tasks": task.get("name"),
+            "task_types": task.get("type"),
+            "hosts": instance.context.data["hostName"],
+        }
+
+    def get_rootless_path(self, anatomy, path):
+        """Return, if possible, a path without the absolute portion from
+        the root (e.g. 'c:\' or '/opt/..')
+
+        This information is platform dependent and shouldn't be captured.
+        Example:
+            'c:/projects/MyProject1/Assets/publish...' >
+            '{root}/MyProject1/Assets...'
+
+        Args:
+            anatomy: anatomy part from instance
+            path: path (absolute)
+        Returns:
+            path: modified path if possible, or unmodified path
+            + warning logged
+        """
+        success, rootless_path = anatomy.find_root_template_from_path(path)
+        if success:
+            path = rootless_path
+        else:
+            self.log.warning((
+                "Could not find root path for remapping \"{}\"."
+                " This may cause issues on farm."
+            ).format(path))
+        return path
+
+    def get_files_info(self, destinations, sites, anatomy):
+        """Prepare 'files' info portion for representations.
+ + Arguments: + destinations (list): List of transferred file destinations + sites (list): array of published locations + anatomy: anatomy part from instance + Returns: + output_resources: array of dictionaries to be added to 'files' key + in representation + """ + file_infos = [] + for file_path in destinations: + file_info = self.prepare_file_info(file_path, anatomy, sites=sites) + file_infos.append(file_info) + return file_infos + + def prepare_file_info(self, path, anatomy, sites): + """ Prepare information for one file (asset or resource) + + Arguments: + path: destination url of published file + anatomy: anatomy part from instance + sites: array of published locations, + [ {'name':'studio', 'created_dt':date} by default + keys expected ['studio', 'site1', 'gdrive1'] + + Returns: + dict: file info dictionary + """ + return { + "_id": ObjectId(), + "path": self.get_rootless_path(anatomy, path), + "size": os.path.getsize(path), + "hash": openpype.api.source_hash(path), + "sites": sites + } diff --git a/openpype/plugins/publish/integrate_new.py b/openpype/plugins/publish/integrate_legacy.py similarity index 99% rename from openpype/plugins/publish/integrate_new.py rename to openpype/plugins/publish/integrate_legacy.py index f870220421c..b90b61f5873 100644 --- a/openpype/plugins/publish/integrate_new.py +++ b/openpype/plugins/publish/integrate_legacy.py @@ -69,8 +69,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): "data": additional metadata for each representation. """ - label = "Integrate Asset New" - order = pyblish.api.IntegratorOrder + label = "Integrate Asset (legacy)" + # Make sure it happens after new integrator + order = pyblish.api.IntegratorOrder + 0.00001 families = ["workfile", "pointcache", "camera", @@ -101,7 +102,6 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): "source", "matchmove", "image", - "source", "assembly", "fbx", "textures", @@ -142,6 +142,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): subset_grouping_profiles = None def process(self, instance): + if instance.data.get("processedWithNewIntegrator"): + self.log.info("Instance was already processed with new integrator") + return + for ef in self.exclude_families: if ( instance.data["family"] == ef or diff --git a/openpype/plugins/publish/integrate_subset_group.py b/openpype/plugins/publish/integrate_subset_group.py new file mode 100644 index 00000000000..910cb060a63 --- /dev/null +++ b/openpype/plugins/publish/integrate_subset_group.py @@ -0,0 +1,98 @@ +"""Produces instance.data["subsetGroup"] data used during integration. + +Requires: + dict -> context["anatomyData"] *(pyblish.api.CollectorOrder + 0.49) + +Provides: + instance -> subsetGroup (str) + +""" +import pyblish.api + +from openpype.lib.profiles_filtering import filter_profiles +from openpype.lib import ( + prepare_template_data, + StringTemplate, + TemplateUnsolved +) + + +class IntegrateSubsetGroup(pyblish.api.InstancePlugin): + """Integrate Subset Group for publish.""" + + # Run after CollectAnatomyInstanceData + order = pyblish.api.IntegratorOrder - 0.1 + label = "Subset Group" + + # Attributes set by settings + subset_grouping_profiles = None + + def process(self, instance): + """Look into subset group profiles set by settings. + + Attribute 'subset_grouping_profiles' is defined by OpenPype settings. 
+ """ + + # Skip if 'subset_grouping_profiles' is empty + if not self.subset_grouping_profiles: + return + + if instance.data.get("subsetGroup"): + # If subsetGroup is already set then allow that value to remain + self.log.debug(( + "Skipping collect subset group due to existing value: {}" + ).format(instance.data["subsetGroup"])) + return + + # Skip if there is no matching profile + filter_criteria = self.get_profile_filter_criteria(instance) + profile = filter_profiles( + self.subset_grouping_profiles, + filter_criteria, + logger=self.log + ) + + if not profile: + return + + template = profile["template"] + + fill_pairs = prepare_template_data({ + "family": filter_criteria["families"], + "task": filter_criteria["tasks"], + "host": filter_criteria["hosts"], + "subset": instance.data["subset"], + "renderlayer": instance.data.get("renderlayer") + }) + + filled_template = None + try: + filled_template = StringTemplate.format_strict_template( + template, fill_pairs + ) + except (KeyError, TemplateUnsolved): + keys = fill_pairs.keys() + self.log.warning(( + "Subset grouping failed. Only {} are expected in Settings" + ).format(','.join(keys))) + + if filled_template: + instance.data["subsetGroup"] = filled_template + + def get_profile_filter_criteria(self, instance): + """Return filter criteria for `filter_profiles`""" + # TODO: This logic is used in much more plug-ins in one way or another + # Maybe better suited for lib? + # Anatomy data is pre-filled by Collectors + anatomy_data = instance.data["anatomyData"] + + # Task can be optional in anatomy data + task = anatomy_data.get("task", {}) + + # Return filter criteria + return { + "families": anatomy_data["family"], + "tasks": task.get("name"), + "hosts": anatomy_data["app"], + "task_types": task.get("type") + } diff --git a/openpype/settings/defaults/project_settings/global.json b/openpype/settings/defaults/project_settings/global.json index 6131ea19390..e509db2791e 100644 --- a/openpype/settings/defaults/project_settings/global.json +++ b/openpype/settings/defaults/project_settings/global.json @@ -159,7 +159,27 @@ } ] }, + "IntegrateSubsetGroup": { + "subset_grouping_profiles": [ + { + "families": [], + "hosts": [], + "task_types": [], + "tasks": [], + "template": "" + } + ] + }, "IntegrateAssetNew": { + "subset_grouping_profiles": [ + { + "families": [], + "hosts": [], + "task_types": [], + "tasks": [], + "template": "" + } + ], "template_name_profiles": [ { "families": [], @@ -202,17 +222,11 @@ "tasks": [], "template_name": "maya2unreal" } - ], - "subset_grouping_profiles": [ - { - "families": [], - "hosts": [], - "task_types": [], - "tasks": [], - "template": "" - } ] }, + "IntegrateAsset": { + "skip_host_families": [] + }, "IntegrateHeroVersion": { "enabled": true, "optional": true, diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json index a3cbf0cfcdc..b9d0b7daba3 100644 --- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json +++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json @@ -528,21 +528,21 @@ { "type": "dict", "collapsible": true, - "key": "IntegrateAssetNew", - "label": "IntegrateAssetNew", + "key": "IntegrateSubsetGroup", + "label": "Integrate Subset Group", "is_group": true, "children": [ { "type": "list", - "key": "template_name_profiles", - "label": "Template name profiles", + "key": "subset_grouping_profiles", + 
"label": "Subset grouping profiles", "use_label_wrap": true, "object_type": { "type": "dict", "children": [ { "type": "label", - "label": "" + "label": "Set all published instances as a part of specific group named according to 'Template'.
Implemented all variants of placeholders [{task},{family},{host},{subset},{renderlayer}]" }, { "key": "families", @@ -572,23 +572,78 @@ }, { "type": "text", - "key": "template_name", - "label": "Template name" + "key": "template", + "label": "Template" } ] } + } + ] + }, + { + "type": "dict", + "collapsible": true, + "key": "IntegrateAssetNew", + "label": "IntegrateAsset (Legacy)", + "is_group": true, + "children": [ + { + "type": "label", + "label": "NOTE: Subset grouping profiles settings were moved to Integrate Subset Group. Please move values there." }, { "type": "list", "key": "subset_grouping_profiles", - "label": "Subset grouping profiles", + "label": "Subset grouping profiles (DEPRECATED)", + "use_label_wrap": true, + "object_type": { + "type": "dict", + "children": [ + { + "key": "families", + "label": "Families", + "type": "list", + "object_type": "text" + }, + { + "type": "hosts-enum", + "key": "hosts", + "label": "Hosts", + "multiselection": true + }, + { + "key": "task_types", + "label": "Task types", + "type": "task-types-enum" + }, + { + "key": "tasks", + "label": "Task names", + "type": "list", + "object_type": "text" + }, + { + "type": "separator" + }, + { + "type": "text", + "key": "template", + "label": "Template" + } + ] + } + }, + { + "type": "list", + "key": "template_name_profiles", + "label": "Template name profiles", "use_label_wrap": true, "object_type": { "type": "dict", "children": [ { "type": "label", - "label": "Set all published instances as a part of specific group named according to 'Template'.
Implemented all variants of placeholders [{task},{family},{host},{subset},{renderlayer}]" + "label": "" }, { "key": "families", @@ -618,8 +673,39 @@ }, { "type": "text", - "key": "template", - "label": "Template" + "key": "template_name", + "label": "Template name" + } + ] + } + } + ] + }, + { + "type": "dict", + "collapsible": true, + "key": "IntegrateAsset", + "label": "Integrate Asset", + "is_group": true, + "children": [ + { + "type": "list", + "key": "skip_host_families", + "label": "Skip hosts and families", + "use_label_wrap": true, + "object_type": { + "type": "dict", + "children": [ + { + "type": "hosts-enum", + "key": "host", + "label": "Host" + }, + { + "type": "list", + "key": "families", + "label": "Families", + "object_type": "text" } ] } diff --git a/openpype/version.py b/openpype/version.py index dd5ad974497..9dda1eacce2 100644 --- a/openpype/version.py +++ b/openpype/version.py @@ -1,3 +1,3 @@ # -*- coding: utf-8 -*- """Package declaring Pype version.""" -__version__ = "3.12.2-nightly.2" +__version__ = "3.12.2-nightly.3" diff --git a/pyproject.toml b/pyproject.toml index 95522426946..eebc8a56004 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "OpenPype" -version = "3.12.2-nightly.2" # OpenPype +version = "3.12.2-nightly.3" # OpenPype description = "Open VFX and Animation pipeline with support." authors = ["OpenPype Team "] license = "MIT License" diff --git a/start.py b/start.py index ace33ab92ad..af8b5ffd411 100644 --- a/start.py +++ b/start.py @@ -103,6 +103,9 @@ import distutils.spawn from pathlib import Path + +silent_mode = False + # OPENPYPE_ROOT is variable pointing to build (or code) directory # WARNING `OPENPYPE_ROOT` must be defined before igniter import # - igniter changes cwd which cause that filepath of this script won't lead @@ -138,40 +141,44 @@ term = blessed.Terminal() def _print(message: str): + if silent_mode: + return if message.startswith("!!! "): - print("{}{}".format(term.orangered2("!!! "), message[4:])) + print(f'{term.orangered2("!!! ")}{message[4:]}') return if message.startswith(">>> "): - print("{}{}".format(term.aquamarine3(">>> "), message[4:])) + print(f'{term.aquamarine3(">>> ")}{message[4:]}') return if message.startswith("--- "): - print("{}{}".format(term.darkolivegreen3("--- "), message[4:])) + print(f'{term.darkolivegreen3("--- ")}{message[4:]}') return if message.startswith("*** "): - print("{}{}".format(term.gold("*** "), message[4:])) + print(f'{term.gold("*** ")}{message[4:]}') return if message.startswith(" - "): - print("{}{}".format(term.wheat(" - "), message[4:])) + print(f'{term.wheat(" - ")}{message[4:]}') return if message.startswith(" . "): - print("{}{}".format(term.tan(" . "), message[4:])) + print(f'{term.tan(" . ")}{message[4:]}') return if message.startswith(" - "): - print("{}{}".format(term.seagreen3(" - "), message[7:])) + print(f'{term.seagreen3(" - ")}{message[7:]}') return if message.startswith(" ! "): - print("{}{}".format(term.goldenrod(" ! "), message[7:])) + print(f'{term.goldenrod(" ! 
")}{message[7:]}') return if message.startswith(" * "): - print("{}{}".format(term.aquamarine1(" * "), message[7:])) + print(f'{term.aquamarine1(" * ")}{message[7:]}') return if message.startswith(" "): - print("{}{}".format(term.darkseagreen3(" "), message[4:])) + print(f'{term.darkseagreen3(" ")}{message[4:]}') return print(message) else: def _print(message: str): + if silent_mode: + return print(message) @@ -187,9 +194,8 @@ def _print(message: str): if "--headless" in sys.argv: os.environ["OPENPYPE_HEADLESS_MODE"] = "1" sys.argv.remove("--headless") -else: - if os.getenv("OPENPYPE_HEADLESS_MODE") != "1": - os.environ.pop("OPENPYPE_HEADLESS_MODE", None) +elif os.getenv("OPENPYPE_HEADLESS_MODE") != "1": + os.environ.pop("OPENPYPE_HEADLESS_MODE", None) # Enabled logging debug mode when "--debug" is passed if "--verbose" in sys.argv: @@ -203,8 +209,8 @@ def _print(message: str): value = sys.argv.pop(idx) else: raise RuntimeError(( - "Expect value after \"--verbose\" argument. {}" - ).format(expected_values)) + f"Expect value after \"--verbose\" argument. {expected_values}" + )) log_level = None low_value = value.lower() @@ -225,8 +231,9 @@ def _print(message: str): if log_level is None: raise RuntimeError(( - "Unexpected value after \"--verbose\" argument \"{}\". {}" - ).format(value, expected_values)) + "Unexpected value after \"--verbose\" " + f"argument \"{value}\". {expected_values}" + )) os.environ["OPENPYPE_LOG_LEVEL"] = str(log_level) @@ -242,13 +249,14 @@ def _print(message: str): get_openpype_global_settings, get_openpype_path_from_settings, validate_mongo_connection, - OpenPypeVersionNotFound + OpenPypeVersionNotFound, + OpenPypeVersionIncompatible ) # noqa from igniter.bootstrap_repos import OpenPypeVersion # noqa: E402 bootstrap = BootstrapRepos() silent_commands = {"run", "igniter", "standalonepublisher", - "extractenvironments"} + "extractenvironments", "version"} def list_versions(openpype_versions: list, local_version=None) -> None: @@ -333,34 +341,33 @@ def run_disk_mapping_commands(settings): destination = destination.rstrip('/') source = source.rstrip('/') - if low_platform == "windows": - args = ["subst", destination, source] - elif low_platform == "darwin": - scr = "do shell script \"ln -s {} {}\" with administrator privileges".format(source, destination) # noqa: E501 + if low_platform == "darwin": + scr = f'do shell script "ln -s {source} {destination}" with administrator privileges' # noqa + args = ["osascript", "-e", scr] + elif low_platform == "windows": + args = ["subst", destination, source] else: args = ["sudo", "ln", "-s", source, destination] - _print("disk mapping args:: {}".format(args)) + _print(f"*** disk mapping arguments: {args}") try: if not os.path.exists(destination): output = subprocess.Popen(args) if output.returncode and output.returncode != 0: - exc_msg = "Executing was not successful: \"{}\"".format( - args) + exc_msg = f'Executing was not successful: "{args}"' raise RuntimeError(exc_msg) except TypeError as exc: - _print("Error {} in mapping drive {}, {}".format(str(exc), - source, - destination)) + _print( + f"Error {str(exc)} in mapping drive {source}, {destination}") raise def set_avalon_environments(): """Set avalon specific environments. - These are non modifiable environments for avalon workflow that must be set + These are non-modifiable environments for avalon workflow that must be set before avalon module is imported because avalon works with globals set with environment variables. 
""" @@ -505,7 +512,7 @@ def _process_arguments() -> tuple: ) if m and m.group('version'): use_version = m.group('version') - _print(">>> Requested version [ {} ]".format(use_version)) + _print(f">>> Requested version [ {use_version} ]") if "+staging" in use_version: use_staging = True break @@ -611,8 +618,8 @@ def _determine_mongodb() -> str: try: openpype_mongo = bootstrap.secure_registry.get_item( "openPypeMongo") - except ValueError: - raise RuntimeError("Missing MongoDB url") + except ValueError as e: + raise RuntimeError("Missing MongoDB url") from e return openpype_mongo @@ -684,40 +691,47 @@ def _find_frozen_openpype(use_version: str = None, # Specific version is defined if use_version.lower() == "latest": # Version says to use latest version - _print("Finding latest version defined by use version") + _print(">>> Finding latest version defined by use version") openpype_version = bootstrap.find_latest_openpype_version( - use_staging + use_staging, compatible_with=installed_version ) else: - _print("Finding specified version \"{}\"".format(use_version)) + _print(f">>> Finding specified version \"{use_version}\"") openpype_version = bootstrap.find_openpype_version( use_version, use_staging ) if openpype_version is None: raise OpenPypeVersionNotFound( - "Requested version \"{}\" was not found.".format( - use_version - ) + f"Requested version \"{use_version}\" was not found." ) + if not openpype_version.is_compatible(installed_version): + raise OpenPypeVersionIncompatible(( + f"Requested version \"{use_version}\" is not compatible " + f"with installed version \"{installed_version}\"" + )) + elif studio_version is not None: # Studio has defined a version to use - _print("Finding studio version \"{}\"".format(studio_version)) + _print(f">>> Finding studio version \"{studio_version}\"") openpype_version = bootstrap.find_openpype_version( - studio_version, use_staging + studio_version, use_staging, compatible_with=installed_version ) if openpype_version is None: raise OpenPypeVersionNotFound(( - "Requested OpenPype version \"{}\" defined by settings" + "Requested OpenPype version " + f"\"{studio_version}\" defined by settings" " was not found." 
- ).format(studio_version)) + )) else: # Default behavior to use latest version - _print("Finding latest version") + _print(( + ">>> Finding latest version compatible " + f"with [ {installed_version} ]")) openpype_version = bootstrap.find_latest_openpype_version( - use_staging + use_staging, compatible_with=installed_version ) if openpype_version is None: if use_staging: @@ -798,7 +812,7 @@ def _bootstrap_from_code(use_version, use_staging): if getattr(sys, 'frozen', False): local_version = bootstrap.get_version(Path(_openpype_root)) - switch_str = f" - will switch to {use_version}" if use_version else "" + switch_str = f" - will switch to {use_version}" if use_version and use_version != local_version else "" # noqa _print(f" - booting version: {local_version}{switch_str}") assert local_version else: @@ -813,11 +827,8 @@ def _bootstrap_from_code(use_version, use_staging): use_version, use_staging ) if version_to_use is None: - raise OpenPypeVersionNotFound( - "Requested version \"{}\" was not found.".format( - use_version - ) - ) + raise OpenPypeVersionIncompatible( + f"Requested version \"{use_version}\" was not found.") else: # Staging version should be used version_to_use = bootstrap.find_latest_openpype_version( @@ -903,7 +914,7 @@ def _boot_validate_versions(use_version, local_version): use_version, openpype_versions ) valid, message = bootstrap.validate_openpype_version(version_path) - _print("{}{}".format(">>> " if valid else "!!! ", message)) + _print(f'{">>> " if valid else "!!! "}{message}') def _boot_print_versions(use_staging, local_version, openpype_root): @@ -914,13 +925,24 @@ def _boot_print_versions(use_staging, local_version, openpype_root): _print("--- This will list only staging versions detected.") _print(" To see other version, omit --use-staging argument.") - openpype_versions = bootstrap.find_openpype(include_zips=True, - staging=use_staging) if getattr(sys, 'frozen', False): local_version = bootstrap.get_version(Path(openpype_root)) else: local_version = OpenPypeVersion.get_installed_version_str() + compatible_with = OpenPypeVersion(version=local_version) + if "--all" in sys.argv: + compatible_with = None + _print("--- Showing all version (even those not compatible).") + else: + _print(("--- Showing only compatible versions " + f"with [ {compatible_with.major}.{compatible_with.minor} ]")) + + openpype_versions = bootstrap.find_openpype( + include_zips=True, + staging=use_staging, + compatible_with=compatible_with) + list_versions(openpype_versions, local_version) @@ -937,6 +959,9 @@ def _boot_handle_missing_version(local_version, use_staging, message): def boot(): """Bootstrap OpenPype.""" + global silent_mode + if any(arg in silent_commands for arg in sys.argv): + silent_mode = True # ------------------------------------------------------------------------ # Set environment to OpenPype root path @@ -1040,7 +1065,7 @@ def boot(): if not result[0]: _print(f"!!! Invalid version: {result[1]}") sys.exit(1) - _print(f"--- version is valid") + _print("--- version is valid") else: try: version_path = _bootstrap_from_code(use_version, use_staging) @@ -1157,8 +1182,7 @@ def get_info(use_staging=None) -> list: formatted = [] for info in inf: padding = (maximum - len(info[0])) + 1 - formatted.append( - "... {}:{}[ {} ]".format(info[0], " " * padding, info[1])) + formatted.append(f'... 
{info[0]}:{" " * padding}[ {info[1]} ]') return formatted diff --git a/tools/create_zip.py b/tools/create_zip.py index 2fc351469a8..6392428f582 100644 --- a/tools/create_zip.py +++ b/tools/create_zip.py @@ -61,7 +61,7 @@ def _print(msg: str, message_type: int = 0) -> None: else: header = term.darkolivegreen3("--- ") - print("{}{}".format(header, msg)) + print(f"{header}{msg}") if __name__ == "__main__": diff --git a/website/src/css/custom.css b/website/src/css/custom.css index e8dd86256b9..58c9305bc7d 100644 --- a/website/src/css/custom.css +++ b/website/src/css/custom.css @@ -196,12 +196,12 @@ html[data-theme='dark'] .header-github-link::before { padding: 20px } -.showcase .client { +.showcase .studio { display: flex; justify-content: space-between; } -.showcase .client img { +.showcase .studio img { max-height: 110px; padding: 20px; max-width: 160px; diff --git a/website/src/pages/index.js b/website/src/pages/index.js index 08867060155..52302ec2854 100644 --- a/website/src/pages/index.js +++ b/website/src/pages/index.js @@ -65,13 +65,17 @@ const collab = [ image: '/img/clothcat.png', infoLink: 'https://www.clothcatanimation.com/' }, { - title: 'Ellipse Studio', - image: '/img/ellipse-studio.png', - infoLink: 'http://www.dargaudmedia.com' + title: 'Ellipse Animation', + image: '/img/ellipse_animation.svg', + infoLink: 'http://www.ellipseanimation.com' }, { title: 'J Cube Inc', image: '/img/jcube_logo_bw.png', infoLink: 'https://j-cube.jp' + }, { + title: 'Normaal Animation', + image: '/img/logo_normaal.png', + infoLink: 'https://j-cube.jp' } ]; @@ -153,7 +157,32 @@ const studios = [ title: "IGG Canada", image: "/img/igg-logo.png", infoLink: "https://www.igg.com/", - } + }, + { + title: "Agora Studio", + image: "/img/agora_studio.png", + infoLink: "https://agora.studio/", + }, + { + title: "Lucan Visuals", + image: "/img/lucan_Logo_On_White-HR.png", + infoLink: "https://www.lucan.tv/", + }, + { + title: "No Ghost", + image: "/img/noghost.png", + infoLink: "https://www.noghost.co.uk/", + }, + { + title: "Static VFX", + image: "/img/staticvfx.png", + infoLink: "http://www.staticvfx.com/", + }, + { + title: "Method n Madness", + image: "/img/methodmadness.png", + infoLink: "https://www.methodnmadness.com/", +} ]; function Service({imageUrl, title, description}) { @@ -166,10 +195,10 @@ function Service({imageUrl, title, description}) { ); } -function Client({title, image, infoLink}) { +function Studio({title, image, infoLink}) { const imgUrl = useBaseUrl(image); return ( - + ); @@ -465,7 +494,7 @@ function Home() {

Studios using openPype

{studios.map((props, idx) => ( - + ))}
diff --git a/website/static/img/NoGhost_Logo_black.svg b/website/static/img/NoGhost_Logo_black.svg new file mode 100644 index 00000000000..b499b1621f4 --- /dev/null +++ b/website/static/img/NoGhost_Logo_black.svg @@ -0,0 +1,31 @@ + + + + + + + + + + + + + diff --git a/website/static/img/agora_studio.png b/website/static/img/agora_studio.png new file mode 100644 index 00000000000..48b07b87750 Binary files /dev/null and b/website/static/img/agora_studio.png differ diff --git a/website/static/img/ellipse_animation.svg b/website/static/img/ellipse_animation.svg new file mode 100644 index 00000000000..c1caaa6726e --- /dev/null +++ b/website/static/img/ellipse_animation.svg @@ -0,0 +1,9 @@ + + + + + + + + + diff --git a/website/static/img/igg-logo.png b/website/static/img/igg-logo.png index 3c7f7718f79..9fc7a7f84f1 100644 Binary files a/website/static/img/igg-logo.png and b/website/static/img/igg-logo.png differ diff --git a/website/static/img/logo_normaal.png b/website/static/img/logo_normaal.png new file mode 100644 index 00000000000..711847c9f2f Binary files /dev/null and b/website/static/img/logo_normaal.png differ diff --git a/website/static/img/lucan_Logo_On_White-HR.png b/website/static/img/lucan_Logo_On_White-HR.png new file mode 100644 index 00000000000..c86030e1e78 Binary files /dev/null and b/website/static/img/lucan_Logo_On_White-HR.png differ diff --git a/website/static/img/methodmadness.png b/website/static/img/methodmadness.png new file mode 100644 index 00000000000..9dd0681d4a0 Binary files /dev/null and b/website/static/img/methodmadness.png differ diff --git a/website/static/img/noghost.png b/website/static/img/noghost.png new file mode 100644 index 00000000000..febaedcae82 Binary files /dev/null and b/website/static/img/noghost.png differ diff --git a/website/static/img/staticvfx.png b/website/static/img/staticvfx.png new file mode 100644 index 00000000000..41efd7f1200 Binary files /dev/null and b/website/static/img/staticvfx.png differ
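For reference, the compatible-version selection that the updated Deadline plugin performs in `RenderExecutable()` reduces to a major.minor match over version strings parsed from each installation's `openpype/version.py`, followed by picking the newest match. A minimal standalone sketch of that rule (the function name and sample paths here are illustrative, not part of the patch):

```python
# Pick the newest build whose major.minor matches the requested version.
# `available` mirrors the plugin's openpype_versions list:
# (version_string, install_dir) pairs.
def pick_compatible_build(requested, available):
    requested_major, requested_minor = requested.split(".")[:2]
    compatible = [
        (version, path) for version, path in available
        if version.split(".")[:2] == [requested_major, requested_minor]
    ]
    if not compatible:
        return None
    # Like the plugin, sort lexicographically on the version string; note
    # this mis-orders multi-digit patch numbers (e.g. "3.12.10" < "3.12.2").
    compatible.sort(key=lambda item: item[0])
    return compatible[-1]


print(pick_compatible_build(
    "3.12.2",
    [("3.11.4", "/opt/openpype/3.11.4"),
     ("3.12.0", "/opt/openpype/3.12.0"),
     ("3.12.1", "/opt/openpype/3.12.1")]
))
# -> ('3.12.1', '/opt/openpype/3.12.1')
```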