diff --git a/README.md b/README.md index ebf4a0f..04200c5 100644 --- a/README.md +++ b/README.md @@ -208,7 +208,7 @@ Follow the same instructions above. You shouldn't have to install mingw or modif #### Legacy users -Warning: pyEPR organization was significnatly improved in v0.8-dev (starting 2020; current branch: master \[to be made stable soon\]). If you used a previous version, you will find that all key classes have been renamed. Please, see the tutorials and docs. In the meantime, if you cannot switch yet, revert to use the stable v0.7. +Warning: pyEPR organization was significantly improved in v0.8-dev (starting 2020; current branch: master \[to be made stable soon\]). If you used a previous version, you will find that all key classes have been renamed. Please, see the tutorials and docs. In the meantime, if you cannot switch yet, revert to use the stable v0.7. # HFSS Project Setup for `pyEPR` @@ -276,7 +276,7 @@ compiler = mingw32 [build_ext] compiler = mingw32 ``` -Next, let's install qutip. You can choose to use conda intall or pip install, or pull from the git directly as done here: +Next, let's install qutip. You can choose to use conda install or pip install, or pull from the git directly as done here: ```sh conda install git pip install git+https://github.com/qutip/qutip.git diff --git a/TODO.md b/TODO.md index 205c6df..eb0db8b 100644 --- a/TODO.md +++ b/TODO.md @@ -2,7 +2,7 @@ * ./pyEPR/ansys.py * LINE 46: : Replace `win32com` with Linux compatible package. * LINE 795: : check if variable does not exist and quit if it doesn't? - * LINE 1857: : make mesh tis own class with preperties + * LINE 1857: : make mesh its own class with properties * LINE 1980: : create Wirebond class * LINE 2012: : Add option to modify these * LINE 2376: : Add a rotated rectangle object. @@ -14,7 +14,7 @@ * ./pyEPR/ansys.py * LINE 46: : Replace `win32com` with Linux compatible package. * LINE 795: : check if variable does not exist and quit if it doesn't? - * LINE 1857: : make mesh tis own class with preperties + * LINE 1857: : make mesh its own class with properties * LINE 1980: : create Wirebond class * LINE 2012: : Add option to modify these * LINE 2376: : Add a rotated rectangle object. @@ -25,25 +25,25 @@ * ./pyEPR/core_distributed_analysis.py * LINE 149: : turn into base class shared with analysis! - * LINE 253: : replace this method with the one below, here because osme funcs use it still + * LINE 253: : replace this method with the one below, here because some funcs use it still * LINE 339: : maybe sort column and index? # todo: maybe generalize * LINE 488: : change to integer? * LINE 548: : These should be common function to the analysis and here! * LINE 849: : Update make p saved sep. and get Q for diff materials, indep. specify in pinfo * LINE 1046: : maybe load from data_file * LINE 1064: - * LINE 1139: : Move inside of loop to funciton calle self.analyze_variation + * LINE 1139: : Move inside of loop to function call self.analyze_variation * LINE 1247: : this should really be passed as argument to the functions rather than a * LINE 1340: : THis need to be changed, wont work in the future with updating result etc. * LINE 1513: : Move to class for reporter ?
* ./pyEPR/core_quantum_analysis.py * LINE 130: : remove all copies of same data - * LINE 574: : superseed by Convert.ZPF_from_EPR + * LINE 574: : supersede by Convert.ZPF_from_EPR * LINE 607: : avoide analyzing a previously analyzed variation - * LINE 741: : actually make into dataframe with mode labela and junction labels + * LINE 741: : actually make into dataframe with mode labels and junction labels * LINE 782: : ? - * LINE 825: : shouldmove these kwargs to the config + * LINE 825: : should move these kwargs to the config * ./pyEPR/project_info.py * LINE 134: : introduce modal labels diff --git a/docs/README.md b/docs/README.md index f27e7fc..2c6b60d 100644 --- a/docs/README.md +++ b/docs/README.md @@ -44,7 +44,7 @@ Notes for developers. sphinx-apidoc -f -o source/ ../pyEPR -o source/api --no-toc -M -e make html ``` -You can alos use this to update the doc tree. +You can also use this to update the doc tree. # Updating `readthedocs.org` diff --git a/docs/source/index.rst b/docs/source/index.rst index b47fce5..3f88d12 100644 --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -21,7 +21,7 @@ Powerful, automated analysis and design of quantum microwave devices easy-to-use analysis functions and automation for the design of quantum chips based on superconducting quantum circuits, both distributed and lumped. pyEPR interfaces the classical distributed microwave analysis with that of quantum structures and Hamiltonians. It is chiefly based on the `energy participation ratio `_ approach; however, it has since v0.4 extended to cover a broad range of -design approaches. pyEPR stradels the analysis from Maxwell's to Schrodinger's equations, and converts the solutions of distributed microwve (typically eignmode simulations) +design approaches. pyEPR straddles the analysis from Maxwell's to Schrodinger's equations, and converts the solutions of distributed microwave (typically eigenmode simulations) to a fully diagonalized spectrum of the energy levels, couplings, and key parameters of a many-body quantum Hamiltonian. pyEPR contains both analytic and numeric solutions. diff --git a/docs/source/installation.rst b/docs/source/installation.rst index 491bf78..178bf86 100644 --- a/docs/source/installation.rst +++ b/docs/source/installation.rst @@ -42,7 +42,7 @@ Installing locally via pip In the future, ``pyEPR`` can be installed using the Python package manager `pip `_. -However, for the moment, we recommend a local develper instalation, which allows for fast upgrades. We are still in active development. +However, for the moment, we recommend a local developer installation, which allows for fast upgrades. We are still in active development. Perform the steps in the :ref:`install-main` section. What you could do, once you have the local clone git, is to install pyEPR locally. Navigate to the local root folder of the repo. diff --git a/docs/source/key_classes_reference.rst b/docs/source/key_classes_reference.rst index cafefd0..925033e 100644 --- a/docs/source/key_classes_reference.rst +++ b/docs/source/key_classes_reference.rst @@ -1,12 +1,12 @@ Main classes ================================= -The first main class of pyEPR is :ref:`project-info`, which instansiates and stores the Ansys interfaces classes and user-defined parameters related to the design, such as junction names and properties. +The first main class of pyEPR is :ref:`project-info`, which instantiates and stores the Ansys interface classes and user-defined parameters related to the design, such as junction names and properties.
-The second main class of pyEPR is :ref:`distributed-analysis`, which performs the EPR analysis on the ansys eigenfield solutions from the fields. It saves the calculated energy participation ratios (EPRs) and realted convergences, and other paramete results. It does not calculate the Hamiltonian. +The second main class of pyEPR is :ref:`distributed-analysis`, which performs the EPR analysis on the ansys eigenfield solutions from the fields. It saves the calculated energy participation ratios (EPRs) and related convergences, and other parameter results. It does not calculate the Hamiltonian. This is left for the third class. -The third main class of pyEPR is :ref:`quantum-analysis`, which uses the EPRs and other save quantities to create and diagonalizae the Hamiltonian. +The third main class of pyEPR is :ref:`quantum-analysis`, which uses the EPRs and other saved quantities to create and diagonalize the Hamiltonian. .. _project-info: diff --git a/pyEPR/README.md b/pyEPR/README.md index 1256f25..e29dd21 100644 --- a/pyEPR/README.md +++ b/pyEPR/README.md @@ -12,8 +12,8 @@ A user should not edit `_config_default.py` directly. A user should overwrite va Contains the core analysis and run functions. ##### toolbox -Module that contains key and utility modues used in pyEPR. -- plotting: useful in visualizaiton and analysis. +Module that contains key and utility modules used in pyEPR. +- plotting: useful in visualization and analysis. - pythonic: useful pythonic functions - report: used to plot reports @@ -30,7 +30,7 @@ Contributed by Phil Rheinhold. Originally part of [pyHFSS](https://github.com/Ph Updated and modified by Zlatko Minev & Zaki Leghtas. ##### numeic_diag.py -Internal use only. For numerical diagonalizaiton. +Internal use only. For numerical diagonalization. Written by Phil Rheinhold. Updated by Zlatko Minev & Lysander Christakis. This file is tricky, use caution to modify. \ No newline at end of file diff --git a/pyEPR/__config_user_old.py b/pyEPR/__config_user_old.py index 76c683d..1e90320 100644 --- a/pyEPR/__config_user_old.py +++ b/pyEPR/__config_user_old.py @@ -36,7 +36,7 @@ th=3e-9, # Surface dielectric (dirt) constant - # units: relative permitivity + # units: relative permittivity eps_r=10, # Surface dielectric (dirt) loss tangent @@ -58,7 +58,7 @@ ansys=Dict( # method_calc_P_mj sets the method used to calculate the participation ratio in eigenmode. - # Valud values: + # Valid values: # 'line_voltage' : Uses the line voltage integral # 'J_surf_mag' : takes the avg. Jsurf over the rect. Make sure you have seeded # lots of tets here. I recommend starting with 4 across smallest dimension. @@ -71,7 +71,7 @@ ), plotting=Dict( - # Default color map for plottng. Better if made into a string name + # Default color map for plotting. Better if made into a string name # taken from matplotlib.cm default_color_map='viridis', # pylint: disable=no-member ), diff --git a/pyEPR/__init__.py b/pyEPR/__init__.py index 091cac2..1c76675 100644 --- a/pyEPR/__init__.py +++ b/pyEPR/__init__.py @@ -147,7 +147,7 @@ logger.warning( """IMPORT WARNING: Python package 'pythoncom' could not be loaded - It is used in communicting with HFSS on PCs. If you wish to do this, please set it up. + It is used in communicating with HFSS on PCs. If you wish to do this, please set it up. For Linux, check the HFSS python linux files for the com module used. It is equivalent, and can be used just as well.
%s""", config.internal.error_msg_missing_import) @@ -168,7 +168,7 @@ except (ImportError, ModuleNotFoundError): logger.error( """IMPORT ERROR: - Python package 'pint' could not be loaded. It is used in communicting with HFSS. Try: + Python package 'pint' could not be loaded. It is used in communicating with HFSS. Try: $ conda install -c conda-forge pint \n%s""", config.internal.error_msg_missing_import) @@ -185,7 +185,7 @@ from .ansys import parse_units, parse_units_user, parse_entry from .core import ProjectInfo, DistributedAnalysis, QuantumAnalysis,\ - Project_Info, pyEPR_HFSSAnalysis, pyEPR_Analysis # names to be depricated + Project_Info, pyEPR_HFSSAnalysis, pyEPR_Analysis # names to be deprecated __all__ = [ 'logger', @@ -199,7 +199,7 @@ 'QuantumAnalysis', 'Project_Info', 'pyEPR_HFSSAnalysis', - 'pyEPR_Analysis', # names to be depricated + 'pyEPR_Analysis', # names to be deprecated 'parse_units', 'parse_units_user', 'parse_entry' diff --git a/pyEPR/_config_default.py b/pyEPR/_config_default.py index a879ee2..9438672 100644 --- a/pyEPR/_config_default.py +++ b/pyEPR/_config_default.py @@ -26,7 +26,7 @@ ansys=Dict( # method_calc_P_mj sets the method used to calculate the participation ratio in eigenmode. - # Valud values: + # Valid values: # 'line_voltage' : Uses the line voltage integral # 'J_surf_mag' : takes the avg. Jsurf over the rect. Make sure you have seeded # lots of tets here. I recommend starting with 4 across smallest dimension. @@ -42,14 +42,14 @@ epr = Dict( - # Define the participation renomalizaiton method + # Define the participation renormalization method # False : no extra renormalization to enforce # can be more problematic for large pj, when sim isn't well converged # True or 1 : use enforcement of U_J_total to be U_mode-U_H # can be more problematic for small pj, when sim isn't well converged # 2 : use enforcement of U_J_total to be U_mode-U_H (i.e., 1) - # only when the total particiaption is above a certain threshold - # preffered method. + # only when the total participation is above a certain threshold + # preferred method. renorm_pj = 2, ), @@ -72,7 +72,7 @@ th=3e-9, # Surface dielectric (dirt) constant - # units: relative permitivity + # units: relative permittivity eps_r=10, # Surface dielectric (dirt) loss tangent @@ -93,7 +93,7 @@ ), plotting=Dict( - # Default color map for plottng. Better if made into a string name + # Default color map for plotting. Better if made into a string name # taken from matplotlib.cm default_color_map='viridis', # pylint: disable=no-member ), @@ -147,7 +147,7 @@ def update_recursive(d:collections.abc.Mapping, u:collections.abc.Mapping): Arguments: d {collections.abc.Mapping} -- dict to overwrite - u {collections.abc.Mapping} -- dcit used to update + u {collections.abc.Mapping} -- dict used to update Returns: same as d; Updated d @@ -162,11 +162,11 @@ def update_recursive(d:collections.abc.Mapping, u:collections.abc.Mapping): def get_config(): """Returns the config pointer. - If the config is not yet loaded, it will load the defualt config and then + If the config is not yet loaded, it will load the default config and then update it with the _config_user.config dictionary. Else, it will just return the pointer to the above-updated config, which the - user could have modified. The modificaitons will be kept. + user could have modified. The modifications will be kept. 
Returns: Dict : the config dictionary diff --git a/pyEPR/_config_user.py b/pyEPR/_config_user.py index c1334ce..0ed2961 100644 --- a/pyEPR/_config_user.py +++ b/pyEPR/_config_user.py @@ -42,7 +42,7 @@ th=3e-9, # Surface dielectric (dirt) constant - # units: relative permitivity + # units: relative permittivity eps_r=10, # Surface dielectric (dirt) loss tangent @@ -64,7 +64,7 @@ ansys=Dict( # method_calc_P_mj sets the method used to calculate the participation ratio in eigenmode. - # Valud values: + # Valid values: # 'line_voltage' : Uses the line voltage integral # 'J_surf_mag' : takes the avg. Jsurf over the rect. Make sure you have seeded # lots of tets here. I recommend starting with 4 across smallest dimension. @@ -77,7 +77,7 @@ ), plotting=Dict( - # Default color map for plottng. Better if made into a string name + # Default color map for plotting. Better if made into a string name # taken from matplotlib.cm default_color_map='viridis', # pylint: disable=no-member ), diff --git a/pyEPR/ansys.py b/pyEPR/ansys.py index 595c960..bd25d63 100644 --- a/pyEPR/ansys.py +++ b/pyEPR/ansys.py @@ -140,14 +140,14 @@ def parse_entry(entry, convert_to_unit=LENGTH_UNIT): def fix_units(x, unit_assumed=None): ''' Convert all numbers to string and append the assumed units if needed. - For an itterable, returns a list + For an iterable, returns a list ''' unit_assumed = LENGTH_UNIT_ASSUMED if unit_assumed is None else unit_assumed if isinstance(x, str): # Check if there are already units defined, assume of form 2.46mm or 2.0 or 4. if x[-1].isdigit() or x[-1] == '.': # number return x + unit_assumed - else: # units are already appleid + else: # units are already applied return x elif isinstance(x, Number): @@ -187,7 +187,7 @@ def unparse_units(x): def parse_units_user(x): ''' - Convert from user assuemd units to user assumed units + Convert from user assumed units to user assumed units [USER UNITS] ----> [USER UNITS] ''' return parse_entry(fix_units(x, LENGTH_UNIT_ASSUMED), LENGTH_UNIT_ASSUMED) @@ -349,7 +349,7 @@ def set_property(prop_holder, value, prop_args=None): ''' - More general non obj oriented, functionatl verison + More general non obj oriented, functional version prop_args = [] by default ''' if not isinstance(prop_server, list): @@ -625,11 +625,11 @@ def __init__(self, project, design): self._ansys_version = self.parent._ansys_version try: - # This funciton does not exist if the desing is not HFSS + # This function does not exist if the design is not HFSS self.solution_type = design.GetSolutionType() except Exception as e: logger.debug( - f'Exception occured at design.GetSolutionType() {e}. Assuming Q3D design' + f'Exception occurred at design.GetSolutionType() {e}. Assuming Q3D design' ) self.solution_type = 'Q3D' @@ -662,7 +662,7 @@ def add_message(self, message: str, severity: int = 0): def save_screenshot(self, path: str = None, show: bool = True): if not path: - path = Path().absolute() / 'ansys.png' # TODOL find better + path = Path().absolute() / 'ansys.png' # TODO find better self._modeler.ExportModelImageToFile( str(path), 0, @@ -872,10 +872,10 @@ def _variation_string_to_variable_list(self, def set_variables(self, variation_string: str): """ - Set all variables to match a solved variaiton string. + Set all variables to match a solved variation string. 
Args: - variation_string (str) : Variaiton string such as + variation_string (str) : Variation string such as "Cj='2fF' Lj='13.5nH'" """ assert isinstance(variation_string, str) @@ -913,7 +913,7 @@ def set_variable(self, name: str, value: str, postprocessing=False): value {str} -- Value, such as '10nH' Keyword Arguments: - postprocessing {bool} -- Postprocessingh variable only or not. + postprocessing {bool} -- Postprocessing variable only or not. (default: {False}) Returns: @@ -1034,7 +1034,7 @@ def analyze(self, name=None): Return Value: None ----------------------------------------------------- - Will block the until the analysis is completly done. + Will block until the analysis is completely done. Will raise a com_error if analysis is aborted in HFSS. ''' if name is None: @@ -1093,7 +1093,7 @@ def insert_sweep(self, "ExtrapToDC:=", False, ] - # not sure hwen extacyl this changed between 2016 and 2019 + # not sure when exactly this changed between 2016 and 2019 if self._ansys_version >= '2019': if count: params.extend([ @@ -1274,7 +1274,7 @@ def get_profile(self, variation=""): skipfooter=1, skip_blank_lines=True, engine='python') - # just borken down by new lines + # just broken down by new lines return df def get_fields(self): @@ -1353,7 +1353,7 @@ class AnsysQ3DSetup(HfssSetup): min_pass = make_int_prop("Min. Number of Passes") pct_error = make_int_prop("Percent Error") frequency = make_str_prop("Adaptive Freq", 'General') # e.g., '5GHz' - n_modes = 0 # for compatability with eigenmode + n_modes = 0 # for compatibility with eigenmode def get_frequency_Hz(self): return int(ureg(self.frequency).to('Hz').magnitude) @@ -1376,7 +1376,7 @@ def get_matrix( pass_number=0, frequency=None, MatrixType='Maxwell', - solution_kind='LastAdaptive', # AdpativePass + solution_kind='LastAdaptive', # AdaptivePass ACPlusDCResistance=False, soln_type="C"): ''' @@ -1464,7 +1464,7 @@ def _readin_Q3D_matrix(path: str): text = Path(path).read_text() s1 = text.split('Capacitance Matrix') - assert len(s1) == 2, "Copuld not split text to `Capacitance Matrix`" + assert len(s1) == 2, "Could not split text to `Capacitance Matrix`" s2 = s1[1].split('Conductance Matrix') @@ -1484,7 +1484,7 @@ def _readin_Q3D_matrix(path: str): df_cond = None var = re.findall(r'DesignVariation:(.*?)\n', - text) # this changed circe v2020 + text) # this changed circa v2020 if len(var) < 1: # didnt find var = re.findall(r'Design Variation:(.*?)\n', text) if len(var) < 1: # didnt find @@ -1499,7 +1499,7 @@ def _readin_Q3D_matrix(path: str): @staticmethod def load_q3d_matrix(path, user_units='fF'): - """Load Q3D capcitance file exported as Maxwell matrix. + """Load Q3D capacitance file exported as Maxwell matrix. Exports also conductance conductance. Units are read in automatically and converted to user units. @@ -1595,7 +1595,7 @@ def eigenmodes(self, lv=""): """ Export eigenmodes vs pass number - Did not figre out how to set pass number in a hurry. + Did not figure out how to set pass number in a hurry. import tempfile @@ -1621,7 +1621,7 @@ def eigenmodes(self, lv=""): soln_name = f'{setup.name} : AdaptivePas' available_solns = self._solutions.GetValidISolutionList() if not(soln_name in available_solns): - logger.error(f'ERROR Tried to export freq vs pass number, but solution `{soln_name}` was not in avaialbe `{available_solns}`. Returning []') + logger.error(f'ERROR Tried to export freq vs pass number, but solution `{soln_name}` was not in available `{available_solns}`.
Returning []') #return [] self._solutions.ExportEigenmodes(soln_name, ['Pass:=5'], fn) # ['Pass:=5'] fails can do with '' """ @@ -1635,7 +1635,7 @@ def set_mode(self, n, phase=0, FieldType='EigenStoredEnergy'): Amplitude is set to 1 - No error is thorwn if a number exceeding number of modes is set + No error is thrown if a number exceeding number of modes is set FieldType -- EigenStoredEnergy or EigenPeakElecticField ''' @@ -1681,7 +1681,7 @@ def has_fields(self, variation_string=None): Determine if fields exist for a particular solution. variation_string : str | None - This must the string that describes the variaiton in hFSS, not 0 or 1, but + This must be the string that describes the variation in HFSS, not 0 or 1, but the string of variables, such as "Cj='2fF' Lj='12.75nH'" If None, gets the nominal variation @@ -1704,7 +1704,7 @@ def create_report(self, Example ------------------------------------------------------ - Exammple plot for a single vareiation all pass converge of mode freq + Example plot for a single variation all-pass convergence of mode freq .. code-block python ycomp = [f"re(Mode({i}))" for i in range(1,1+epr_hfss.n_modes)] params = ["Pass:=", ["All"]]+variation @@ -2035,7 +2035,7 @@ def mesh_get_names(self, kind="Length Based"): return list(self._mesh.GetOperationNames(kind)) def mesh_get_all_props(self, mesh_name): - # TODO: make mesh tis own class with preperties + # TODO: make mesh its own class with properties prop_tab = 'MeshSetupTab' prop_server = f'MeshSetup:{mesh_name}' prop_names = self.parent._design.GetProperties('MeshSetupTab', @@ -2185,12 +2185,12 @@ def draw_wirebond(self, **kwargs): ''' Args: - pos: 2D positon vector (specify center point) + pos: 2D position vector (specify center point) ori: should be normed - z: z postion + z: z position # TODO create Wirebond class - psoition is the origin of one point + position is the origin of one point ori is the orientation vector, which gets normalized ''' p = np.array(pos) @@ -2261,7 +2261,7 @@ def get_boundary_assignment(self, boundary_name: str): def append_PerfE_assignment(self, boundary_name: str, object_names: list): ''' This will create a new boundary if need, and will - otherwise append given names to an exisiting boundary + otherwise append given names to an existing boundary ''' # enforce boundary_name = str(boundary_name) @@ -2286,7 +2286,7 @@ def append_mesh(self, mesh_name: str, object_names: list, old_objs: list, **kwargs): ''' This will create a new boundary if need, and will - otherwise append given names to an exisiting boundary + otherwise append given names to an existing boundary old_obj = circ._mesh_assign ''' mesh_name = str(mesh_name) @@ -2326,7 +2326,7 @@ def _make_lumped_rlc(self, r, l, c, start, end, obj_arr, name="LumpRLC"): params += obj_arr params.append([ "NAME:CurrentLine", - # for some reason here it seems to swtich to use the model units, rather than meters + # for some reason here it seems to switch to use the model units, rather than meters "Start:=", fix_units(start, unit_assumed=LENGTH_UNIT), "End:=", @@ -2422,7 +2422,7 @@ def create_relative_coorinate_system_both(self, Modeler>Coordinate System>Create>Relative CS->Rotated Modeler>Coordinate System>Create>Relative CS->Both - Current cooridnate system is set right after this. + Current coordinate system is set right after this. cs_name : name of coord. sys If the name already exists, then a new coordinate system with _1 is created.
@@ -2708,9 +2708,9 @@ def rename(self, new_name): ''' new_name = increment_name( new_name, self.modeler.get_objects_in_group( - "Sheets")) # this is for a clsoed polyline + "Sheets")) # this is for a closed polyline - # check to get the actual new name in case there was a suibtracted ibjet with that namae + # check to get the actual new name in case there was a subtracted object with that name face_ids = self.modeler.get_face_ids(str(self)) self.modeler.rename_obj(self, new_name) # now rename if len(face_ids) > 0: @@ -2750,7 +2750,7 @@ def fillet(self, radius, vertex_index): def fillets(self, radius, do_not_fillet=[]): ''' - do_not_fillet : Index list of verteces to not fillete + do_not_fillet : Index list of vertices to not fillet ''' raw_list_vertices = self.modeler.get_vertex_ids(self) list_vertices = [] @@ -2816,7 +2816,7 @@ def clear_named_expressions(self): def declare_named_expression(self, name): """" - If a named epression has been created in the fields calculator, this + If a named expression has been created in the fields calculator, this function can be called to initialize the name to work with the fields object """ self.named_expression[name] = NamedCalcObject(name, self.setup) @@ -3066,7 +3066,7 @@ def get_active_project(): except AttributeError: is_admin = ctypes.windll.shell32.IsUserAnAdmin() != 0 if not is_admin: - print('\033[93m WARNING: you are not runnning as an admin! \ + print('\033[93m WARNING: you are not running as an admin! \ You need to run as an admin. You will probably get an error next.\ \033[0m') @@ -3103,7 +3103,7 @@ def load_ansys_project(proj_name: str, # Checks assert project_path.is_dir( ), "ERROR! project_path is not a valid directory \N{loudly crying face}.\ - Check the path, and especially \\ charecters." + Check the path, and especially \\ characters." project_path /= project_path / Path(proj_name + extension) diff --git a/pyEPR/calcs/back_box_numeric.py b/pyEPR/calcs/back_box_numeric.py index 57e904f..9371819 100644 --- a/pyEPR/calcs/back_box_numeric.py +++ b/pyEPR/calcs/back_box_numeric.py @@ -46,11 +46,11 @@ def epr_numerical_diagonalization(freqs, Ljs, ϕzpf, return_H=False, non_linear_potential=None): ''' - Numerical diagonalizaiton for pyEPR. Ask Zlatko for details. + Numerical diagonalization for pyEPR. Ask Zlatko for details. :param fs: (GHz, not radians) Linearized model, H_lin, normal mode frequencies in Hz, length M - :param ljs: (Henries) junction linerized inductances in Henries, length J - :param fzpfs: (reduced) Reduced Zero-point fluctutation of the junction fluxes for each mode + :param ljs: (Henries) junction linearized inductances in Henries, length J + :param fzpfs: (reduced) Reduced Zero-point fluctuation of the junction fluxes for each mode across each junction, shape MxJ :return: Hamiltonian mode freq and dispersive shifts. Shifts are in MHz. @@ -79,8 +79,8 @@ def black_box_hamiltonian(fs, ljs, fzpfs, cos_trunc=5, fock_trunc=8, individual= non_linear_potential = None): r""" :param fs: Linearized model, H_lin, normal mode frequencies in Hz, length N - :param ljs: junction linerized inductances in Henries, length M - :param fzpfs: Zero-point fluctutation of the junction fluxes for each mode across each junction, + :param ljs: junction linearized inductances in Henries, length M + :param fzpfs: Zero-point fluctuation of the junction fluxes for each mode across each junction, shape MxJ :return: Hamiltonian in units of Hz (i.e H / h) All in SI units. The ZPF fed in are the generalized, not reduced, flux.
@@ -140,7 +140,7 @@ def make_dispersive(H, fock_trunc, fzpfs=None, f0s=None, chi_prime=False, use_1st_order=False): r""" Input: Hamiltonian Matrix. - Optional: phi_zpfs and normal mode frequncies, f0s. + Optional: phi_zpfs and normal mode frequencies, f0s. use_1st_order : deprecated Output: Return dressed mode frequencies, chis, chi prime, phi_zpf flux (not reduced), and linear frequencies @@ -275,7 +275,7 @@ def black_box_hamiltonian_nq(freqs, zmat, ljs, cos_trunc=6, fock_trunc=8, show_f slopes = np.zeros((nj, nz)) import matplotlib.pyplot as plt # Fit a second order polynomial in the region around the zero - # Extract the exact location of the zero and the assocated slope + # Extract the exact location of the zero and the associated slope # If you need better than second order fit, you're not sampling finely enough for i, z in enumerate(zeros): f0_guess = (freqs[z+1] + freqs[z]) / 2 diff --git a/pyEPR/calcs/basic.py b/pyEPR/calcs/basic.py index 2d35a59..2aac35b 100644 --- a/pyEPR/calcs/basic.py +++ b/pyEPR/calcs/basic.py @@ -13,7 +13,7 @@ def epr_to_zpf(Pmj, SJ, Ω, EJ): r''' INPUTS: All as matrices (numpy arrays) - :Pnj: MxJ energy-participatuion-ratio matrix, p_mj + :Pnj: MxJ energy-participation-ratio matrix, p_mj :SJ: MxJ sign matrix, s_mj :Ω: MxM diagonal matrix of frequencies (GHz, not radians, diagonal) :EJ: JxJ diagonal matrix matrix of Josephson energies (in same units as Om) @@ -32,7 +32,7 @@ def epr_to_zpf(Pmj, SJ, Ω, EJ): {Pmj}""") # Technically, there the equation is hbar omega / 2J, but here we assume - # that the hbar is absrobed in the units of omega, and omega and Ej have the same units. + # that the hbar is absorbed in the units of omega, and omega and Ej have the same units. # PHI=np.zeros((3,3)) # for m in range(3): # for j in range(3): @@ -43,7 +43,7 @@ def epr_to_zpf(Pmj, SJ, Ω, EJ): @staticmethod def epr_cap_to_nzpf(Pmj_cap, SJ, Ω, Ec): """ - Expeirmental. To be tested + Experimental. To be tested """ (Pmj, SJ, Ω, EJ) = map(np.array, (Pmj_cap, SJ, Ω, Ec)) return SJ * sqrt(Ω @ Pmj @ np.linalg.inv(Ec) /(4*4)) diff --git a/pyEPR/calcs/constants.py b/pyEPR/calcs/constants.py index ce43085..211efb7 100644 --- a/pyEPR/calcs/constants.py +++ b/pyEPR/calcs/constants.py @@ -1,5 +1,5 @@ """ -pyEPR constants and convinience definitions. +pyEPR constants and convenience definitions. @author: Zlatko Minev """ diff --git a/pyEPR/calcs/convert.py b/pyEPR/calcs/convert.py index 676a554..c78bd5e 100644 --- a/pyEPR/calcs/convert.py +++ b/pyEPR/calcs/convert.py @@ -204,7 +204,7 @@ def ZPF_from_EPR(hfss_freqs, hfss_epr_, hfss_signs, hfss_Ljs, Returns: M x J matrix of reduced ZPF; i.e., scaled by reduced flux quantum. type: np.array - and a tuple of matricies. + and a tuple of matrices. Example use: ϕzpf, (Ωm, Ej, Pmj, Smj) = Convert.ZPF_from_EPR(hfss_freqs, hfss_epr, hfss_signs, hfss_Ljs, to_df=True) diff --git a/pyEPR/calcs/hamiltonian.py b/pyEPR/calcs/hamiltonian.py index 578c3d1..bb7b205 100644 --- a/pyEPR/calcs/hamiltonian.py +++ b/pyEPR/calcs/hamiltonian.py @@ -1,7 +1,7 @@ """ Hamiltonian and Matrix Operations. Hamiltonian operations heavily draw on qutip package. -This package must be installded for them to work. +This package must be installed for them to work. 
""" try: import qutip @@ -18,9 +18,9 @@ class MatrixOps(object): @staticmethod def cos(op_cos_arg: Qobj): """ - Make cosine opertor matrix from arguemnt op_cos_arg + Make cosine operator matrix from argument op_cos_arg - op_cos_arg (qutip.Qobj) : argumetn of the cosine + op_cos_arg (qutip.Qobj) : argument of the cosine """ return 0.5*((1j*op_cos_arg).expm() + (-1j*op_cos_arg).expm()) @@ -53,7 +53,7 @@ def fock_state_on(d: dict, fock_trunc: int, N_modes: int): @staticmethod def closest_state_to(s: Qobj, energyMHz, evecs): """ - Returns the enery of the closest state to s + Returns the energy of the closest state to s """ def distance(s2): return (s.dag() * s2[1]).norm() @@ -75,7 +75,7 @@ def identify_Fock_levels(fock_trunc: int, evecs, """ Return quantum numbers in terms of the undiagonalized eigenbasis. """ - # to do: need to turn Fock_max into arb algo on each mdoe + # to do: need to turn Fock_max into arb algo on each mode def fock_state_on(d): return HamOps.fock_state_on(d, fock_trunc, N_modes) diff --git a/pyEPR/calcs/quantum.py b/pyEPR/calcs/quantum.py index ea866f1..dc36d2b 100644 --- a/pyEPR/calcs/quantum.py +++ b/pyEPR/calcs/quantum.py @@ -1,6 +1,6 @@ """ Implementation of basic quantum operation in numpy, -to effortleslly remove the need in the `qutip` package. +to effortlessly remove the need in the `qutip` package. """ import numpy as np @@ -13,7 +13,7 @@ def create(n: int): return mat def destroy(n: int): - """Returns matrix representation of an n-dimensional annhilation operator""" + """Returns matrix representation of an n-dimensional annihilation operator""" diag = np.sqrt(np.arange(1, n)) mat = np.zeros([n, n]) np.fill_diagonal(mat[:, 1:], diag) diff --git a/pyEPR/calcs/transmon.py b/pyEPR/calcs/transmon.py index 931a534..c2be374 100644 --- a/pyEPR/calcs/transmon.py +++ b/pyEPR/calcs/transmon.py @@ -1,5 +1,5 @@ """ -Transmon caluclations +Transmon calculations """ import math @@ -62,7 +62,7 @@ def dispersiveH_params_PT_O1(Pmj, Ωm, Ej): def transmon_get_all_params(Ej_MHz, Ec_MHz): """ Linear harmonic oscillator approximation of transmon. - Convinince func + Convenience func """ Ej, Ec = Ej_MHz, Ec_MHz Lj_H, Cs_F = Convert.Lj_from_Ej( @@ -87,7 +87,7 @@ def transmon_get_all_params(Ej_MHz, Ec_MHz): def transmon_print_all_params(Lj_nH, Cs_fF): """ Linear harmonic oscillator approximation of transmon. - Convinince func + Convenience func """ # Parameters - duplicates with transmon_get_all_params Ej, Ec = Convert.Ej_from_Lj(Lj_nH, 'nH', 'MHz'), Convert.Ec_from_Cs( diff --git a/pyEPR/core.py b/pyEPR/core.py index 02b85ec..5a54e4a 100644 --- a/pyEPR/core.py +++ b/pyEPR/core.py @@ -1,9 +1,9 @@ """ Main interface module to use pyEPR. -Contains code to conenct to Ansys and to analyze HFSS files using the EPR method. +Contains code to connect to Ansys and to analyze HFSS files using the EPR method. -This module handles the micowave part of the analysis and conenction to +This module handles the microwave part of the analysis and connection to Further contains code to be able to do autogenerated reports, @@ -18,7 +18,7 @@ from .core_quantum_analysis import QuantumAnalysis from .core_distributed_analysis import DistributedAnalysis -# Backwards compatability. To be depreciated. +# Backwards compatibility. To be depreciated. 
Project_Info = ProjectInfo pyEPR_HFSSAnalysis = DistributedAnalysis pyEPR_Analysis = QuantumAnalysis diff --git a/pyEPR/core_distributed_analysis.py b/pyEPR/core_distributed_analysis.py index 7f71f01..a257dec 100644 --- a/pyEPR/core_distributed_analysis.py +++ b/pyEPR/core_distributed_analysis.py @@ -1,9 +1,9 @@ """ Main distributed analysis module to use pyEPR. -Contains code to conenct to Ansys and to analyze HFSS files using the EPR method. +Contains code to connect to Ansys and to analyze HFSS files using the EPR method. -This module handles the micowave part of the analysis and conenction to +This module handles the microwave part of the analysis and connection to Further contains code to be able to do autogenerated reports, @@ -50,12 +50,12 @@ class DistributedAnalysis(object): """ DISTRIBUTED ANALYSIS of layout and microwave results. - Main compuation class & interface with HFSS. + Main computation class & interface with HFSS. This class defines a DistributedAnalysis object which calculates and saves Hamiltonian parameters from an HFSS simulation. - Further, it allows one to calcualte dissipation, etc. + Further, it allows one to calculate dissipation, etc. """ def __init__(self, *args, **kwargs): @@ -66,7 +66,7 @@ def __init__(self, *args, **kwargs): Parameters: ------------------- project_info : ProjectInfo - Suplpy the project info or the parameters to create pinfo + Supply the project info or the parameters to create pinfo Use notes: ------------------- @@ -105,7 +105,7 @@ def __init__(self, *args, **kwargs): eprd.do_EPR_analysis(append_analysis=True); - Key internal paramters: + Key internal parameters: ------------------- n_modes (int) : Number of eignemodes; e.g., 2 variations (List[str]) : A list of string identifier of **solved** variation @@ -127,8 +127,8 @@ def __init__(self, *args, **kwargs): project_info = args[0] else: assert len(args) == 0, '''Since you did not pass a ProjectInfo object - as a arguemnt, we now assuem you are trying to create a project - info object here by apassing its arguments. See ProjectInfo. + as an argument, we now assume you are trying to create a project + info object here by passing its arguments. See ProjectInfo. It does not take any arguments, only kwargs. \N{face with medical mask}''' project_info = ProjectInfo(*args, **kwargs) @@ -144,7 +144,7 @@ def __init__(self, *args, **kwargs): self.fields = self.setup.get_fields() self.solutions = self.setup.get_solutions() - # Stores resutls from sims + # Stores results from sims self.results = Dict() # of variations. Saved results # TODO: turn into base class shared with analysis! @@ -251,11 +251,11 @@ def calc_p_junction_single(self, mode, variation, U_E=None, U_H=None): print(' p_j_' + str(mode) + ' = ' + str(pj_val)) return pj - # TODO: replace this method with the one below, here because osme funcs use it still + # TODO: replace this method with the one below, here because some funcs use it still def get_freqs_bare(self, variation: str): """ Warning: - Outdated. Do not use. To be depreicated + Outdated. Do not use. To be deprecated Args: variation (str): A string identifier of the variation, @@ -450,7 +450,7 @@ def get_nominal_variation_index(self): def get_ansys_variations(self): """ - Will update ansys inofrmation and result the list of variations. + Will update ansys information and return the list of variations.
Returns: For example: @@ -506,7 +506,7 @@ def _update_ansys_variables(self, variations=None): def get_ansys_variables(self): """ - Get ansys variables for all variaitons + Get ansys variables for all variations Returns: Return a dataframe of variables as index and columns as the variations @@ -573,7 +573,7 @@ def calc_energy_electric(self, obj_dims (int | 3) : 1 - line, 2 - surface, 3 - volume. Default volume Example: - Example use to calcualte the energy participation ratio (EPR) of a substrate + Example use to calculate the energy participation ratio (EPR) of a substrate .. code-block:: python :linenos: @@ -699,7 +699,7 @@ def calc_current(self, fields, line: str): return I def calc_avg_current_J_surf_mag(self, variation: str, junc_rect: str, junc_line): - ''' Peak current I_max for mdoe J in junction J + ''' Peak current I_max for mode J in junction J The avg. is over the surface of the junction. I.e., spatial. Args: variation (str): A string identifier of the variation, @@ -727,7 +727,7 @@ def calc_current_using_line_voltage(self, variation: str, junc_line_name: str, ''' Peak current I_max for prespecified mode calculating line voltage across junction. - Make sure that oyu have set the correct variaitonin hFSS before running this + Make sure that you have set the correct variation in HFSS before running this Parameters: ------------------------------------------------ @@ -797,7 +797,7 @@ def get_junc_len_dir(self, variation: str, junc_line): def get_Qseam(self, seam, mode, variation, U_H=None): r''' - Caculate the contribution to Q of a seam, by integrating the current in + Calculate the contribution to Q of a seam, by integrating the current in the seam with finite conductance: set in the config file ref: http://arxiv.org/pdf/1509.01119.pdf ''' @@ -944,7 +944,7 @@ def calc_p_junction(self, variation, U_H, U_E, Ljs, Cjs): For a single specific mode. Expected that you have specified the mode before calling this, `self.set_mode(num)` - Expected to precalc U_H and U_E for mode, will retunr pandas pd.Series object + Expected to precalc U_H and U_E for mode, will return pandas pd.Series object junc_rect = ['junc_rect1', 'junc_rect2'] name of junc rectangles to integrate H over junc_len = [0.0001] specify in SI units; i.e., meters LJs = [8e-09, 8e-09] SI units @@ -984,7 +984,7 @@ def calc_p_junction(self, variation, U_H, U_E, Ljs, Cjs): _I_peak_1 = self.calc_avg_current_J_surf_mag( variation, j_props['rect'], line_name) - # could also use this to back out the V_peak using the impedences as in the line + # could also use this to back out the V_peak using the impedances as in the line # below for now, keep both methods _I_peak_2, _V_peak_2, _ = self.calc_current_using_line_voltage( @@ -1127,14 +1127,14 @@ def do_EPR_analysis(self, Modes to analyze for example modes = [0, 2, 3] - append_analysis (bool) : When we run the ansys anslysis, should we redo any variations + append_analysis (bool) : When we run the ansys analysis, should we redo any variations that we have already done? Ansys Notes: ------------------------ Assumptions: Low dissipation (high-Q). - It is easier to assume no lumped capcitors to simply calculations, but we have + It is easier to assume no lumped capacitors to simplify calculations, but we have recently added Cj_variable as a new feature that is begin tested to handle capacitors. See the paper. @@ -1184,7 +1184,7 @@ def do_EPR_analysis(self, print_NoNewLine(' previously analyzed ...\n') continue - # QUESTION!
should we set the current variaiton, can this save time, set the variables + # QUESTION! should we set the current variation, can this save time, set the variables # If not, clear the results self.results[variation] = Dict() @@ -1197,9 +1197,9 @@ def do_EPR_analysis(self, continue try: - # This should allow us to load the fields only once, and then do the calcualtions - # faster. The loading of the fields does not happen here, but a tthe firc ClcEval call. - # This could fail if more varialbes are added after the simulation is compelted. + # This should allow us to load the fields only once, and then do the calculations + # faster. The loading of the fields does not happen here, but at the first ClcEval call. + # This could fail if more variables are added after the simulation is completed. self.set_variation(variation) except Exception as e: print('\tERROR: Could not set the variation string.' @@ -1267,11 +1267,11 @@ def do_EPR_analysis(self, sol = pd.Series({'U_H': self.U_H, 'U_E': self.U_E}) # Fraction - report the peak energy, properly normalized - # the 2 is from the calcualtion methods + # the 2 is from the calculation methods print(f""" {'(ℰ_E-ℰ_H)/ℰ_E':>15s} {'ℰ_E':>9s} {'ℰ_H':>9s} {100*(self.U_E - self.U_H)/self.U_E:>15.1f}% {self.U_E/2:>9.4g} {self.U_H/2:>9.4g}\n""") - # Calcualte EPR for each of the junctions + # Calculate EPR for each of the junctions print( f' Calculating junction energy participation ration (EPR)\n\tmethod=`{self.pinfo.options.method_calc_P_mj}`. First estimates:') print( @@ -1394,8 +1394,8 @@ def results_variations_on_inside(results: dict): # Conver to pandas Dataframe if all are pd.Series if all(isinstance(new_res[key][variation], pd.Series) for variation in variations): - # print(key) # Conver these to datafrme - # Variations will vecome columns + # print(key) # Convert these to dataframe + # Variations will become columns new_res[key] = pd.DataFrame(new_res[key]) new_res[key].columns.name = 'variation' # sort_df_col : maybe sort @@ -1520,15 +1520,15 @@ def set_mode(self, mode_num, phase=0): def has_fields(self, variation: str = None): ''' Determine if fields exist for a particular solution. - Just calls `self.solutions.has_fields(variaiton_string)` + Just calls `self.solutions.has_fields(variation_string)` - variation (str | None) : String of variaiton label, such as '0' or '1' + variation (str | None) : String of variation label, such as '0' or '1' If None, gets the nominal variation ''' if self.solutions: #print('variation=', variation) - variaiton_string = self.get_variation_string(variation) - return self.solutions.has_fields(variaiton_string) + variation_string = self.get_variation_string(variation) + return self.solutions.has_fields(variation_string) else: return False @@ -1599,7 +1599,7 @@ def hfss_report_full_convergence(self, fig=None, _display=True): a given variation. Makes a plot inside hfss too. Keyword Arguments: - fig {matpllitb figure} -- Optional figure (default: {None}) + fig {matplotlib figure} -- Optional figure (default: {None}) _display {bool} -- Force display or not.
(default: {True}) Returns: diff --git a/pyEPR/core_quantum_analysis.py b/pyEPR/core_quantum_analysis.py index b03780e..d4f0036 100644 --- a/pyEPR/core_quantum_analysis.py +++ b/pyEPR/core_quantum_analysis.py @@ -63,7 +63,7 @@ def __init__(self, dict_file=None, data_dir=None): upgraded to the HamiltonianResultsContainer class data_dir - the directory in which the file is to be saved or loaded - from, defults to the config.root_dir + from, defaults to the config.root_dir """ super().__init__() @@ -129,7 +129,7 @@ def _inject_dic(self, add_dic): for key, val in add_dic.items(): # TODO remove all copies of same data # if key in self.keys(): - #raise ValueError('trying to overwrite an exsiting varation') + #raise ValueError('trying to overwrite an existing variation') self[str(int(key)+Init_number_of_keys)] = val return 1 @@ -141,7 +141,7 @@ def _do_sort_index(z: pd.DataFrame): z {pd.DataFrame} -- Input Returns: - Sorted DtaaFrame + Sorted DataFrame """ if isinstance(z, pd.DataFrame): return z.sort_index(axis=1) @@ -241,7 +241,7 @@ def __init__(self, data_filename, results = DistributedAnalysis.results_variations_on_inside( self.data.results) - # Convinience functions + # Convenience functions self.variations = variations or list(self.data.results.keys()) self._hfss_variables = results['hfss_variables'] self.freqs_hfss = results['freqs_hfss_GHz'] @@ -251,7 +251,7 @@ def __init__(self, data_filename, self.Cjs = results['Cjs'] # DataFrame self.OM = results['Om'] # dict of dataframes self.PM = results['Pm'] # participation matrices - raw, unnormed here - # participation matrices for capactive elements + # participation matrices for capacitive elements self.PM_cap = results['Pm_cap'] self.SM = results['Sm'] # sign matrices self.I_peak = results['I_peak'] @@ -294,7 +294,7 @@ def print_info(self): def get_vs_variable(self, swp_var, attr: str): """ - Convert the index of a dicitoanry that is stored here from + Convert the index of a dictionary that is stored here from variation number to variable value. 
Args: @@ -308,10 +308,10 @@ def get_variable_vs(self, swpvar, lv=None): """ lv is list of variations (example ['0', '1']), if None it takes all variations - swpvar is the variable by which to orginize + swpvar is the variable by which to organize return: - ordered dicitonary of key which is the variation number and the magnitude + ordered dictionary of key which is the variation number and the magnitude of swaver as the item """ ret = OrderedDict() @@ -337,7 +337,7 @@ def get_variations_of_variable_value(self, swpvar, value, lv=None): has a specific value lv is list of variations (example ['0', '1']), if None it takes all variations swpvar is a string and the name of the variable we wish to filter - value is the value of swapvr in which we are intrested + value is the value of swpvar in which we are interested returns lv - a list of the variations for which swavr==value """ @@ -431,7 +431,7 @@ def get_Ejs(self, variation): def get_Ecs(self, variation): ''' ECs in GHz - Returns as padnas series + Returns as pandas series ''' Cs = self.Cjs[variation] return Convert.Ec_from_Cs(Cs, units_in='F', units_out='GHz') @@ -497,7 +497,7 @@ def _get_participation_normalized(self, variation, _renorm_pj=None, print_=False #s = self.sols[variation] # sum of participation energies as calculated by global UH and UE # U_mode = s['U_E'] # peak mode energy; or U bar as i denote it sometimes - # We need to add the capactiro here, and maybe take the mean of that + # We need to add the capacitor here, and maybe take the mean of that energies = self._get_ansys_total_energies(variation) @@ -525,7 +525,7 @@ def _get_participation_normalized(self, variation, _renorm_pj=None, print_=False idx_cap = Pm_cap > 0.15 else: raise NotImplementedError( - "Unkown _renorm_pj argument or config values!") + "Unknown _renorm_pj argument or config values!") if print_: # \nPm_cap_norm=\n{Pm_cap_norm}") @@ -560,10 +560,10 @@ def _get_participation_normalized(self, variation, _renorm_pj=None, print_=False def get_epr_base_matrices(self, variation, _renorm_pj=None, print_=False): r''' - Return the key matricies used in the EPR method for analytic calcualtions. + Return the key matrices used in the EPR method for analytic calculations. All as matrices - :PJ: Participatuion matrix, p_mj + :PJ: Participation matrix, p_mj :SJ: Sign matrix, s_mj :Om: Omega_mm matrix (in GHz) (\hbar = 1) Not radians.
:EJ: E_jj matrix of Josephson energies (in same units as hbar omega matrix) @@ -573,7 +573,7 @@ def get_epr_base_matrices(self, variation, _renorm_pj=None, print_=False): Return all as *np.array* PM, SIGN, Om, EJ, Phi_ZPF ''' - # TODO: superseed by Convert.ZPF_from_EPR + # TODO: supersede by Convert.ZPF_from_EPR res = self._get_participation_normalized( variation, _renorm_pj=_renorm_pj, print_=print_) @@ -730,7 +730,7 @@ def analyze_variation(self, self.print_variation(variation) self.print_result(result) - self.n_modes = tmp_n_modes # TODO is this smart should consider defining the modes of intrest in the initilazaition of the quantum object + self.n_modes = tmp_n_modes # TODO is this smart? Should consider defining the modes of interest in the initialization of the quantum object self.modes[variation]=tmp_modes return result @@ -812,7 +812,7 @@ def plotting_dic_x(self, Var_dic, var_name): dic['x_label'] = var_name dic['x'] = self.get_variable_value(var_name, lv=lv) else: - raise ValueError('more than one hfss variablae changes each time') + raise ValueError('more than one hfss variable changes each time') return lv, dic @@ -1023,11 +1023,11 @@ def get_participations(self, swp_variable='variation', _normed=True): """ - inductive (bool): EPR forjunciton inductance when True, else for capactiors + inductive (bool): EPR for junction inductance when True, else for capacitors Returns: ---------------- - Returns a multindex dataframe: + Returns a multiindex dataframe: index 0: sweep variable index 1: mode number column: junction number @@ -1177,7 +1177,7 @@ def quick_plot_mode(self, mode, junction, mode1=None, swp_variable='variation', def quick_plot_convergence(self, ax = None): """ - Plot a report of the Ansys converngece vs pass number ona twin axis + Plot a report of the Ansys convergence vs pass number on a twin axis for the number of tets and the max delta frequency of the eignemode. """ ax = ax or plt.gca() @@ -1193,7 +1193,7 @@ def quick_plot_convergence(self, ax = None): def extract_dic(name=None, file_name=None): - """#name is the name of the dictionry as saved in the npz file if it is None, + """#name is the name of the dictionary as saved in the npz file; if it is None, the function will return a list of all dictionaries in the npz file file name is the name of the npz file""" with np.load(file_name, allow_pickle=True) as f: diff --git a/pyEPR/project_info.py b/pyEPR/project_info.py index c7be9ee..f916ec0 100644 --- a/pyEPR/project_info.py +++ b/pyEPR/project_info.py @@ -1,9 +1,9 @@ """ Main interface module to use pyEPR. -Contains code to conenct to Ansys and to analyze HFSS files using the EPR method. +Contains code to connect to Ansys and to analyze HFSS files using the EPR method. -This module handles the micowave part of the analysis and conenction to +This module handles the microwave part of the analysis and connection to Further contains code to be able to do autogenerated reports, @@ -36,7 +36,7 @@ class ProjectInfo(object): :py:class:`pyEPR.ansys.HfssDMSetup`, eigenmode :py:class:`pyEPR.ansys.HfssEMSetup`, or Q3D :py:class:`pyEPR.ansys.AnsysQ3DSetup`), the 3D modeler to design geometry :py:class:`pyEPR.ansys.HfssModeler`. * **Junctions:** The class stores params about the design that the user puts will use, such as the names and - properties of the junctions, such as whihc rectangle and line is associated with which junction. + properties of the junctions, such as which rectangle and line is associated with which junction.
Note: @@ -55,7 +55,7 @@ class ProjectInfo(object): * ``rect`` (str): String of Ansys name of the rectangle on which the lumped boundary condition is defined. * ``line`` (str): - Name of HFSS polyline which spans the length of the recntalge. + Name of HFSS polyline which spans the length of the rectangle. Used to define the voltage across the junction. Used to define the current orientation for each junction. Used to define sign of ZPF. @@ -133,7 +133,7 @@ def __setitem__(self, key, value): def __getitem__(self, attr): if not (attr in diss_opt or attr == 'pinfo'): - raise AttributeError(f'dissipitive has no attribute "{attr}". '\ + raise AttributeError(f'dissipative has no attribute "{attr}". '\ f'The possible attributes are:\n {str(diss_opt)}') return super().__getattribute__(attr) @@ -144,7 +144,7 @@ def __setattr__(self, attr, value): self[attr] = value def __getattr__(self, attr): - raise AttributeError(f'dissipitive has no attribute "{attr}". '\ + raise AttributeError(f'dissipative has no attribute "{attr}". '\ f'The possible attributes are:\n {str(diss_opt)}') def __getattribute__(self, attr): @@ -157,7 +157,7 @@ def __repr__(self): return str(self.data()) def data(self): - """Return dissipatvie as dictionary""" + """Return dissipative as dictionary""" return {str(opt): self[opt] for opt in diss_opt} def __init__(self, @@ -191,7 +191,7 @@ def __init__(self, self.design_name = design_name self.setup_name = setup_name - # HFSS desgin: describe junction parameters + # HFSS design: describe junction parameters # TODO: introduce modal labels self.junctions = Dict() # See above for help self.ports = Dict() @@ -200,7 +200,7 @@ def __init__(self, self.dissipative = self._Dissipative() self.options = config.ansys - # Conected to HFSS variable + # Connected to HFSS variable self.app = None self.desktop = None self.project = None @@ -218,7 +218,7 @@ def __init__(self, def save(self): ''' - Return all the data in a dectionary form that can be used to be saved + Return all the data in a dictionary form that can be used to be saved ''' return dict( pinfo=pd.Series(get_instance_vars(self, self._Forbidden)), diff --git a/pyEPR/toolbox/_logging.py b/pyEPR/toolbox/_logging.py index e3c6582..4fee6ea 100644 --- a/pyEPR/toolbox/_logging.py +++ b/pyEPR/toolbox/_logging.py @@ -7,7 +7,7 @@ def set_up_logger(logger): logger.c_handler = logging.StreamHandler() # Jupyter notebooks already has a stream handler on the default log, - # Do not propage upstream to the root logger. + # Do not propagate upstream to the root logger. 
# https://stackoverflow.com/questions/31403679/python-logging-module-duplicated-console-output-ipython-notebook-qtconsole logger.propagate = False diff --git a/pyEPR/toolbox/plotting.py b/pyEPR/toolbox/plotting.py index c1402e0..5a5e08e 100644 --- a/pyEPR/toolbox/plotting.py +++ b/pyEPR/toolbox/plotting.py @@ -26,7 +26,7 @@ def mpl_dpi(dpi=200): ''' - Set the matpllib resolution for images dots per inch + Set the matplotlib resolution for images in dots per inch ''' mpl.rcParams['figure.dpi'] = dpi mpl.rcParams['savefig.dpi'] = dpi @@ -36,7 +36,7 @@ def plt_cla(ax: Axes): ''' Clear all plotted objects on an axis - ax : mapltlib axis + ax : matplotlib axis ''' ax = ax if not ax is None else plt.gca() for artist in ax.lines + ax.collections + ax.patches + ax.images + ax.texts: @@ -67,7 +67,7 @@ def legend_translucent(ax: Axes, values=[], loc=0, alpha=0.5, leg_kw={}): def get_last_color(ax: Axes): ''' - gets the color fothe last plotted line + gets the color for the last plotted line use: datai.plot(label=name, marker='o') data.plot(label=name, marker='o', c=get_last_color(plt.gca())) @@ -141,7 +141,7 @@ def xarr_heatmap(fg, title=None, kwheat={}, fmt=('%.3f', '%.2f'), fig=None): ''' fig = plt.figure() if fig == None else fig df = fg.to_pandas() - # format indecies + # format indices df.index = [float(fmt[0] % x) for x in df.index] df.columns = [float(fmt[1] % x) for x in df.columns] import seaborn as sns @@ -161,7 +161,7 @@ def xarr_heatmap(fg, title=None, kwheat={}, fmt=('%.3f', '%.2f'), fig=None): Not seeing widgets: https://github.com/tqdm/tqdm/issues/451 conda update tqdm - # This might aleady work, will require a lot of updates, if not then do: + # This might already work, will require a lot of updates, if not then do: conda install nodejs jupyter labextension install @jupyter-widgets/jupyterlab-manager jupyter nbextension enable --py widgetsnbextension diff --git a/pyEPR/toolbox/pythonic.py b/pyEPR/toolbox/pythonic.py index 18803be..d0b015c 100644 --- a/pyEPR/toolbox/pythonic.py +++ b/pyEPR/toolbox/pythonic.py @@ -91,7 +91,7 @@ def df_find_index(s: pd.Series, find, degree=2, ax=False): def df_interpolate_value(s: pd.Series, find, ax=False, method='index'): """ Given a Pandas Series such as of freq with index Lj, - find the freq that would correspnd to Lj given a value not in the index + find the freq that would correspond to Lj given a value not in the index """ z = pd.Series(list(s) + [np.NaN], index=list(s.index.values)+[find]) z = z.sort_index() @@ -150,7 +150,7 @@ def sort_df_col(df): ''' sort by numerical int order ''' return df.sort_index(axis=1) - # Buggy code, deosnt handles ints as inputs or floats as inpts + # Buggy code, doesn't handle ints as inputs or floats as inputs col_names = df.columns if np.all(col_names.map(isint)): return df[col_names.astype(int).sort_values().astype(str)] @@ -184,7 +184,7 @@ def get_instance_vars(obj, Forbidden=[]): def deprecated(func): """This is a decorator which can be used to mark functions - as deprecated. It will result in a warning being emmitted + as deprecated. It will result in a warning being emitted when the function is used. See StackExchange""" def newFunc(*args, **kwargs): warnings.simplefilter('always', DeprecationWarning) # turn off filter @@ -268,7 +268,7 @@ class Print_colors: '''Colors class:reset all colors with colors.reset; two sub classes fg for foreground and bg for background; use as colors.subclass.colorname. - i.e. colors.fg.red or colors.bg.greenalso, the generic bold, disable, + i.e.
colors.fg.red or colors.bg.green also, the generic bold, disable, underline, reverse, strike through, and invisible work with the main class i.e. colors.bold https://www.geeksforgeeks.org/print-colors-python-terminal/ @@ -323,7 +323,7 @@ class bg: def DataFrame_col_diff(PS, indx=0): ''' check weather the columns of a dataframe are equal, - returns a T/F series of the row index that specifies which rows are differnt + returns a T/F series of the row index that specifies which rows are different USE: PS[DataFrame_col_diff(PS)] ''' diff --git a/scripts/Alec/11ghz/EPR_test.py b/scripts/Alec/11ghz/EPR_test.py index 4701f32..5526848 100644 --- a/scripts/Alec/11ghz/EPR_test.py +++ b/scripts/Alec/11ghz/EPR_test.py @@ -11,7 +11,7 @@ # Specify the HFSS project to be analyzed project_info = ProjectInfo(r"C:\Users\awe4\Documents\Backed\hfss_simulations\11ghz\\") project_info.project_name = '11ghz_alec' # Name of the project file (string). "None" will get the current active one. - project_info.design_name = '11ghz_design1' # Name of the desgin file (string). "None" will get the current active one. + project_info.design_name = '11ghz_design1' # Name of the design file (string). "None" will get the current active one. project_info.setup_name = None # Name of the setup(string). "None" will get the current active one. project_info.junctions['bot_junc'] = {'rect':'bot_junction', 'line': 'bot_junc_line', 'Lj_variable':'bot_lj', 'length':0.0001} diff --git a/scripts/Alec/7ghz/7ghz_pyEPR.py b/scripts/Alec/7ghz/7ghz_pyEPR.py index ad6633f..79484c5 100644 --- a/scripts/Alec/7ghz/7ghz_pyEPR.py +++ b/scripts/Alec/7ghz/7ghz_pyEPR.py @@ -11,13 +11,13 @@ # Specify the HFSS project to be analyzed project_info = ProjectInfo(r"C:\Users\awe4\Documents\Simulations\HFSS\11ghz\\") project_info.project_name = '2017_08_Zlatko_Shyam_AutStab' # Name of the project file (string). "None" will get the current active one. - project_info.design_name = 'pyEPR_2_chips' # Name of the desgin file (string). "None" will get the current active one. + project_info.design_name = 'pyEPR_2_chips' # Name of the design file (string). "None" will get the current active one. project_info.setup_name = None # Name of the setup(string). "None" will get the current active one. - ## Describe the junctions in the HFSS desgin + ## Describe the junctions in the HFSS design project_info.junctions['jAlice'] = {'rect':'qubitAlice', 'line': 'alice_line', 'Lj_variable':'LJAlice', 'length':0.0001} project_info.junctions['jBob'] = {'rect':'qubitBob', 'line': 'bob_line', 'Lj_variable':'LJBob', 'length':0.0001} - # Dissipative elments EPR + # Dissipative elements EPR project_info.dissipative['dielectric_surfaces'] = None # supply names here, there are more options in project_info.dissipative. # Run analysis diff --git a/scripts/Kaicheng/import_pyEPR.py b/scripts/Kaicheng/import_pyEPR.py index 72bb9db..8746539 100644 --- a/scripts/Kaicheng/import_pyEPR.py +++ b/scripts/Kaicheng/import_pyEPR.py @@ -11,14 +11,14 @@ # Specify the HFSS project to be analyzed project_info = ProjectInfo(r"X:\Simulation\\hfss\\KC\\") project_info.project_name = '2013-12-03_9GHzCavity' # Name of the project file (string). "None" will get the current active one. - project_info.design_name = '9GHz_EM_center_SNAIL' # Name of the desgin file (string). "None" will get the current active one. + project_info.design_name = '9GHz_EM_center_SNAIL' # Name of the design file (string). "None" will get the current active one. project_info.setup_name = None # Name of the setup(string). 
"None" will get the current active one. - ## Describe the junctions in the HFSS desgin + ## Describe the junctions in the HFSS design project_info.junctions['snail'] = {'rect':'qubit', 'line': 'JunctionLine', 'Lj_variable':'LJ', 'length':0.0001} # project_info.junctions['jBob'] = {'rect':'qubitBob', 'line': 'bob_line', 'Lj_variable':'LJBob', 'length':0.0001} - # Dissipative elments EPR + # Dissipative elements EPR project_info.dissipative['dielectric_surfaces'] = None # supply names here, there are more options in project_info.dissipative. # Run analysis diff --git a/scripts/hanhee/run_vs_pass.py b/scripts/hanhee/run_vs_pass.py index 6b72cbf..12aee7a 100644 --- a/scripts/hanhee/run_vs_pass.py +++ b/scripts/hanhee/run_vs_pass.py @@ -161,7 +161,7 @@ def zkm_get_Hparams(PJ, SJ, Om, EJ, PHI): def do_plot(RES): ''' Make sure - %matplolib qt + %matplotlib qt TODO: in future just setup once, and then update lines only ''' # live plot https://stackoverflow.com/questions/11874767/how-do-i-plot-in-real-time-in-a-while-loop-using-matplotlib diff --git a/scripts/minev/hfss-scripts/2017_10 R3C1 resim.py b/scripts/minev/hfss-scripts/2017_10 R3C1 resim.py index a3eee27..f0eb107 100644 --- a/scripts/minev/hfss-scripts/2017_10 R3C1 resim.py +++ b/scripts/minev/hfss-scripts/2017_10 R3C1 resim.py @@ -10,11 +10,11 @@ project_info.design_name = '3. sweep both' project_info.setup_name = None - ## Describe the junctions in the HFSS desgin + ## Describe the junctions in the HFSS design project_info.junctions['jBright'] = {'rect':'juncV', 'line': 'juncH_line', 'Lj_variable':'LJ1', 'length':0.0001} project_info.junctions['jDark'] = {'rect':'juncH', 'line': 'juncV_line', 'Lj_variable':'LJ2', 'length':0.0001} - # Dissipative elments EPR + # Dissipative elements EPR project_info.dissipative['dielectric_surfaces'] = None # supply names here, there are more options in project_info.dissipative. # Run analysis diff --git a/scripts/minev/hfss-scripts/import_pyEPR.py b/scripts/minev/hfss-scripts/import_pyEPR.py index 182fbe9..6d6e1fb 100644 --- a/scripts/minev/hfss-scripts/import_pyEPR.py +++ b/scripts/minev/hfss-scripts/import_pyEPR.py @@ -11,14 +11,14 @@ # Specify the HFSS project to be analyzed project_info = ProjectInfo(r"C:\\Users\\rslqulab\Desktop\\Lysander\participation_ratio_project\\Shyam's autonomous stabilization simulations\\") project_info.project_name = '2017_08_Zlatko_Shyam_AutStab' # Name of the project file (string). "None" will get the current active one. - project_info.design_name = '2 pyEPR' # Name of the desgin file (string). "None" will get the current active one. + project_info.design_name = '2 pyEPR' # Name of the design file (string). "None" will get the current active one. project_info.setup_name = None # Name of the setup(string). "None" will get the current active one. - ## Describe the junctions in the HFSS desgin + ## Describe the junctions in the HFSS design project_info.junctions['jAlice'] = {'rect':'qubitAlice', 'line': 'alice_line', 'Lj_variable':'LJAlice', 'length':0.0001} project_info.junctions['jBob'] = {'rect':'qubitBob', 'line': 'bob_line', 'Lj_variable':'LJBob', 'length':0.0001} - # Dissipative elments EPR + # Dissipative elements EPR project_info.dissipative['dielectric_surfaces'] = None # supply names here, there are more options in project_info.dissipative. 
    # Run analysis
diff --git a/scripts/nick/import_pyEPR.py b/scripts/nick/import_pyEPR.py
index 0ab53f6..0564998 100644
--- a/scripts/nick/import_pyEPR.py
+++ b/scripts/nick/import_pyEPR.py
@@ -11,14 +11,14 @@
    # Specify the HFSS project to be analyzed
    project_info = ProjectInfo(r"X:\Simulation\\hfss\\KC\\")
    project_info.project_name = '2013-12-03_9GHzCavity' # Name of the project file (string). "None" will get the current active one.
-   project_info.design_name = '9GHz_EM_center_SNAIL' # Name of the desgin file (string). "None" will get the current active one.
+   project_info.design_name = '9GHz_EM_center_SNAIL' # Name of the design file (string). "None" will get the current active one.
    project_info.setup_name = None # Name of the setup(string). "None" will get the current active one.

-   ## Describe the junctions in the HFSS desgin
+   ## Describe the junctions in the HFSS design
    project_info.junctions['snail'] = {'rect':'qubit', 'line': 'JunctionLine', 'Lj_variable':'LJ', 'length':0.0001}
    # project_info.junctions['jBob'] = {'rect':'qubitBob', 'line': 'bob_line', 'Lj_variable':'LJBob', 'length':0.0001}

-   # Dissipative elments EPR
+   # Dissipative elements EPR
    project_info.dissipative['dielectric_surfaces'] = None # supply names here, there are more options in project_info.dissipative.

    # Run analysis
diff --git a/tests/README.md b/tests/README.md
index 8c12724..d175cba 100644
--- a/tests/README.md
+++ b/tests/README.md
@@ -1,5 +1,5 @@
# Unit Tests Module (work-in-progress)
-This module is used for development unit testing, to be expandad in the future.
+This module is used for development unit testing, to be expanded in the future.
This is based on the python [built-in unittest module](https://docs.python.org/3/library/unittest.html).

To execute a specific unit test you can run `python -m unittest test_name.py`, or to automatically run all unit tests you can run `python -m unittest`
diff --git a/tests/test_project_info.py b/tests/test_project_info.py
index 9cbb9dd..828462c 100644
--- a/tests/test_project_info.py
+++ b/tests/test_project_info.py
@@ -14,7 +14,7 @@ def setUp(self):
            assert ConnectionError('Failed to connect to HFSS. Opening it manually')

    def test_dissipative(self):
-       '''Test change of _Dissipative from a class to a dict with deprecation warninngs'''
+       '''Test change of _Dissipative from a class to a dict with deprecation warnings'''
        self.assertRaises(Exception, self.pinfo.dissipative.__getattr__, 'mot_exist',
                          msg='Failed calling non-existing attr')
        self.assertRaises(Exception, self.pinfo.dissipative.__getitem__, 'not_exist',
diff --git a/tests/test_quantum_analysis.py b/tests/test_quantum_analysis.py
index eb4dbdf..27672de 100644
--- a/tests/test_quantum_analysis.py
+++ b/tests/test_quantum_analysis.py
@@ -27,7 +27,7 @@ def test_analyze_all_variations(self):
        '''
        results = self.epra.analyze_all_variations(
            cos_trunc=8, fock_trunc=15, print_result=False)['0'] # Variation 0
-       # TODO: Remove start/finish diagnolization messages (back_box_numeric L:153)
+       # TODO: Remove start/finish diagonalization messages (back_box_numeric L:153)
        for key, value in results.items():
            if key == 'hfss_variables': # All numeric-only datatypes
@@ -42,7 +42,7 @@ def test_analyze_variation(self):
        pass

    def test_hamiltonian(self):
-       pass # TODO: Need to pass **kwargs to epr_num_diag for return_H opption
+       pass # TODO: Need to pass **kwargs to epr_num_diag for return_H option

    def test_properties(self):
        pass
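
For reference, the example scripts and tests patched above all follow the same pyEPR workflow: point a `ProjectInfo` at the HFSS project, register each junction's rectangle and polyline, flag any dissipative surfaces, then run the distributed (eigenmode) analysis followed by the quantum analysis. Below is a minimal sketch of that flow. It assumes the v0.8-dev class names `DistributedAnalysis` and `QuantumAnalysis` and their `do_EPR_analysis()` / `data_filename` interface (check the current tutorials and docs if these have changed); the project path, project/design names, and junction names are placeholders.

```python
from pyEPR import ProjectInfo, DistributedAnalysis, QuantumAnalysis  # v0.8-dev names (assumed)

# Point pyEPR at the HFSS project. Path and names are placeholders;
# setting a name to None picks the currently active project/design/setup.
project_info = ProjectInfo(r"C:\path\to\hfss\project\\")
project_info.project_name = 'my_project'
project_info.design_name = 'my_design'
project_info.setup_name = None

# Describe the junctions in the HFSS design (same keys as the scripts above).
project_info.junctions['j1'] = {'rect': 'junc_rect',        # rectangle carrying the lumped boundary
                                'line': 'junc_line',        # polyline spanning the rectangle
                                'Lj_variable': 'Lj_1',       # HFSS design variable for the inductance
                                'length': 0.0001}

# Dissipative elements for EPR loss estimates (more options exist in project_info.dissipative).
project_info.dissipative['dielectric_surfaces'] = None

# Distributed (classical eigenmode) analysis, then quantum (EPR) analysis.
eprd = DistributedAnalysis(project_info)     # assumed class name
eprd.do_EPR_analysis()                       # assumed method name
epra = QuantumAnalysis(eprd.data_filename)   # assumed constructor signature
results = epra.analyze_all_variations(cos_trunc=8, fock_trunc=15)
```

As in `test_analyze_all_variations` above, the returned results can be indexed by variation, e.g. `results['0']` for variation 0.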