Fix/content dates for new version #365

Merged · 8 commits · Oct 2, 2023
Changes from 5 commits
21 changes: 21 additions & 0 deletions .github/workflows/ci.yaml
@@ -0,0 +1,21 @@
name: CI
on:
push:
pull_request:
types: [opened, reopened]
k-burt-uch marked this conversation as resolved.
concurrency:
group: '${{ github.workflow }} @ ${{ github.event.pull_request.head.label || github.head_ref || github.ref }}'
cancel-in-progress: true
jobs:
Security:
name: Security Pipeline
uses: uc-cdis/.github/.github/workflows/securitypipeline.yaml@master
with:
python-poetry: 'false'
secrets: inherit

UnitTest:
name: Python Unit Test with Postgres
uses: uc-cdis/.github/.github/workflows/python_unit_test.yaml@master
Contributor
The default script runs `poetry run pytest -vv`, but in indexd we have coveralls set up and we want to report coverage updates, so I think we actually need a custom script that does this: https://github.com/uc-cdis/indexd/blob/2023.09/.travis.yml#L20-L23

Contributor
@Avantol13 · Sep 28, 2023
I'd actually be okay leaving this as-is, because if we want to add consistent coverage in the future, we could just modify the template. But we'd lose coverage reporting in the meantime. I don't have a strong opinion either way.

Contributor Author
It seems we've lost coverage reporting already, judging by the last few closed PRs for indexd.

Checking this PR #361 and this one #360, there's no coveralls run. It seems the last one was Atharva's PR here: #359

I could be missing something, but since it seems we've already lost coverage reporting, my vote is to add it to the template in a separate PR. Sound good, @paulineribeyre?

Contributor
I don't know why coveralls is not reporting as a check on those PRs, but you can see in coveralls that it did get the report, for example for #361:
[Screenshot: coveralls report for #361, 2023-09-28]

But anyway, I'm referring to the coverage badge on this repo's readme more than to the PR checks. It would be nice to include the coveralls logic in the template, but since it's not there yet, I think removing it would be a regression and the coverage % on the repo readme would become inaccurate. To be honest, I think it's probably not a heavy lift to add it, but if it is for some reason, we can file a ticket instead.

Contributor Author
Ah, that makes sense. Given that https://github.com/uc-cdis/sheepdog#sheepdog has broken tags now I'll see what I can do about getting this working here and moving it to the template.

Reading the docs here https://github.com/marketplace/actions/coveralls-github-action it looks fairly straightforward but I might have to DM you with questions if I run into something I can't find in the coveralls config or guides.
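For reference, one possible shape for the custom coverage job discussed above. This is a sketch only: the step layout, action versions, and the use of the `coveralls` Python package with `--service=github` are assumptions, not taken from the actual `.travis.yml` or the uc-cdis workflow template.

```yaml
# Hypothetical unit-test job that reports coverage to Coveralls.
# Action versions and the coveralls invocation are assumptions to verify.
UnitTest:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-python@v4
      with:
        python-version: '3.9'
    - name: Install dependencies
      run: |
        pip install poetry
        poetry install -vv
    - name: Run tests with coverage
      run: poetry run pytest -vv --cov=indexd --cov-report=term
    - name: Report to Coveralls
      run: poetry run coveralls --service=github
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```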

with:
python-version: '3.9'
26 changes: 13 additions & 13 deletions .secrets.baseline
@@ -94,13 +94,13 @@
}
],
"results": {
".travis.yml": [
".github/workflows/ci.yaml": [
{
"type": "Base64 High Entropy String",
"filename": ".travis.yml",
"hashed_secret": "8c0fdc72bf4daf2a2338287334d0cbd221fa9ef4",
"type": "Secret Keyword",
"filename": ".github/workflows/ci.yaml",
"hashed_secret": "3e26d6750975d678acb8fa35a0f69237881576b0",
"is_verified": false,
"line_number": 27
"line_number": 15
}
],
"README.md": [
@@ -349,49 +349,49 @@
"filename": "tests/test_client.py",
"hashed_secret": "ff9c79b737b3ea7386618cc9437d3fb0a772182b",
"is_verified": false,
"line_number": 2406
"line_number": 2408
},
{
"type": "Hex High Entropy String",
"filename": "tests/test_client.py",
"hashed_secret": "c8176f1e75e62e15dabaa4087fb7194451c8f6d2",
"is_verified": false,
"line_number": 2409
"line_number": 2411
},
{
"type": "Hex High Entropy String",
"filename": "tests/test_client.py",
"hashed_secret": "d5198f8eddb1cbeb437899cd99e5ee97ab8531b4",
"is_verified": false,
"line_number": 2409
"line_number": 2411
},
{
"type": "Hex High Entropy String",
"filename": "tests/test_client.py",
"hashed_secret": "02dc196562514eaa3e2feac1f441ccf6ad81e09d",
"is_verified": false,
"line_number": 2413
"line_number": 2415
},
{
"type": "Hex High Entropy String",
"filename": "tests/test_client.py",
"hashed_secret": "f1cb2d91a95165a2ab909eadd9f7b65f312c7e2d",
"is_verified": false,
"line_number": 2414
"line_number": 2416
},
{
"type": "Hex High Entropy String",
"filename": "tests/test_client.py",
"hashed_secret": "58db546de03270b55a4c889a5c5e6296b29fef25",
"is_verified": false,
"line_number": 2415
"line_number": 2417
},
{
"type": "Hex High Entropy String",
"filename": "tests/test_client.py",
"hashed_secret": "b6c0bd08fde409c18760f32bef8705191840c402",
"is_verified": false,
"line_number": 2416
"line_number": 2418
}
],
"tests/test_deprecated_aliases_endpoints.py": [
@@ -413,5 +413,5 @@
}
]
},
"generated_at": "2023-04-20T22:58:41Z"
"generated_at": "2023-09-27T23:03:38Z"
}
27 changes: 0 additions & 27 deletions .travis.yml

This file was deleted.

7 changes: 4 additions & 3 deletions indexd/app.py
@@ -17,17 +17,18 @@
from .blueprint import blueprint as cross_blueprint
from indexd.urls.blueprint import blueprint as index_urls_blueprint

logger = cdislogging.get_logger(__name__)


def app_init(app, settings=None):
app.url_map.strict_slashes = False
app.logger.addHandler(cdislogging.get_stream_handler())
if not settings:
from .default_settings import settings
app.config.update(settings["config"])

if settings.get("AUTO_MIGRATE", True):
engine_name = settings["config"]["INDEX"]["driver"].engine.dialect.name
app.logger.info(f"Auto migrating. Engine name: {engine_name}")
logger.info(f"Auto migrating. Engine name: {engine_name}")
if engine_name == "sqlite":
IndexBase.metadata.create_all()
AliasBase.metadata.create_all()
@@ -37,7 +37,7 @@ def app_init(app, settings=None):
else:
alembic_main(["--raiseerr", "upgrade", "head"])
else:
app.logger.info("Auto migrations are disabled")
logger.info("Auto migrations are disabled")

app.auth = settings["auth"]
app.hostname = os.environ.get("HOSTNAME") or "http://example.io"
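The hunk above swaps `app.logger` for a module-level cdislogging logger, which is usable at import time and outside any Flask app context. A minimal analogue of the pattern, using stdlib `logging` in place of cdislogging (an assumption: cdislogging wraps stdlib logging with stream handlers and formatting):

```python
import logging

# Module-level logger: available at import time, no Flask app object needed,
# unlike app.logger which is tied to the app instance.
logger = logging.getLogger(__name__)

def app_init_sketch(auto_migrate=True):
    """Illustrative stand-in for the logging calls in app_init."""
    if auto_migrate:
        logger.info("Auto migrating.")
    else:
        logger.info("Auto migrations are disabled")
    return auto_migrate
```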
35 changes: 25 additions & 10 deletions indexd/index/drivers/alchemy.py
@@ -676,6 +676,20 @@ def get_urls(self, size=None, hashes=None, ids=None, start=0, limit=100):
for r in query
]

def _validate_and_format_content_dates(
self, record, content_created_date, content_updated_date
):
if content_created_date is not None:
record.content_created_date = datetime.datetime.fromisoformat(
content_created_date
)
# Users cannot set content_updated_date without a content_created_date
record.content_updated_date = (
datetime.datetime.fromisoformat(content_updated_date)
if content_updated_date is not None
else record.content_created_date # Set updated to created if no updated is provided
)

def add(
self,
form,
@@ -755,16 +769,11 @@ def add(

record.description = description

if content_created_date is not None:
record.content_created_date = datetime.datetime.fromisoformat(
content_created_date
)
# Users cannot set content_updated_date without a content_created_date
record.content_updated_date = (
datetime.datetime.fromisoformat(content_updated_date)
if content_updated_date is not None
else record.content_created_date # Set updated to created if no updated is provided
)
self._validate_and_format_content_dates(
record=record,
content_created_date=content_created_date,
content_updated_date=content_updated_date,
)

session.merge(base_version)

@@ -1384,6 +1393,12 @@ def add_version(
for m_key, m_value in metadata.items()
]

self._validate_and_format_content_dates(
record=record,
content_created_date=content_created_date,
content_updated_date=content_updated_date,
)

try:
session.add(record)
create_urls_metadata(urls_metadata, record, session)
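A standalone sketch (not indexd code; the function name is illustrative) of the logic the new `_validate_and_format_content_dates` helper centralizes for both `add` and `add_version`: ISO strings are parsed with `datetime.fromisoformat`, and `content_updated_date` falls back to `content_created_date` when not supplied.

```python
import datetime

def resolve_content_dates(created, updated):
    """Mirror the helper's fallback: updated defaults to created.

    Returns (created_dt, updated_dt). Both stay None when no created
    date is given, since users cannot set updated without created.
    """
    if created is None:
        return None, None
    created_dt = datetime.datetime.fromisoformat(created)
    updated_dt = (
        datetime.datetime.fromisoformat(updated) if updated is not None else created_dt
    )
    return created_dt, updated_dt

# With only a created date, updated defaults to the same timestamp.
c, u = resolve_content_dates("2023-03-13T17:02:54", None)
print(c == u)  # True
```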
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "indexd"
version = "5.0.0"
version = "5.0.2"
description = "Gen3 Indexing Service"
authors = ["CTDS UChicago <cdis@uchicago.edu>"]
license = "Apache-2.0"
2 changes: 2 additions & 0 deletions tests/test_client.py
@@ -2115,6 +2115,8 @@ def test_create_index_version(client, user):
"urls": ["s3://endpointurl/bucket2/key"],
"hashes": {"md5": "8b9942cf415384b27cadf1f4d2d981f5"},
"acl": ["a"],
"content_updated_date": "2023-03-14T17:02:54",
"content_created_date": "2023-03-13T17:02:54",
}

res_2 = client.post("/index/" + rec["did"], json=dataNew, headers=user)