- Hello ${fullname},
-
- Your submission [${job_id}] of file [${query_file_full_path}] to Boa was successful.
-
- The result has been uploaded to OSF and stored in file [${output_file_name}] under the same folder where you submitted the file.
- Visit your project to access the result.
-
- In addition, the Boa job ID for this submission is [${job_id}]. Visit Boa's job list page for more information.
-
- Sincerely,
-
- The OSF Team
-
- Hello ${fullname},
-
- Your submission of file [${query_file_full_path}] from your OSF project to Boa has failed.
-
- % if code == 1:
- OSF cannot log in to Boa. Please fix your Boa addon configuration on OSF and try again.
-
- For details, visit Boa's job list page. The Boa job ID for this submission is [${job_id}].
- % elif code == 2:
- The query you submitted encountered a compile or run-time error. Please fix your query file and try again.
-
- For details, visit Boa's job list page. The Boa job ID for this submission is [${job_id}].
- % elif code == 3:
- Your query has completed on Boa and the job ID is [${job_id}].
-
- However, we were not able to upload the result to your OSF project because an output file named [${output_file_name}] already exists.
-
- Please either rename your query file or remove the existing result file and try again.
-
- In addition, you can visit Boa's job list page to retrieve the results.
- % elif code == 4:
- Your query has completed on Boa and the job ID is [${job_id}]. However, we were not able to upload the result to OSF.
-
- Visit Boa's job list page to retrieve the results.
- % elif code == 5:
- Your query has completed on Boa and the job ID is [${job_id}]. However, we were not able to retrieve the output from Boa.
-
- A common cause of this failure is that the output is empty. Visit Boa's job list page to check if the output is empty.
-
- If you believe this is in error, contact Boa Support at ${boa_support_email}.
- % elif code == 6:
- OSF cannot submit your query file to Boa because it is too large: [${file_size} Bytes] exceeds the maximum allowed size of [${max_file_size} Bytes].
-
- If you believe this is in error, contact OSF Help Desk at ${osf_support_email}.
- % elif code == 7:
- It's been ${max_job_wait_hours} hours since we submitted your query job [${job_id}] to Boa.
-
- However, OSF has not received confirmation from Boa that the job has finished.
-
- Visit Boa's job list page to check its status.
-
- If you believe this is in error, contact OSF Help Desk at ${osf_support_email}.
- % else:
- OSF encountered an unexpected error when connecting to Boa. Please try again later.
-
- If this issue persists, contact OSF Help Desk at ${osf_support_email} and attach the following error message.
-
- ${message}
- % endif
-
- Sincerely,
-
- The OSF Team
-
- Hello ${fullname},
-
- You recently attempted to interact with the Meeting service via email, but this service has been discontinued and is no longer available for new interactions.
-
- Existing meetings and past submissions remain unchanged. If you have any questions or need further assistance, please contact our support team at [ ${support_email} ].
-
- Sincerely yours,
-
- The OSF Robot
-
-
-%def>
\ No newline at end of file
diff --git a/website/templates/emails/confirm_agu_conference.html.mako b/website/templates/emails/confirm_agu_conference.html.mako
deleted file mode 100644
index 603e2c39e8d..00000000000
--- a/website/templates/emails/confirm_agu_conference.html.mako
+++ /dev/null
@@ -1,26 +0,0 @@
-<%inherit file="notify_base.mako" />
-
-<%def name="content()">
-
-
- Hello ${user.fullname},
-
-
- Thank you for joining us at the AGU Open Science Pavilion, and welcome to the Open Science Framework (OSF).
-
- We are pleased to offer AGU attendees an exclusive 1:1 consultation to continue our conversation and to help
- you get oriented on the OSF. This is an opportunity for us to show you useful OSF features, talk about
- open science in Earth and space sciences, and for you to ask any questions you may have.
- You can sign up to participate by completing this form, and a member of our team will be in touch to
- determine your availability:
-
- https://docs.google.com/forms/d/e/1FAIpQLSeJ23YPaEMdbLY1OqbcP85Tt6rhLpFoOtH0Yg4vY_wSKULRcw/viewform?usp=sf_link
-
- To confirm your OSF account, please verify your email address by visiting this link:
-
- ${confirmation_url}
-
- From the team at the Center for Open Science
-
-
- Hello ${user.fullname},
-
-
- Thank you for joining us at the AGU Open Science Pavilion, and welcome to the Open Science Framework.
-
- We are pleased to offer AGU attendees an exclusive community call to continue our conversation and to help
- you get oriented on the OSF. This is an opportunity for us to show you useful OSF features, talk about
- open science in your domains, and for you to ask any questions you may have.
- You can register for this free event here:
-
- https://cos-io.zoom.us/meeting/register/tZAuceCvrjotHNG3n6XzLFDv1Rnn2hkjczHr
-
- To confirm your OSF account, please verify your email address by visiting this link:
-
- ${confirmation_url}
-
- From the team at the Center for Open Science
-
-
Your OSF login has changed - here's what you need to know!
-
-
-
-
- Hello, ${user.fullname},
-
- Starting today, you can no longer sign into OSF using your institution's SSO. However, you will not lose access to your account or your OSF content.
-
- You can still access your OSF account using your institutional email by adding a password, or using your ORCID credentials (if your institutional email address is associated with your ORCID record).
- We also recommend having multiple ways to access your account by connecting your ORCID
- or alternate email addresses with your account.
-
- Click here to set a password
-
- If you have any issues, questions or need our help, contact ${osf_support_email} and we will be happy to assist.
- You may find this help guide useful.
-
- Sincerely,
-
- The OSF Team
-
- The Quick Files feature has been discontinued and your files have been migrated into an OSF Project. You can find the new Project on your My Projects page, entitled "${user.fullname}'s Quick Files". Your favorite Quick Files features are still present; you can view, download, and share your files from their new location. Your file URLs will also continue to resolve properly, and you can still move your files between Projects by linking your Projects. Contact ${settings.OSF_CONTACT_EMAIL} if you have any questions or concerns.
-
-
- Thank you for partnering with us as a stakeholder in open science and in the success of the infrastructure that helps make it possible.
-
-
- The Center for Open Science Team
-
-
- Sincerely,
- The OSF Team
-
-
- Want more information? Visit ${settings.DOMAIN} to learn about the OSF,
- or https://cos.io/ for information about its supporting organization,
- the Center for Open Science.
-
Registrations Were Not Bulk Uploaded to your Community's Registry
-
-
-
-
- Hello ${fullname},
-
- None of the ${count} registrations could be uploaded because duplicate rows were found either within the uploaded CSV file
- or in our system. The duplicates are listed below. Review the file and try to upload the registrations again after
- removing the duplicates. Contact the Help Desk at ${osf_support_email} if
- you continue to have issues.
-
-
- Hello,
-
- [${user}] from registry [${provider_name}] attempted to upload the registrations from a CSV file. Review the
- file and inform the engineers of the issue. The registry has been notified of the problem and is waiting on a
- response. Below is the error message provided by the system.
-
- ${message}
-
- Sincerely,
-
- The OSF Team
-
Registrations Were Not Bulk Uploaded to your Community's Registry
-
-
-
-
- Hello ${fullname},
-
- Your registrations were not uploaded. Our team has been notified of the issue and will follow up once they have
- looked into it. Contact the Help Desk at ${osf_support_email}
- if you continue to have questions.
-
- Sincerely,
-
- The OSF Team
-
- This message is coming from an institutional administrator at your institution.
-
- % if message_text:
-
- ${message_text}
-
- % endif
-
- Want more information? Visit ${settings.DOMAIN} to learn about OSF, or
- https://cos.io/ for information about its supporting organization, the Center
- for Open Science.
-
- Groups:
- %for i, group_name in enumerate(node['groups']):
-
- % if i == len(node['groups']) - 1:
- ${group_name}
- % else:
- ${group_name},
- % endif
-
- %endfor
-
- % endif
% if enable_institutions and not node['anonymous']:
% if (permissions.ADMIN in user['permissions'] and not node['is_registration']) and (len(node['institutions']) != 0 or len(user['institutions']) != 0):
Affiliated Institutions:
diff --git a/website/templates/search.mako b/website/templates/search.mako
index 78ec1b10e3d..bad65e38d81 100644
--- a/website/templates/search.mako
+++ b/website/templates/search.mako
@@ -249,16 +249,6 @@
- Hello ${fullname},
-
- You recently attempted to interact with the Meeting service via email, but this service has been discontinued and is no longer available for new interactions.
-
- Existing meetings and past submissions remain unchanged. If you have any questions or need further assistance, please contact our support team at [ ${support_email} ].
-
- Sincerely yours,
-
- The OSF Robot
-
-
-%def>
\ No newline at end of file
diff --git a/website/templates/emails/conference_failed.html.mako b/website/templates/emails/conference_failed.html.mako
deleted file mode 100644
index c64e44f210e..00000000000
--- a/website/templates/emails/conference_failed.html.mako
+++ /dev/null
@@ -1,16 +0,0 @@
-<%inherit file="notify_base.mako" />
-
-<%def name="content()">
-
-
- Hello ${fullname},
-
- You recently tried to create a project on the Open Science Framework via email, but your message did not contain any file attachments. Please try again, making sure to attach the files you'd like to upload to your message.
-
-
- Sincerely yours,
-
- The OSF Robot
-
-
- Hello ${fullname},
-
- You recently tried to create a project on the Open Science Framework via email, but the conference you attempted to submit to is not currently accepting new submissions. For a list of conferences, see [ ${presentations_url} ].
-
- Sincerely yours,
-
- The OSF Robot
-
-
- Hello ${fullname},
-
- Congratulations! You have successfully added your ${conf_full_name} ${presentation_type} to OSF.
-
- % if user_created:
- Your account on OSF has been created. To claim your account, please create a password by clicking here: ${set_password_url}. Please verify your profile information at: ${profile_url}.
-
- % endif
- You now have a permanent, citable URL that you can share: ${node_url}. All submissions for ${conf_full_name} may be viewed at the following link: ${conf_view_url}.
-
- % if is_spam:
- Your email was flagged as spam by our mail processing service. To prevent potential spam, we have made your project private. If this is a real project, please log in to your account, browse to your project, and click the "Make Public" button so that other users can view it.
-
- % endif
- Get more from OSF by enhancing your project with the following:
-
- * Collaborators/contributors to the submission
- * Charts, graphs, and data that didn't make it onto the submission
- * Links to related publications or reference lists
- * Connecting other accounts, like Dropbox, Google Drive, GitHub, figshare and Mendeley via add-on integration. Learn more and read the full list of available add-ons here.
-
- To learn more about OSF, read the Guides.
-
- Sincerely,
-
- The OSF Team
-
-
- Hello ${user.fullname},
-
-
- Thank you for joining us at the AGU Open Science Pavilion, and welcome to the Open Science Framework (OSF).
-
- We are pleased to offer AGU attendees an exclusive 1:1 consultation to continue our conversation and to help
- you get oriented on the OSF. This is an opportunity for us to show you useful OSF features, talk about
- open science in Earth and space sciences, and for you to ask any questions you may have.
- You can sign up to participate by completing this form, and a member of our team will be in touch to
- determine your availability:
-
- https://docs.google.com/forms/d/e/1FAIpQLSeJ23YPaEMdbLY1OqbcP85Tt6rhLpFoOtH0Yg4vY_wSKULRcw/viewform?usp=sf_link
-
- To confirm your OSF account, please verify your email address by visiting this link:
-
- ${confirmation_url}
-
- From the team at the Center for Open Science
-
-
- Hello ${user.fullname},
-
-
- Thank you for joining us at the AGU Open Science Pavilion, and welcome to the Open Science Framework.
-
- We are pleased to offer AGU attendees an exclusive community call to continue our conversation and to help
- you get oriented on the OSF. This is an opportunity for us to show you useful OSF features, talk about
- open science in your domains, and for you to ask any questions you may have.
- You can register for this free event here:
-
- https://cos-io.zoom.us/meeting/register/tZAuceCvrjotHNG3n6XzLFDv1Rnn2hkjczHr
-
- To confirm your OSF account, please verify your email address by visiting this link:
-
- ${confirmation_url}
-
- From the team at the Center for Open Science
-
-
- <%!
- from website import settings
- %>
- Hello ${user.fullname},
-
- ${referrer_name + ' has given your group, ' + group_name + ',' if referrer_name else 'Your group, ' + group_name + ', has been given'} ${permission} permissions to the project "${node.title}" on OSF: ${node.absolute_url}
-
- You will ${'not receive ' if all_global_subscriptions_none else 'be automatically subscribed to '}notification emails for this project. To change your email notification preferences, visit your project or your user settings: ${settings.DOMAIN + "settings/notifications/"}
-
- Sincerely,
-
- Open Science Framework Robot
-
- Want more information? Visit https://osf.io/ to learn about the Open Science Framework, or https://cos.io/ for information about its supporting organization, the Center for Open Science.
-
- Questions? Email ${osf_contact_email}
-
-
- <%!
- from website import settings
- %>
- Hello ${user.fullname},
-
- ${referrer_name + ' has added you' if referrer_name else 'You have been added'} as a ${permission} of the group "${group_name}" on OSF.
-
- If you have erroneously been added to the group "${group_name}," please contact a group administrator.
-
- Sincerely,
-
- Open Science Framework Robot
-
- Want more information? Visit https://osf.io/ to learn about the Open Science Framework, or https://cos.io/ for information about its supporting organization, the Center for Open Science.
-
- Questions? Email ${osf_contact_email}
-
-
- <%!
- from website import settings
- %>
- Hello ${user.fullname},
-
- ${referrer_name + ' has added you' if referrer_name else 'You have been added'} to the group "${group_name}" on OSF. To set a password for your account, visit:
-
- ${claim_url}
-
- Once you have set a password, you will be able to create your own groups and projects.
-
- If you are not ${user.fullname} or you are erroneously being associated with "${group_name}," please email ${osf_contact_email} with the subject line "Claiming Error" to report the problem.
-
- Sincerely,
-
- The OSF Team
-
-
- The Quick Files feature has been discontinued and your files have been migrated into an OSF Project. You can find the new Project on your My Projects page, entitled "${user.fullname}'s Quick Files". Your favorite Quick Files features are still present; you can view, download, and share your files from their new location. Your file URLs will also continue to resolve properly, and you can still move your files between Projects by linking your Projects. Contact ${settings.OSF_CONTACT_EMAIL} if you have any questions or concerns.
-
-
- Thank you for partnering with us as a stakeholder in open science and in the success of the infrastructure that helps make it possible.
-
-
- The Center for Open Science Team
-
-
- Sincerely,
- The OSF Team
-
-
- Want more information? Visit ${settings.DOMAIN} to learn about the OSF,
- or https://cos.io/ for information about its supporting organization,
- the Center for Open Science.
-
From 33e86b381d10ab47292c710d0c93ffa71412e949 Mon Sep 17 00:00:00 2001
From: ihorsokhanexoft
Date: Mon, 30 Jun 2025 18:14:04 +0300
Subject: [PATCH 035/336] upgrade django to 4.2.17 (#11173)
## Purpose
Because Django 4.2.15 has a vulnerability, it was upgraded to 4.2.17.
## Changes
Updated pyproject.toml and lock file
## Ticket
https://openscience.atlassian.net/browse/ENG-8176
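As a quick sanity check (not part of this change; it assumes the updated Poetry environment is active), the installed version can be confirmed with a one-liner:

```python
import django

# Expect "4.2.17" once the updated lock file has been installed.
print(django.get_version())
```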
---
poetry.lock | 11 ++++++-----
pyproject.toml | 2 +-
2 files changed, 7 insertions(+), 6 deletions(-)
diff --git a/poetry.lock b/poetry.lock
index 27061024317..28d99cf780e 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -1,4 +1,4 @@
-# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
+# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
[[package]]
name = "amqp"
@@ -926,13 +926,13 @@ files = [
[[package]]
name = "django"
-version = "4.2.15"
+version = "4.2.17"
description = "A high-level Python web framework that encourages rapid development and clean, pragmatic design."
optional = false
python-versions = ">=3.8"
files = [
- {file = "Django-4.2.15-py3-none-any.whl", hash = "sha256:61ee4a130efb8c451ef3467c67ca99fdce400fedd768634efc86a68c18d80d30"},
- {file = "Django-4.2.15.tar.gz", hash = "sha256:c77f926b81129493961e19c0e02188f8d07c112a1162df69bfab178ae447f94a"},
+ {file = "Django-4.2.17-py3-none-any.whl", hash = "sha256:3a93350214ba25f178d4045c0786c61573e7dbfa3c509b3551374f1e11ba8de0"},
+ {file = "Django-4.2.17.tar.gz", hash = "sha256:6b56d834cc94c8b21a8f4e775064896be3b4a4ca387f2612d4406a5927cd2fdc"},
]
[package.dependencies]
@@ -1113,6 +1113,7 @@ python-versions = "*"
files = [
{file = "django-sendgrid-v5-1.2.3.tar.gz", hash = "sha256:3887aafbb10d5b808efc2c1031dcd96fd357d542eb5affe38fef07cc0f3cfae9"},
{file = "django_sendgrid_v5-1.2.3-py2.py3-none-any.whl", hash = "sha256:2d2fa8a085d21c95e5f97fc60b61f199ccc57a27df8da90cd3f29a5702346dc6"},
+ {file = "django_sendgrid_v5-1.2.3-py3-none-any.whl", hash = "sha256:f6a44ee37c1c3cc7d683a43c55ead530417be1849a8a41bde02b158009559d9d"},
]
[package.dependencies]
@@ -4492,4 +4493,4 @@ testing = ["coverage (>=5.0.3)", "zope.event", "zope.testing"]
[metadata]
lock-version = "2.0"
python-versions = "^3.12"
-content-hash = "81b3fc071f1be070d1072d4cfe1a45c8c44815e803c4ba17cf6da85a6b7b3894"
+content-hash = "97027b7b20e0909d572fa21fc49a80fc5d67cfc61f40262928e239086a6c46cf"
diff --git a/pyproject.toml b/pyproject.toml
index 5e74a6defb3..dab80daf6da 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -56,7 +56,7 @@ requests-oauthlib = "1.3.1"
sentry-sdk = {version= "2.2.0", extras = ["django", "flask", "celery"]}
django-redis = "5.4.0"
# API requirements
-Django = "4.2.15"
+Django = "4.2.17"
djangorestframework = "3.15.1"
django-cors-headers = "4.3.1"
djangorestframework-bulk = "0.2.1"
From 6c23802d6579f0181bf956194a15b13ccf35d093 Mon Sep 17 00:00:00 2001
From: ihorsokhanexoft
Date: Mon, 30 Jun 2025 20:05:58 +0300
Subject: [PATCH 036/336] added retry to avoid race condition (#11179)
## Purpose
Users received an archive-failure email even though archiving had actually succeeded. This can happen because `archive_success` runs asynchronously
https://github.com/CenterForOpenScience/osf.io/blob/2328dd60f55e9c1281dcb29dcd45a78a7fd2cc5f/website/archiver/listeners.py#L33-L49
and may finish before the main thread has completed the archiving process. In that case `archive_success` runs first, finds no files, and sends the failure email; the main thread then finishes processing the files, and the archive actually succeeds.
It is also possible that the Celery queue has too many tasks to process: the main thread finishes archiving and the user already sees their registration, but if the `archive_success` task fails when Celery finally processes it, the user still receives the failure email.
## Changes
Added a one-time retry.
## Ticket
https://openscience.atlassian.net/browse/ENG-8175?atlOrigin=eyJpIjoiMjg4MWM1YWI1ZTE3NDMyZmEyODk2Y2QxZjlhNjFlOGQiLCJwIjoiaiJ9
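For reference, a minimal sketch of the bound-task retry pattern applied below; the app instance, the exception class, and `migrate_file_metadata` here are simplified stand-ins for the real archiver code:

```python
from celery import Celery

celery_app = Celery('archiver_sketch')  # stand-in for the project's configured app

class ArchivedFileNotFound(Exception):
    """Simplified stand-in for the archiver's missing-file error."""

def migrate_file_metadata(dst_pk):
    """Stand-in for utils.migrate_file_metadata; may raise while files are still being written."""
    raise ArchivedFileNotFound(dst_pk)

@celery_app.task(bind=True, max_retries=1, default_retry_delay=60 * 5, acks_late=True)
def archive_success(self, dst_pk, job_pk):
    try:
        migrate_file_metadata(dst_pk)
    except ArchivedFileNotFound as err:
        # Re-enqueue the task once, five minutes later. If the main thread has
        # finished writing the archived files by then, the retry succeeds and no
        # spurious failure email is sent.
        raise self.retry(exc=err)
```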
---
website/archiver/tasks.py | 18 ++++++++++++++----
1 file changed, 14 insertions(+), 4 deletions(-)
diff --git a/website/archiver/tasks.py b/website/archiver/tasks.py
index f8c3b18feb1..42e5bfb568b 100644
--- a/website/archiver/tasks.py
+++ b/website/archiver/tasks.py
@@ -118,7 +118,7 @@ def on_failure(self, exc, task_id, args, kwargs, einfo):
dst.save()
sentry.log_message(
- 'An error occured while archiving node',
+ f'An error occured while archiving node: {src._id} and registration: {dst._id}',
extra_data={
'source node guid': src._id,
'registration node guid': dst._id,
@@ -325,9 +325,9 @@ def archive(job_pk):
)
-@celery_app.task(base=ArchiverTask, ignore_result=False)
+@celery_app.task(bind=True, base=ArchiverTask, ignore_result=False, max_retries=1, default_retry_delay=60 * 5, acks_late=True)
@logged('archive_success')
-def archive_success(dst_pk, job_pk):
+def archive_success(self, dst_pk, job_pk):
"""Archiver's final callback. For the time being the use case for this task
is to rewrite references to files selected in a registration schema (the Prereg
Challenge being the first to expose this feature). The created references point
@@ -352,7 +352,17 @@ def archive_success(dst_pk, job_pk):
# Update file references in the Registration's responses to point to the archived
# file on the Registration instead of the "live" version on the backing project
- utils.migrate_file_metadata(dst)
+ try:
+ utils.migrate_file_metadata(dst)
+ except ArchivedFileNotFound as err:
+ sentry.log_message(
+ f'Some files were not found while archiving the node {dst_pk}',
+ extra_data={
+ 'missing_files': err.missing_files,
+ },
+ )
+ self.retry(exc=err)
+
job = ArchiveJob.load(job_pk)
if not job.sent:
job.sent = True
From ffaef7abe724e7a42f8ecb6001f7c4a523da4138 Mon Sep 17 00:00:00 2001
From: antkryt
Date: Mon, 30 Jun 2025 20:11:16 +0300
Subject: [PATCH 037/336] [ENG-8096] Admins on projects are unable to reject
user access requests (#11163)
## Purpose
Add a default value when rejecting an access request
## Ticket
https://openscience.atlassian.net/browse/ENG-8096
---
website/static/js/accessRequestManager.js | 2 ++
website/templates/project/contributors.mako | 5 ++++-
2 files changed, 6 insertions(+), 1 deletion(-)
diff --git a/website/static/js/accessRequestManager.js b/website/static/js/accessRequestManager.js
index 96446cd3a34..bed7b65696f 100644
--- a/website/static/js/accessRequestManager.js
+++ b/website/static/js/accessRequestManager.js
@@ -62,6 +62,8 @@ var AccessRequestModel = function(accessRequest, pageOwner, isRegistration, isPa
self.respondToAccessRequest = function(trigger, data, event) {
$osf.trackClick('button', 'click', trigger + '-project-access');
$osf.block();
+ data = data || {};
+
var requestUrl = $osf.apiV2Url('actions/requests/nodes/');
var payload = self.requestAccessPayload(trigger, data.permissions, data.visible);
var request = $osf.ajaxJSON(
diff --git a/website/templates/project/contributors.mako b/website/templates/project/contributors.mako
index d58a04f7bd7..1736d73cf53 100644
--- a/website/templates/project/contributors.mako
+++ b/website/templates/project/contributors.mako
@@ -433,7 +433,10 @@
visible: visible()
})}"
> Add
-
+
\ No newline at end of file
diff --git a/admin/templates/nodes/node.html b/admin/templates/nodes/node.html
index 33b5731e32c..dba2973116b 100644
--- a/admin/templates/nodes/node.html
+++ b/admin/templates/nodes/node.html
@@ -27,7 +27,7 @@
-
From 689aa782ba5b7e418821fb7a223fb7fafdfbb3d6 Mon Sep 17 00:00:00 2001
From: antkryt
Date: Mon, 30 Jun 2025 20:15:17 +0300
Subject: [PATCH 039/336] don't add multiple group perms for preprint provider
(#11159)
## Purpose
don't add multiple group perms for preprint provider
## Changes
- check if user already belongs to some provider group
- change checkbox to radio select
## Ticket
https://openscience.atlassian.net/browse/ENG-8016
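A minimal sketch of the single-choice field swap described in the changes above (generic Django; the form name and choices are placeholders, and a configured Django settings module is assumed):

```python
from django import forms

PROVIDER_GROUP_CHOICES = [('moderator', 'Moderator'), ('admin', 'Admin')]  # illustrative only

class ProviderPermissionForm(forms.Form):  # hypothetical form name
    # ChoiceField + RadioSelect accepts at most one group, unlike the previous
    # MultipleChoiceField + CheckboxSelectMultiple pairing, so a user can no
    # longer be granted several provider groups in one submission.
    group_perms = forms.ChoiceField(
        choices=PROVIDER_GROUP_CHOICES,
        required=False,
        widget=forms.RadioSelect,
    )
```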
---
admin/preprint_providers/forms.py | 4 ++--
admin/preprint_providers/views.py | 8 ++++++--
admin/providers/views.py | 3 +--
3 files changed, 9 insertions(+), 6 deletions(-)
diff --git a/admin/preprint_providers/forms.py b/admin/preprint_providers/forms.py
index 1393aae41ef..cb7a0f1b1e9 100644
--- a/admin/preprint_providers/forms.py
+++ b/admin/preprint_providers/forms.py
@@ -116,10 +116,10 @@ def __init__(self, *args, provider_groups=None, **kwargs):
super().__init__(*args, **kwargs)
provider_groups = provider_groups or Group.objects.none()
- self.fields['group_perms'] = forms.ModelMultipleChoiceField(
+ self.fields['group_perms'] = forms.ModelChoiceField(
queryset=provider_groups,
required=False,
- widget=forms.CheckboxSelectMultiple
+ widget=forms.RadioSelect
)
user_id = forms.CharField(required=True, max_length=5, min_length=5)
diff --git a/admin/preprint_providers/views.py b/admin/preprint_providers/views.py
index 4c7439f4554..d841981fe84 100644
--- a/admin/preprint_providers/views.py
+++ b/admin/preprint_providers/views.py
@@ -481,10 +481,14 @@ def form_valid(self, form):
if not osf_user:
raise Http404(f'OSF user with id "{user_id}" not found. Please double check.')
- for group in form.cleaned_data.get('group_perms'):
- self.target_provider.add_to_group(osf_user, group)
+ if osf_user.has_groups(self.target_provider.group_names):
+ messages.error(self.request, f'User with guid: {user_id} is already a moderator or admin')
+ return super().form_invalid(form)
+ group = form.cleaned_data.get('group_perms')
+ self.target_provider.add_to_group(osf_user, group)
osf_user.save()
+
messages.success(self.request, f'Permissions update successful for OSF User {osf_user.username}!')
return super().form_valid(form)
diff --git a/admin/providers/views.py b/admin/providers/views.py
index e42d25bb5c9..d21cd65a93b 100644
--- a/admin/providers/views.py
+++ b/admin/providers/views.py
@@ -29,8 +29,7 @@ def post(self, request, *args, **kwargs):
messages.error(request, f'User for guid: {data["add-moderators-form"][0]} could not be found')
return redirect(f'{self.url_namespace}:add_admin_or_moderator', provider_id=provider.id)
- groups = [provider.format_group(name) for name in provider.groups.keys()]
- if target_user.has_groups(groups):
+ if target_user.has_groups(provider.group_names):
messages.error(request, f'User with guid: {data["add-moderators-form"][0]} is already a moderator or admin')
return redirect(f'{self.url_namespace}:add_admin_or_moderator', provider_id=provider.id)
From a02120f639a3e3a20fd1531c64f0bdc1903b134f Mon Sep 17 00:00:00 2001
From: ihorsokhanexoft
Date: Mon, 30 Jun 2025 20:17:25 +0300
Subject: [PATCH 040/336] fixed children/parent fields in admin templates
(#11156)
## Purpose
Admin templates referenced nonexistent fields to display a node's children and parent (new fields were probably added at some point without the old references being updated). The templates used `parent` and `children` fields, but the endpoint that supplies a node's children and parent uses the `descendants` field (or `NodeRelation` via the `get_nodes` method) and the `parent_node` property, which is backed by the `_parents` field.
## Changes
Use correct fields
## Notes
Deletion of the child nodes that are now displayed is broken. Mark and I decided to create a separate ticket for that issue.
## Ticket
https://openscience.atlassian.net/browse/ENG-7969
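A minimal sketch of the corrected lookup, using the accessors named above (the view class here is a simplified stand-in for the real admin view; `get_nodes` and `parent_node` come from the description and the diff below):

```python
from django.views.generic import DetailView

class NodeView(DetailView):  # simplified stand-in for the admin node view
    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        node = self.get_object()
        context.update({
            'node': node,
            # Children are reachable through NodeRelation via get_nodes(),
            # not through a nonexistent `children` field.
            'children': node.get_nodes(is_node_link=False),
            # The parent is exposed by the `parent_node` property (backed by
            # `_parents`), not by a `parent` field.
            'parent': node.parent_node,
        })
        return context
```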
---
admin/nodes/views.py | 1 +
admin/templates/nodes/node.html | 6 +++---
2 files changed, 4 insertions(+), 3 deletions(-)
diff --git a/admin/nodes/views.py b/admin/nodes/views.py
index 40cf261945d..c2bc48774bf 100644
--- a/admin/nodes/views.py
+++ b/admin/nodes/views.py
@@ -107,6 +107,7 @@ def get_context_data(self, **kwargs):
'SPAM_STATUS': SpamStatus,
'STORAGE_LIMITS': settings.StorageLimits,
'node': node,
+ 'children': node.get_nodes(is_node_link=False),
'duplicates': detailed_duplicates
})
diff --git a/admin/templates/nodes/node.html b/admin/templates/nodes/node.html
index dba2973116b..9cba3a4255c 100644
--- a/admin/templates/nodes/node.html
+++ b/admin/templates/nodes/node.html
@@ -85,10 +85,10 @@
- Thank you for connecting to OSF through your institution. We have found two OSF accounts associated with your institutional identity: <${user.username}>(${user._id}) and <${duplicate_user.username}>(${duplicate_user._id}). We have made <${user.username}> the account primarily associated with your institution.
+ Thank you for connecting to OSF through your institution. We have found two OSF accounts associated with your institutional identity: <${user_username}>(${user__id}) and <${duplicate_user_username}>(${duplicate_user__id}). We have made <${user_username}> the account primarily associated with your institution.
- If <${duplicate_user.username}> is also your account, we would encourage you to merge it into your primary account. Instructions for merging your accounts can be found at: Merge Your Accounts. This action will move all projects and components associated with <${duplicate_user.username}> into the <${user.username}> account.
+ If <${duplicate_user_username}> is also your account, we would encourage you to merge it into your primary account. Instructions for merging your accounts can be found at: Merge Your Accounts. This action will move all projects and components associated with <${duplicate_user_username}> into the <${user_username}> account.
- If you want to keep <${duplicate_user.username}> separate from <${user.username}> you will need to log into that account with your email and OSF password instead of the institutional authentication.
+ If you want to keep <${duplicate_user_username}> separate from <${user_username}> you will need to log into that account with your email and OSF password instead of the institutional authentication.
If you have any issues, questions or need our help, contact ${osf_support_email} and we will be happy to assist.
% if is_moderated:
diff --git a/website/templates/emails/pending_embargo_termination_admin.html.mako b/website/templates/emails/pending_embargo_termination_admin.html.mako
index cfe2642d521..b8660112f4e 100644
--- a/website/templates/emails/pending_embargo_termination_admin.html.mako
+++ b/website/templates/emails/pending_embargo_termination_admin.html.mako
@@ -8,10 +8,10 @@
% if is_initiator:
You have requested final approvals to end the embargo for your registration
- titled ${reviewable.title}
+ titled ${reviewable_title}
% else:
${initiated_by} has requested final approvals to end the embargo for your registration
- titled ${reviewable.title}
+ titled ${reviewable_title}
% endif
If all admin contributors approve, the registration will be made public as part of the
diff --git a/website/templates/emails/pending_registration_admin.html.mako b/website/templates/emails/pending_registration_admin.html.mako
index bbc1e7821f9..c0c669b4755 100644
--- a/website/templates/emails/pending_registration_admin.html.mako
+++ b/website/templates/emails/pending_registration_admin.html.mako
@@ -8,10 +8,10 @@
- The ${document_type} ${reviewable.title} has been successfully
+ The ${document_type} ${reviewable_title} has been successfully
resubmitted to ${reviewable.provider.name}.
@@ -20,7 +20,7 @@
for this ${document_type}.
- If you have been erroneously associated with "${reviewable.title}", then you may visit the ${document_type}'s
+ If you have been erroneously associated with "${reviewable_title}", then you may visit the ${document_type}'s
"Edit" page and remove yourself as a contributor.
- Your ${document_type} ${reviewable.title} has been successfully submitted to ${reviewable.provider.name}.
+ Your ${document_type} ${reviewable_title} has been successfully submitted to ${reviewable_provider_name}.
- ${reviewable.provider.name} has chosen to moderate their submissions using a pre-moderation workflow, which means your submission is pending until accepted by a moderator.
+ ${reviewable_provider_name} has chosen to moderate their submissions using a pre-moderation workflow, which means your submission is pending until accepted by a moderator.
You will receive a separate notification informing you of any status changes.
Sincerely,
- The ${reviewable.provider.name} and OSF teams.
+ The ${reviewable_provider_name} and OSF teams.
% else:
@@ -30,23 +28,23 @@
% if is_creator:
Your ${document_type}
- ${reviewable.title}
- has been successfully submitted to ${reviewable.provider.name}.
+ ${reviewable_title}
+ has been successfully submitted to ${reviewable_provider_name}.
% else:
${referrer.fullname} has added you as a contributor to the
${document_type}
- ${reviewable.title}
- on ${reviewable.provider.name}, which is hosted on the OSF.
+ ${reviewable_title}
+ on ${reviewable_provider_name}, which is hosted on the OSF.
% endif
% if workflow == 'pre-moderation':
- ${reviewable.provider.name} has chosen to moderate their submissions using a pre-moderation workflow,
+ ${reviewable_provider_name} has chosen to moderate their submissions using a pre-moderation workflow,
which means your submission is pending until accepted by a moderator.
% elif workflow == 'post-moderation':
- ${reviewable.provider.name} has chosen to moderate their submissions using a
+ ${reviewable_provider_name} has chosen to moderate their submissions using a
post-moderation workflow, which means your submission is public and discoverable,
while still pending acceptance by a moderator.
% else:
@@ -94,11 +92,11 @@
% if not is_creator:
- If you have been erroneously associated with "${reviewable.title}," then you may visit the ${document_type}
+ If you have been erroneously associated with "${reviewable_title}," then you may visit the ${document_type}
and remove yourself as a contributor.
% if document_type == 'registration':
% if is_rejected:
- Your submission ${reviewable.title}, submitted to ${reviewable.provider.name},
+ Your submission ${reviewable_title}, submitted to ${reviewable.provider.name},
has not been accepted. Your registration was returned as a draft so you can make the appropriate edits for resubmission.
- Click here to view your draft.
+ Click here to view your draft.
% else:
- Your submission ${reviewable.title}, submitted to ${reviewable.provider.name}, has been accepted by the moderator.
+ Your submission ${reviewable_title}, submitted to ${reviewable.provider.name}, has been accepted by the moderator.
% endif
% if notify_comment:
@@ -18,7 +18,7 @@
% endif
% else:
% if workflow == 'pre-moderation':
- Your submission ${reviewable.title}, submitted to ${reviewable.provider.name} has
+ Your submission ${reviewable_title}, submitted to ${reviewable.provider.name} has
% if is_rejected:
not been accepted. Contributors with admin permissions may edit the ${document_type} and
resubmit, at which time it will return to a pending state and be reviewed by a moderator.
@@ -26,7 +26,7 @@
been accepted by the moderator and is now discoverable to others.
% endif
% elif workflow == 'post-moderation':
- Your submission ${reviewable.title}, submitted to ${reviewable.provider.name} has
+ Your submission ${reviewable_title}, submitted to ${reviewable.provider.name} has
% if is_rejected:
not been accepted and will be made private and not discoverable by others.
Contributors with admin permissions may edit the ${document_type} and contact
@@ -93,7 +93,7 @@
% endif
% if not is_creator:
- If you have been erroneously associated with "${reviewable.title}," then you
+ If you have been erroneously associated with "${reviewable_title}," then you
may visit the project's "Contributors" page and remove yourself as a contributor.
- Your ${document_type} "${reviewable.title}" has an updated comment by the moderator:
+ Your ${document_type} "${reviewable_title}" has an updated comment by the moderator:
${comment}
@@ -12,7 +12,7 @@
email notification preferences, visit your user settings.
- If you have been erroneously associated with "${reviewable.title}", then you may visit the project's
+ If you have been erroneously associated with "${reviewable_title}", then you may visit the project's
"Contributors" page and remove yourself as a contributor.
- Your request to withdraw your registration "${reviewable.title}" from ${reviewable.provider.name} has been declined by the service moderators. The registration is still publicly available on ${reviewable.provider.name}.
+ Your request to withdraw your registration "${reviewable_title}" from ${reviewable.provider.name} has been declined by the service moderators. The registration is still publicly available on ${reviewable.provider.name}.
% if notify_comment:
The moderator has provided the following comment:
@@ -18,7 +18,7 @@
% else:
Dear ${requester_fullname},
- Your request to withdraw your ${document_type} "${reviewable.title}" from ${reviewable.provider.name} has been declined by the service moderators. Login and visit your ${document_type} to view their feedback. The ${document_type} is still publicly available on ${reviewable.provider.name}.
+ Your request to withdraw your ${document_type} "${reviewable_title}" from ${reviewable.provider.name} has been declined by the service moderators. Login and visit your ${document_type} to view their feedback. The ${document_type} is still publicly available on ${reviewable.provider.name}.
% endif
% if document_type == 'registration':
% if force_withdrawal:
- A moderator has withdrawn your ${document_type} "${reviewable.title}" from ${reviewable.provider.name}.
+ A moderator has withdrawn your ${document_type} "${reviewable_title}" from ${reviewable.provider.name}.
% else:
- Your request to withdraw your ${document_type} "${reviewable.title}" has been approved by ${reviewable.provider.name} moderators.
+ Your request to withdraw your ${document_type} "${reviewable_title}" has been approved by ${reviewable.provider.name} moderators.
% endif
% if notify_comment:
@@ -24,12 +24,12 @@
% else:
% if not ever_public:
% if is_requester:
- You have withdrawn your ${document_type} "${reviewable.title}" from ${reviewable.provider.name}.
+ You have withdrawn your ${document_type} "${reviewable_title}" from ${reviewable.provider.name}.
The ${document_type} has been removed from ${reviewable.provider.name}.
% else:
- ${requester_fullname} has withdrawn your ${document_type} "${reviewable.title}" from ${reviewable.provider.name}.
+ ${requester_fullname} has withdrawn your ${document_type} "${reviewable_title}" from ${reviewable.provider.name}.
% if reviewable.withdrawal_justification:
${requester_fullname} provided the following justification: "${reviewable.withdrawal_justification}"
% endif
@@ -39,12 +39,12 @@
% endif
% else:
% if is_requester:
- Your request to withdraw your ${document_type} "${reviewable.title}" from ${reviewable.provider.name} has been approved by the service moderators.
+ Your request to withdraw your ${document_type} "${reviewable_title}" from ${reviewable.provider.name} has been approved by the service moderators.
The ${document_type} has been removed from ${reviewable.provider.name}, but its metadata is still available: title of the withdrawn ${document_type}, its contributor list, abstract, tags, DOI, and reason for withdrawal (if provided).
% elif force_withdrawal:
- A moderator has withdrawn your ${document_type} "${reviewable.title}" from ${reviewable.provider.name}.
+ A moderator has withdrawn your ${document_type} "${reviewable_title}" from ${reviewable.provider.name}.
The ${document_type} has been removed from ${reviewable.provider.name}, but its metadata is still available: title of the withdrawn ${document_type}, its contributor list, abstract, tags, and DOI.
% if reviewable.withdrawal_justification:
@@ -53,7 +53,7 @@
% endif
% else:
- ${requester_fullname} has withdrawn your ${document_type} "${reviewable.title}" from ${reviewable.provider.name}.
+ ${requester_fullname} has withdrawn your ${document_type} "${reviewable_title}" from ${reviewable.provider.name}.
The ${document_type} has been removed from ${reviewable.provider.name}, but its metadata is still available: title of the withdrawn ${document_type}, its contributor list, abstract, tags, and DOI.
% if reviewable.withdrawal_justification:
From 7f6d5e96a77d5d57c4643579f54758a8480a7e21 Mon Sep 17 00:00:00 2001
From: ihorsokhanexoft
Date: Tue, 22 Jul 2025 21:07:39 +0300
Subject: [PATCH 101/336] [ENG-8401] Fixed preprint downloading (#11238)
* fixed preprint downloading
* fixed nonetype
---
addons/base/views.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/addons/base/views.py b/addons/base/views.py
index a6c90860b98..2c61fdda232 100644
--- a/addons/base/views.py
+++ b/addons/base/views.py
@@ -1007,7 +1007,8 @@ def persistent_file_download(auth, **kwargs):
if not file:
guid = Guid.load(id_or_guid)
if guid:
- file = guid.referent
+ referent = guid.referent
+ file = referent.primary_file if type(referent) is Preprint else referent
else:
raise HTTPError(http_status.HTTP_404_NOT_FOUND, data={
'message_short': 'File Not Found',
From 189854c80e0d87c41c66f8d85b8a3c9e4bbb740e Mon Sep 17 00:00:00 2001
From: ihorsokhanexoft
Date: Tue, 22 Jul 2025 21:10:04 +0300
Subject: [PATCH 102/336] [ENG-8216] Fixed children deletion on a node page in
admin (#11237)
* field children deletion in admin
* improved performance and removed unused attributes
---
admin/nodes/views.py | 12 +++++++-----
admin/templates/nodes/children.html | 25 ++++++++-----------------
2 files changed, 15 insertions(+), 22 deletions(-)
diff --git a/admin/nodes/views.py b/admin/nodes/views.py
index 71c3f60e965..74321c8f908 100644
--- a/admin/nodes/views.py
+++ b/admin/nodes/views.py
@@ -14,7 +14,6 @@
View,
FormView,
ListView,
- TemplateView,
)
from django.shortcuts import redirect, reverse, get_object_or_404
from django.urls import reverse_lazy
@@ -102,12 +101,16 @@ def get_context_data(self, **kwargs):
node = self.get_object()
detailed_duplicates = detect_duplicate_notifications(node_id=node.id)
-
+ children = node.get_nodes(is_node_link=False)
+ # Annotate guid because django templates prohibit accessing attributes that start with underscores
+ children = AbstractNode.objects.filter(
+ id__in=[child.id for child in children]
+ ).prefetch_related('guids').annotate(guid=F('guids___id'))
context.update({
'SPAM_STATUS': SpamStatus,
'STORAGE_LIMITS': settings.StorageLimits,
'node': node,
- 'children': node.get_nodes(is_node_link=False),
+ 'children': children,
'duplicates': detailed_duplicates
})
@@ -194,10 +197,9 @@ def add_contributor_removed_log(self, node, user):
).save()
-class NodeDeleteView(NodeMixin, TemplateView):
+class NodeDeleteView(NodeMixin, View):
""" Allows authorized users to mark nodes as deleted.
"""
- template_name = 'nodes/remove_node.html'
permission_required = ('osf.view_node', 'osf.delete_node')
raise_exception = True
diff --git a/admin/templates/nodes/children.html b/admin/templates/nodes/children.html
index 02f92a398ea..a24ba567c07 100644
--- a/admin/templates/nodes/children.html
+++ b/admin/templates/nodes/children.html
@@ -18,9 +18,7 @@
{% for child in children %}
Welcome to the Open Science Framework and the Election Research Preacceptance Competition. To continue, please verify your email address by visiting this link:
- You have been added by ${referrer.fullname}, as ${'an administrator' if is_admin else 'a moderator'} to ${provider.name}, powered by OSF. To set a password for your account, visit:
+ You have been added by ${referrer_fullname}, as ${'an administrator' if is_admin else 'a moderator'} to ${provider.name}, powered by OSF. To set a password for your account, visit:
${referrer_name + ' has added you' if referrer_name else 'You have been added'} as a contributor to the project "${node.title}" on the Open Science Framework: ${node.absolute_url}
You recently added ${fullname} to "${node.title}". ${fullname} wants to claim their account, but the email address they provided is different from the one you provided. To maintain security of your project, we are sending the account confirmation to you first.
@@ -20,7 +20,7 @@
Hello ${fullname},
- You have been added by ${referrer.fullname} as a contributor to the project "${node.title}" on the Open Science Framework. To set a password for your account, visit:
+ You have been added by ${referrer_fullname} as a contributor to the project "${node.title}" on the Open Science Framework. To set a password for your account, visit:
You recently added ${fullname} to "${node.title}". ${fullname} wants to claim their account, but the email address they provided is different from the one you provided. To maintain security of your project, we are sending the account confirmation to you first.
@@ -17,7 +17,7 @@
Hello ${fullname},
- You have been added by ${referrer.fullname} as a contributor to the project "${node.title}" on the Open Science Framework. To claim yourself as a contributor to the project, visit this url:
+ You have been added by ${referrer_fullname} as a contributor to the project "${node.title}" on the Open Science Framework. To claim yourself as a contributor to the project, visit this url:
- You have been added by ${referrer.fullname} as a contributor to the project "${node.title}" on the Open Science Framework. To set a password for your account, visit:
+ You have been added by ${referrer_fullname} as a contributor to the project "${node.title}" on the Open Science Framework. To set a password for your account, visit:
- ${referrer.fullname} has added you as a contributor on
+ ${referrer_fullname} has added you as a contributor on
% if not node.title or node.title == 'Untitled':
a new registration draft
% else:
diff --git a/website/templates/emails/invite_preprints.html.mako b/website/templates/emails/invite_preprints.html.mako
index 5a417a3c9b5..f389a6da918 100644
--- a/website/templates/emails/invite_preprints.html.mako
+++ b/website/templates/emails/invite_preprints.html.mako
@@ -8,7 +8,7 @@
%>
Hello ${fullname},
- You have been added by ${referrer.fullname} as a contributor to the ${branded_service.preprint_word} "${node.title}" on ${branded_service.name}, powered by the Open Science Framework. To set a password for your account, visit:
+ You have been added by ${referrer_fullname} as a contributor to the ${branded_service.preprint_word} "${node.title}" on ${branded_service.name}, powered by the Open Science Framework. To set a password for your account, visit:
- You have been added by ${referrer.fullname} as a contributor to the preprint "${node.title}" on the Open Science Framework. To set a password for your account, visit:
+ You have been added by ${referrer_fullname} as a contributor to the preprint "${node.title}" on the Open Science Framework. To set a password for your account, visit:
- You have been added by ${referrer.fullname} as ${'an administrator' if is_admin else 'a moderator'} to ${provider.name}, powered by OSF.
+ You have been added by ${referrer_fullname} as ${'an administrator' if is_admin else 'a moderator'} to ${provider.name}, powered by OSF.
You will automatically be subscribed to notification emails for new submissions to ${provider.name}.
diff --git a/website/templates/emails/pending_invite.html.mako b/website/templates/emails/pending_invite.html.mako
index 7d4e72017e5..7c2dcd91758 100644
--- a/website/templates/emails/pending_invite.html.mako
+++ b/website/templates/emails/pending_invite.html.mako
@@ -7,7 +7,7 @@
We received your request to claim an OSF account and become a contributor for "${node.title}".
- To confirm your identity, ${referrer.fullname} has been sent an email to forward to you with your confirmation link.
+ To confirm your identity, ${referrer_fullname} has been sent an email to forward to you with your confirmation link.
This link will allow you to complete your registration.
diff --git a/website/templates/emails/pending_registered.html.mako b/website/templates/emails/pending_registered.html.mako
index 36015a17d1a..4389500579b 100644
--- a/website/templates/emails/pending_registered.html.mako
+++ b/website/templates/emails/pending_registered.html.mako
@@ -7,7 +7,7 @@
We received your request to become a contributor for "${node.title}".
- To confirm your identity, ${referrer.fullname} has been sent an email to forward to you with your confirmation link.
+ To confirm your identity, ${referrer_fullname} has been sent an email to forward to you with your confirmation link.
This link will allow you to contribute to "${node.title}".
Your ${document_type} ${reviewable_title} has been successfully submitted to ${reviewable_provider_name}.
@@ -24,7 +24,7 @@
% else:
-
- Hello ${user.fullname},
+
+ Hello ${user_fullname},
% if is_creator:
Your ${document_type}
@@ -33,7 +33,7 @@
% else:
- ${referrer.fullname} has added you as a contributor to the
+ ${referrer_fullname} has added you as a contributor to the
${document_type}
${reviewable_title}
on ${reviewable_provider_name}, which is hosted on the OSF.
diff --git a/website/templates/emails/reviews_submission_status.html.mako b/website/templates/emails/reviews_submission_status.html.mako
index a4b6c039656..bf3f0ca2bf7 100644
--- a/website/templates/emails/reviews_submission_status.html.mako
+++ b/website/templates/emails/reviews_submission_status.html.mako
@@ -1,15 +1,15 @@
## -*- coding: utf-8 -*-
<% from website import settings %>
-
- Hello ${recipient.fullname},
+
+ Hello ${recipient_fullname},
% if document_type == 'registration':
% if is_rejected:
- Your submission ${reviewable_title}, submitted to ${reviewable.provider.name},
+ Your submission ${reviewable_title}, submitted to ${reviewable_provider_name},
has not been accepted. Your registration was returned as a draft so you can make the appropriate edits for resubmission.
Click here to view your draft.
% else:
- Your submission ${reviewable_title}, submitted to ${reviewable.provider.name}, has been accepted by the moderator.
+ Your submission ${reviewable_title}, submitted to ${reviewable_provider_name}, has been accepted by the moderator.
% endif
% if notify_comment:
@@ -18,7 +18,7 @@
% endif
% else:
% if workflow == 'pre-moderation':
- Your submission ${reviewable_title}, submitted to ${reviewable.provider.name} has
+ Your submission ${reviewable_title}, submitted to ${reviewable_provider_name} has
% if is_rejected:
not been accepted. Contributors with admin permissions may edit the ${document_type} and
resubmit, at which time it will return to a pending state and be reviewed by a moderator.
@@ -26,7 +26,7 @@
been accepted by the moderator and is now discoverable to others.
% endif
% elif workflow == 'post-moderation':
- Your submission ${reviewable_title}, submitted to ${reviewable.provider.name} has
+ Your submission ${reviewable_title}, submitted to ${reviewable_provider_name} has
% if is_rejected:
not been accepted and will be made private and not discoverable by others.
Contributors with admin permissions may edit the ${document_type} and contact
@@ -66,17 +66,17 @@
Sincerely,
- The ${reviewable.provider.name} and OSF teams
+ The ${reviewable_provider_name} and OSF teams
diff --git a/website/templates/emails/storage_cap_exceeded_announcement.html.mako b/website/templates/emails/storage_cap_exceeded_announcement.html.mako
index fe007e896da..5360012f11f 100644
--- a/website/templates/emails/storage_cap_exceeded_announcement.html.mako
+++ b/website/templates/emails/storage_cap_exceeded_announcement.html.mako
@@ -6,7 +6,7 @@
<%!
from website import settings
%>
- Hi ${user.given_name or user.fullname},
+ Hi ${user_fullname},
Thank you for storing your research materials on OSF. We have updated the OSF Storage capacity to 5 GB for private content and 50 GB for public content. None of your current files stored on OSF Storage will be affected, but after November 3, 2020 projects exceeding capacity will no longer accept new file uploads.
diff --git a/website/templates/emails/support_request.html.mako b/website/templates/emails/support_request.html.mako
index 5d2ad1794f6..e16bca8f346 100644
--- a/website/templates/emails/support_request.html.mako
+++ b/website/templates/emails/support_request.html.mako
@@ -3,11 +3,11 @@
<%def name="content()">
%def>
From ed5342c280d1652b8e69b9c537fe692ae66257ed Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 22 Jul 2025 15:20:59 -0400
Subject: [PATCH 104/336] add more provider notificationtypes
---
api/providers/serializers.py | 24 ++++-----
.../test_preprint_provider_moderator_list.py | 49 ++++++++++++-------
notifications.yaml | 10 +++-
3 files changed, 51 insertions(+), 32 deletions(-)
diff --git a/api/providers/serializers.py b/api/providers/serializers.py
index ef89388e281..b10f8290bd8 100644
--- a/api/providers/serializers.py
+++ b/api/providers/serializers.py
@@ -10,11 +10,10 @@
from api.preprints.serializers import PreprintProviderRelationshipField
from api.providers.workflows import Workflows
from api.base.metrics import MetricsSerializerMixin
-from osf.models import CitationStyle
+from osf.models import CitationStyle, NotificationType
from osf.models.user import Email, OSFUser
from osf.models.validators import validate_email
from osf.utils.permissions import REVIEW_GROUPS, ADMIN
-from website import mails
from website.settings import DOMAIN
@@ -313,12 +312,11 @@ def create(self, validated_data):
address = validated_data.pop('email', '')
provider = self.context['provider']
context = {
- 'referrer': auth.user,
+ 'referrer_fullname': auth.user.fullname,
}
if user_id and address:
raise ValidationError('Cannot specify both "id" and "email".')
- user = None
if user_id:
user = OSFUser.load(user_id)
elif address:
@@ -344,15 +342,15 @@ def create(self, validated_data):
if not user:
raise ValidationError('Unable to find specified user.')
- context['user'] = user
- context['provider'] = provider
+ context['user_fullname'] = user.fullname
+ context['provider_name'] = provider.name
if bool(get_perms(user, provider)):
raise ValidationError('Specified user is already a moderator.')
if 'claim_url' in context:
- template = mails.CONFIRM_EMAIL_MODERATION(provider)
+ template = NotificationType.Type.PROVIDER_CONFIRM_EMAIL_MODERATION
else:
- template = mails.MODERATOR_ADDED(provider)
+ template = NotificationType.Type.PROVIDER_MODERATOR_ADDED
perm_group = validated_data.pop('permission_group', '')
if perm_group not in REVIEW_GROUPS:
@@ -364,10 +362,12 @@ def create(self, validated_data):
provider.add_to_group(user, perm_group)
setattr(user, 'permission_group', perm_group) # Allows reserialization
- mails.send_mail(
- user.username,
- template,
- **context,
+ print(template, context)
+ NotificationType.objects.get(
+ name=template,
+ ).emit(
+ user=user,
+ event_context=context,
)
return user
diff --git a/api_tests/providers/preprints/views/test_preprint_provider_moderator_list.py b/api_tests/providers/preprints/views/test_preprint_provider_moderator_list.py
index 8998d2a85ca..ac075faddeb 100644
--- a/api_tests/providers/preprints/views/test_preprint_provider_moderator_list.py
+++ b/api_tests/providers/preprints/views/test_preprint_provider_moderator_list.py
@@ -1,11 +1,13 @@
import pytest
from api.base.settings.defaults import API_BASE
+from osf.models import NotificationType
from osf_tests.factories import (
AuthUserFactory,
PreprintProviderFactory,
)
from osf.utils import permissions
+from tests.utils import capture_notifications
@pytest.mark.usefixtures('mock_send_grid')
@@ -81,51 +83,60 @@ def test_list_post_unauthorized(self, mock_send_grid, app, url, nonmoderator, mo
assert mock_send_grid.call_count == 0
- def test_list_post_admin_success_existing_user(self, mock_send_grid, app, url, nonmoderator, moderator, admin, provider):
+ def test_list_post_admin_success_existing_user(self, app, url, nonmoderator, moderator, admin):
payload = self.create_payload(user_id=nonmoderator._id, permission_group='moderator')
- res = app.post_json_api(url, payload, auth=admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth)
assert res.status_code == 201
assert res.json['data']['id'] == nonmoderator._id
assert res.json['data']['attributes']['permission_group'] == 'moderator'
- assert mock_send_grid.call_count == 1
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
- def test_list_post_admin_failure_existing_moderator(self, mock_send_grid, app, url, moderator, admin, provider):
+ def test_list_post_admin_failure_existing_moderator(self, mock_send_grid, app, url, moderator, admin):
payload = self.create_payload(user_id=moderator._id, permission_group='moderator')
- res = app.post_json_api(url, payload, auth=admin.auth, expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth, expect_errors=True)
+ assert not notifications
assert res.status_code == 400
- assert mock_send_grid.call_count == 0
- def test_list_post_admin_failure_unreg_moderator(self, mock_send_grid, app, url, moderator, nonmoderator, admin, provider):
+ def test_list_post_admin_failure_unreg_moderator(self, app, url, moderator, nonmoderator, admin):
unreg_user = {'full_name': 'Son Goku', 'email': 'goku@dragonball.org'}
# test_user_with_no_moderator_admin_permissions
payload = self.create_payload(permission_group='moderator', **unreg_user)
res = app.post_json_api(url, payload, auth=nonmoderator.auth, expect_errors=True)
assert res.status_code == 403
- assert mock_send_grid.call_count == 0
# test_user_with_moderator_admin_permissions
payload = self.create_payload(permission_group='moderator', **unreg_user)
- res = app.post_json_api(url, payload, auth=admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth)
assert res.status_code == 201
- assert mock_send_grid.call_count == 1
- assert mock_send_grid.call_args[1]['to_addr'] == unreg_user['email']
+ assert len(notifications) == 1
+ assert notifications[0]['kwargs']['user'].username == unreg_user['email']
- def test_list_post_admin_failure_invalid_group(self, mock_send_grid, app, url, nonmoderator, moderator, admin, provider):
+ def test_list_post_admin_failure_invalid_group(self, app, url, nonmoderator, moderator, admin):
payload = self.create_payload(user_id=nonmoderator._id, permission_group='citizen')
- res = app.post_json_api(url, payload, auth=admin.auth, expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth, expect_errors=True)
assert res.status_code == 400
- assert mock_send_grid.call_count == 0
-
- def test_list_post_admin_success_email(self, mock_send_grid, app, url, nonmoderator, moderator, admin, provider):
- payload = self.create_payload(email='somenewuser@gmail.com', full_name='Some User', permission_group='moderator')
- res = app.post_json_api(url, payload, auth=admin.auth)
+ assert not notifications
+
+ def test_list_post_admin_success_email(self, app, url, nonmoderator, moderator, admin):
+ payload = self.create_payload(
+ email='somenewuser@gmail.com',
+ full_name='Some User',
+ permission_group='moderator'
+ )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth)
+ assert len(notifications) == 1
assert res.status_code == 201
assert len(res.json['data']['id']) == 5
assert res.json['data']['attributes']['permission_group'] == 'moderator'
assert 'email' not in res.json['data']['attributes']
- assert mock_send_grid.call_count == 1
def test_list_moderators_alphabetically(self, app, url, admin, moderator, provider):
admin.fullname = 'Alice Alisdottir'
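The same assertion migration repeats throughout this series, so here is a compact before/after sketch of the test pattern (names as used in the tests above; capture_notifications comes from tests.utils):

    # Before: count calls on the mocked SendGrid sender
    #   assert mock_send_grid.call_count == 1
    #   assert mock_send_grid.call_args[1]['to_addr'] == unreg_user['email']

    # After: capture emitted notifications and assert on their type and kwargs
    with capture_notifications() as notifications:
        res = app.post_json_api(url, payload, auth=admin.auth)
    assert len(notifications) == 1
    assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
    assert notifications[0]['kwargs']['user'].username == unreg_user['email']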
diff --git a/notifications.yaml b/notifications.yaml
index 6bd704f69cc..0a18afdf681 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -132,8 +132,16 @@ notification_types:
template: 'website/templates/emails/contributor_added_preprints.html.mako'
- name: provider_reviews_submission_confirmation
__docs__: ...
- object_content_type_model_name: preprint
+ object_content_type_model_name: abstractprovider
template: 'website/templates/emails/reviews_submission_confirmation.html.mako'
+ - name: provider_confirm_email_moderation
+ __docs__: ...
+ object_content_type_model_name: abstractprovider
+ template: 'website/templates/emails/confirm_moderation.html.mako'
+ - name: provider_moderator_added
+ __docs__: ...
+ object_content_type_model_name: abstractprovider
+ template: 'website/templates/emails/moderator_added.html.mako'
#### NODE
- name: node_file_updated
From b3ca1b8f67746b986a570edd6329aacefb341bd7 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 22 Jul 2025 16:47:04 -0400
Subject: [PATCH 105/336] fix up user claim message notification tests
---
osf/models/notification.py | 6 +-
osf/models/notification_subscription.py | 2 +-
osf/models/notification_type.py | 12 +-
tests/test_adding_contributor_views.py | 460 +---------------------
tests/test_claim_views.py | 491 ++++++++++++++++++++++++
website/project/views/contributor.py | 6 +-
6 files changed, 511 insertions(+), 466 deletions(-)
create mode 100644 tests/test_claim_views.py
diff --git a/osf/models/notification.py b/osf/models/notification.py
index 5d339150111..4294eb797eb 100644
--- a/osf/models/notification.py
+++ b/osf/models/notification.py
@@ -18,13 +18,13 @@ class Notification(models.Model):
seen = models.DateTimeField(null=True, blank=True)
created = models.DateTimeField(auto_now_add=True)
- def send(self, protocol_type='email', recipient=None):
+ def send(self, protocol_type='email', destination_address=None):
if not settings.USE_EMAIL:
return
if not protocol_type == 'email':
raise NotImplementedError(f'Protocol type {protocol_type}. Email notifications are only implemented.')
- recipient_address = getattr(recipient, 'username', None) or self.subscription.user.username
+ recipient_address = destination_address or self.subscription.user.username
if protocol_type == 'email' and settings.DEV_MODE and settings.ENABLE_TEST_EMAIL:
email.send_email_over_smtp(
@@ -42,7 +42,7 @@ def send(self, protocol_type='email', recipient=None):
)
elif protocol_type == 'email':
email.send_email_with_send_grid(
- getattr(recipient, 'username', None) or self.subscription.user,
+ self.subscription.user,
self.subscription.notification_type,
self.event_context
)
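The hunk above lets Notification.send() target an explicit address, and the notification_type.py hunk later in this patch threads the same parameter through emit(). A sketch of the resulting call, assuming a USER_FORWARD_INVITE type row exists (that name appears in the tests added below); referrer stands in for whichever user owns the subscription and the address is illustrative:

    from osf.models import NotificationType

    NotificationType.objects.get(
        name=NotificationType.Type.USER_FORWARD_INVITE,
    ).emit(
        user=referrer,                              # subscription owner
        destination_address='invitee@example.com',  # overrides subscription.user.username
        event_context={'fullname': 'Invited User'},
    )
    # Per the send() change above, the email goes to destination_address when
    # provided, otherwise to subscription.user.username.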
diff --git a/osf/models/notification_subscription.py b/osf/models/notification_subscription.py
index a1c9467b50e..41b88ba9ea2 100644
--- a/osf/models/notification_subscription.py
+++ b/osf/models/notification_subscription.py
@@ -52,7 +52,7 @@ class Meta:
verbose_name = 'Notification Subscription'
verbose_name_plural = 'Notification Subscriptions'
- def emit(self, user, subscribed_object=None, event_context=None):
+ def emit(self, event_context=None):
"""Emit a notification to a user by creating Notification and NotificationSubscription objects.
Args:
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 14bfa97eac2..19fee3e10e8 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -195,11 +195,19 @@ def desk_types(cls):
help_text='Template used to render the subject line of email. Supports Django template syntax.'
)
- def emit(self, user, subscribed_object=None, message_frequency='instantly', event_context=None):
+ def emit(
+ self,
+ user,
+ destination_address=None,
+ subscribed_object=None,
+ message_frequency='instantly',
+ event_context=None
+ ):
"""Emit a notification to a user by creating Notification and NotificationSubscription objects.
Args:
user (OSFUser): The recipient of the notification.
+ destination_address (optional): For use in cases where the user may be using an alternate email address.
subscribed_object (optional): The object the subscription is related to.
message_frequency (optional): Initializing message frequency.
event_context (dict, optional): Context for rendering the notification template.
@@ -216,7 +224,7 @@ def emit(self, user, subscribed_object=None, message_frequency='instantly', even
Notification.objects.create(
subscription=subscription,
event_context=event_context
- ).send()
+ ).send(destination_address=destination_address)
def add_user_to_subscription(self, user, *args, **kwargs):
"""
diff --git a/tests/test_adding_contributor_views.py b/tests/test_adding_contributor_views.py
index 003a8f886ad..6bbd70681b6 100644
--- a/tests/test_adding_contributor_views.py
+++ b/tests/test_adding_contributor_views.py
@@ -7,26 +7,18 @@
import pytest
from django.core.exceptions import ValidationError
-from flask import g
from pytest import approx
from rest_framework import status as http_status
from framework import auth
-from framework.auth import Auth, authenticate, cas
-from framework.auth.utils import impute_names_model
+from framework.auth import Auth
from framework.exceptions import HTTPError
-from framework.flask import redirect
-from osf.models import (
- OSFUser,
- Tag,
- NodeRelation,
-)
+from osf.models import NodeRelation
from osf.utils import permissions
from osf_tests.factories import (
fake_email,
AuthUserFactory,
NodeFactory,
- PreprintFactory,
ProjectFactory,
RegistrationProviderFactory,
UserFactory,
@@ -38,22 +30,16 @@
get_default_metaschema,
OsfTestCase,
)
-from tests.test_cas_authentication import generate_external_user_with_resp
-from website import settings
from website.profile.utils import add_contributor_json, serialize_unregistered
from website.project.signals import contributor_added
from website.project.views.contributor import (
deserialize_contributors,
notify_added_contributor,
send_claim_email,
- send_claim_registered_email,
)
-from website.util.metrics import OsfSourceTags, OsfClaimedTags, provider_source_tag, provider_claimed_tag
from conftest import start_mock_notification_send
@pytest.mark.enable_implicit_clean
-@mock.patch('website.mails.settings.USE_EMAIL', True)
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestAddingContributorViews(OsfTestCase):
def setUp(self):
@@ -433,8 +419,6 @@ def test_add_contribs_to_multiple_nodes(self):
assert child.contributors.count() == n_contributors_pre + len(payload['users'])
-@mock.patch('website.mails.settings.USE_EMAIL', True)
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestUserInviteViews(OsfTestCase):
def setUp(self):
@@ -443,8 +427,6 @@ def setUp(self):
self.project = ProjectFactory(creator=self.user)
self.invite_url = f'/api/v1/project/{self.project._primary_key}/invite_contributor/'
- self.mock_notification_send = start_mock_notification_send(self)
-
def test_invite_contributor_post_if_not_in_db(self):
name, email = fake.name(), fake_email()
res = self.app.post(
@@ -555,441 +537,3 @@ def test_send_claim_email_before_throttle_expires(self):
assert not self.mock_notification_send.called
-@pytest.mark.enable_implicit_clean
-@mock.patch('website.mails.settings.USE_EMAIL', True)
-@mock.patch('website.mails.settings.USE_CELERY', False)
-class TestClaimViews(OsfTestCase):
-
- def setUp(self):
- super().setUp()
- self.referrer = AuthUserFactory()
- self.project = ProjectFactory(creator=self.referrer, is_public=True)
- self.project_with_source_tag = ProjectFactory(creator=self.referrer, is_public=True)
- self.preprint_with_source_tag = PreprintFactory(creator=self.referrer, is_public=True)
- osf_source_tag, created = Tag.all_tags.get_or_create(name=OsfSourceTags.Osf.value, system=True)
- preprint_source_tag, created = Tag.all_tags.get_or_create(name=provider_source_tag(self.preprint_with_source_tag.provider._id, 'preprint'), system=True)
- self.project_with_source_tag.add_system_tag(osf_source_tag.name)
- self.preprint_with_source_tag.add_system_tag(preprint_source_tag.name)
- self.given_name = fake.name()
- self.given_email = fake_email()
- self.project_with_source_tag.add_unregistered_contributor(
- fullname=self.given_name,
- email=self.given_email,
- auth=Auth(user=self.referrer)
- )
- self.preprint_with_source_tag.add_unregistered_contributor(
- fullname=self.given_name,
- email=self.given_email,
- auth=Auth(user=self.referrer)
- )
- self.user = self.project.add_unregistered_contributor(
- fullname=self.given_name,
- email=self.given_email,
- auth=Auth(user=self.referrer)
- )
- self.project.save()
-
- self.mock_notification_send = start_mock_notification_send(self)
-
- @mock.patch('website.project.views.contributor.send_claim_email')
- def test_claim_user_already_registered_redirects_to_claim_user_registered(self, claim_email):
- name = fake.name()
- email = fake_email()
-
- # project contributor adds an unregistered contributor (without an email) on public project
- unregistered_user = self.project.add_unregistered_contributor(
- fullname=name,
- email=None,
- auth=Auth(user=self.referrer)
- )
- assert unregistered_user in self.project.contributors
-
- # unregistered user comes along and claims themselves on the public project, entering an email
- invite_url = self.project.api_url_for('claim_user_post', uid='undefined')
- self.app.post(invite_url, json={
- 'pk': unregistered_user._primary_key,
- 'value': email
- })
- assert claim_email.call_count == 1
-
- # set unregistered record email since we are mocking send_claim_email()
- unclaimed_record = unregistered_user.get_unclaimed_record(self.project._primary_key)
- unclaimed_record.update({'email': email})
- unregistered_user.save()
-
- # unregistered user then goes and makes an account with same email, before claiming themselves as contributor
- UserFactory(username=email, fullname=name)
-
- # claim link for the now registered email is accessed while not logged in
- token = unregistered_user.get_unclaimed_record(self.project._primary_key)['token']
- claim_url = f'/user/{unregistered_user._id}/{self.project._id}/claim/?token={token}'
- res = self.app.get(claim_url)
-
- # should redirect to 'claim_user_registered' view
- claim_registered_url = f'/user/{unregistered_user._id}/{self.project._id}/claim/verify/{token}/'
- assert res.status_code == 302
- assert claim_registered_url in res.headers.get('Location')
-
- @mock.patch('website.project.views.contributor.send_claim_email')
- def test_claim_user_already_registered_secondary_email_redirects_to_claim_user_registered(self, claim_email):
- name = fake.name()
- email = fake_email()
- secondary_email = fake_email()
-
- # project contributor adds an unregistered contributor (without an email) on public project
- unregistered_user = self.project.add_unregistered_contributor(
- fullname=name,
- email=None,
- auth=Auth(user=self.referrer)
- )
- assert unregistered_user in self.project.contributors
-
- # unregistered user comes along and claims themselves on the public project, entering an email
- invite_url = self.project.api_url_for('claim_user_post', uid='undefined')
- self.app.post(invite_url, json={
- 'pk': unregistered_user._primary_key,
- 'value': secondary_email
- })
- assert claim_email.call_count == 1
-
- # set unregistered record email since we are mocking send_claim_email()
- unclaimed_record = unregistered_user.get_unclaimed_record(self.project._primary_key)
- unclaimed_record.update({'email': secondary_email})
- unregistered_user.save()
-
- # unregistered user then goes and makes an account with same email, before claiming themselves as contributor
- registered_user = UserFactory(username=email, fullname=name)
- registered_user.emails.create(address=secondary_email)
- registered_user.save()
-
- # claim link for the now registered email is accessed while not logged in
- token = unregistered_user.get_unclaimed_record(self.project._primary_key)['token']
- claim_url = f'/user/{unregistered_user._id}/{self.project._id}/claim/?token={token}'
- res = self.app.get(claim_url)
-
- # should redirect to 'claim_user_registered' view
- claim_registered_url = f'/user/{unregistered_user._id}/{self.project._id}/claim/verify/{token}/'
- assert res.status_code == 302
- assert claim_registered_url in res.headers.get('Location')
-
- def test_claim_user_invited_with_no_email_posts_to_claim_form(self):
- given_name = fake.name()
- invited_user = self.project.add_unregistered_contributor(
- fullname=given_name,
- email=None,
- auth=Auth(user=self.referrer)
- )
- self.project.save()
-
- url = invited_user.get_claim_url(self.project._primary_key)
- res = self.app.post(url, data={
- 'password': 'bohemianrhap',
- 'password2': 'bohemianrhap'
- })
- assert res.status_code == 400
-
- def test_claim_user_post_with_registered_user_id(self):
- # registered user who is attempting to claim the unclaimed contributor
- reg_user = UserFactory()
- payload = {
- # pk of unreg user record
- 'pk': self.user._primary_key,
- 'claimerId': reg_user._primary_key
- }
- url = f'/api/v1/user/{self.user._primary_key}/{self.project._primary_key}/claim/email/'
- res = self.app.post(url, json=payload)
-
- # mail was sent
- assert self.mock_notification_send.call_count == 2
- # ... to the correct address
- referrer_call = self.mock_notification_send.call_args_list[0]
- claimer_call = self.mock_notification_send.call_args_list[1]
-
- assert referrer_call[1]['to_addr'] == self.referrer.email
- assert claimer_call[1]['to_addr'] == reg_user.email
-
- # view returns the correct JSON
- assert res.json == {
- 'status': 'success',
- 'email': reg_user.username,
- 'fullname': self.given_name,
- }
-
- def test_send_claim_registered_email(self):
- reg_user = UserFactory()
- send_claim_registered_email(
- claimer=reg_user,
- unclaimed_user=self.user,
- node=self.project
- )
- assert self.mock_notification_send.call_count == 2
- first_call_args = self.mock_notification_send.call_args_list[0][1]
- print(first_call_args)
- second_call_args = self.mock_notification_send.call_args_list[1][1]
- print(second_call_args)
-
- assert second_call_args['to_addr'] == reg_user.email
-
- def test_send_claim_registered_email_before_throttle_expires(self):
- reg_user = UserFactory()
- send_claim_registered_email(
- claimer=reg_user,
- unclaimed_user=self.user,
- node=self.project,
- )
- self.mock_notification_send.reset_mock()
- # second call raises error because it was called before throttle period
- with pytest.raises(HTTPError):
- send_claim_registered_email(
- claimer=reg_user,
- unclaimed_user=self.user,
- node=self.project,
- )
- assert not self.mock_notification_send.called
-
- @mock.patch('website.project.views.contributor.send_claim_registered_email')
- def test_claim_user_post_with_email_already_registered_sends_correct_email(
- self, send_claim_registered_email):
- reg_user = UserFactory()
- payload = {
- 'value': reg_user.username,
- 'pk': self.user._primary_key
- }
- url = self.project.api_url_for('claim_user_post', uid=self.user._id)
- self.app.post(url, json=payload)
- assert send_claim_registered_email.called
-
- def test_user_with_removed_unclaimed_url_claiming(self):
- """ Tests that when an unclaimed user is removed from a project, the
- unregistered user object does not retain the token.
- """
- self.project.remove_contributor(self.user, Auth(user=self.referrer))
-
- assert self.project._primary_key not in self.user.unclaimed_records.keys()
-
- def test_user_with_claim_url_cannot_claim_twice(self):
- """ Tests that when an unclaimed user is replaced on a project with a
- claimed user, the unregistered user object does not retain the token.
- """
- reg_user = AuthUserFactory()
-
- self.project.replace_contributor(self.user, reg_user)
-
- assert self.project._primary_key not in self.user.unclaimed_records.keys()
-
- def test_claim_user_form_redirects_to_password_confirm_page_if_user_is_logged_in(self):
- reg_user = AuthUserFactory()
- url = self.user.get_claim_url(self.project._primary_key)
- res = self.app.get(url, auth=reg_user.auth)
- assert res.status_code == 302
- res = self.app.get(url, auth=reg_user.auth, follow_redirects=True)
- token = self.user.get_unclaimed_record(self.project._primary_key)['token']
- expected = self.project.web_url_for(
- 'claim_user_registered',
- uid=self.user._id,
- token=token,
- )
- assert res.request.path == expected
-
- @mock.patch('framework.auth.cas.make_response_from_ticket')
- def test_claim_user_when_user_is_registered_with_orcid(self, mock_response_from_ticket):
- # TODO: check in qa url encoding
- token = self.user.get_unclaimed_record(self.project._primary_key)['token']
- url = f'/user/{self.user._id}/{self.project._id}/claim/verify/{token}/'
- # logged out user gets redirected to cas login
- res1 = self.app.get(url)
- assert res1.status_code == 302
- res = self.app.resolve_redirect(self.app.get(url))
- service_url = f'http://localhost{url}'
- expected = cas.get_logout_url(service_url=cas.get_login_url(service_url=service_url))
- assert res1.location == expected
-
- # user logged in with orcid automatically becomes a contributor
- orcid_user, validated_credentials, cas_resp = generate_external_user_with_resp(url)
- mock_response_from_ticket.return_value = authenticate(
- orcid_user,
- redirect(url)
- )
- orcid_user.set_unusable_password()
- orcid_user.save()
-
- # The request to OSF with CAS service ticket must not have cookie and/or auth.
- service_ticket = fake.md5()
- url_with_service_ticket = f'{url}?ticket={service_ticket}'
- res = self.app.get(url_with_service_ticket)
- # The response of this request is expected to be a 302 with `Location`.
- # And the redirect URL must equal to the originial service URL
- assert res.status_code == 302
- redirect_url = res.headers['Location']
- assert redirect_url == url
- # The response of this request is expected have the `Set-Cookie` header with OSF cookie.
- # And the cookie must belong to the ORCiD user.
- raw_set_cookie = res.headers['Set-Cookie']
- assert raw_set_cookie
- simple_cookie = SimpleCookie()
- simple_cookie.load(raw_set_cookie)
- cookie_dict = {key: value.value for key, value in simple_cookie.items()}
- osf_cookie = cookie_dict.get(settings.COOKIE_NAME, None)
- assert osf_cookie is not None
- user = OSFUser.from_cookie(osf_cookie)
- assert user._id == orcid_user._id
- # The ORCiD user must be different from the unregistered user created when the contributor was added
- assert user._id != self.user._id
-
- # Must clear the Flask g context manual and set the OSF cookie to context
- g.current_session = None
- self.app.set_cookie(settings.COOKIE_NAME, osf_cookie)
- res = self.app.resolve_redirect(res)
- assert res.status_code == 302
- assert self.project.is_contributor(orcid_user)
- assert self.project.url in res.headers.get('Location')
-
- def test_get_valid_form(self):
- url = self.user.get_claim_url(self.project._primary_key)
- res = self.app.get(url, follow_redirects=True)
- assert res.status_code == 200
-
- def test_invalid_claim_form_raise_400(self):
- uid = self.user._primary_key
- pid = self.project._primary_key
- url = f'/user/{uid}/{pid}/claim/?token=badtoken'
- res = self.app.get(url, follow_redirects=True)
- assert res.status_code == 400
-
- @mock.patch('osf.models.OSFUser.update_search_nodes')
- def test_posting_to_claim_form_with_valid_data(self, mock_update_search_nodes):
- url = self.user.get_claim_url(self.project._primary_key)
- res = self.app.post(url, data={
- 'username': self.user.username,
- 'password': 'killerqueen',
- 'password2': 'killerqueen'
- })
-
- assert res.status_code == 302
- location = res.headers.get('Location')
- assert 'login?service=' in location
- assert 'username' in location
- assert 'verification_key' in location
- assert self.project._primary_key in location
-
- self.user.reload()
- assert self.user.is_registered
- assert self.user.is_active
- assert self.project._primary_key not in self.user.unclaimed_records
-
- @mock.patch('osf.models.OSFUser.update_search_nodes')
- def test_posting_to_claim_form_removes_all_unclaimed_data(self, mock_update_search_nodes):
- # user has multiple unclaimed records
- p2 = ProjectFactory(creator=self.referrer)
- self.user.add_unclaimed_record(p2, referrer=self.referrer,
- given_name=fake.name())
- self.user.save()
- assert len(self.user.unclaimed_records.keys()) > 1 # sanity check
- url = self.user.get_claim_url(self.project._primary_key)
- res = self.app.post(url, data={
- 'username': self.given_email,
- 'password': 'bohemianrhap',
- 'password2': 'bohemianrhap'
- })
- self.user.reload()
- assert self.user.unclaimed_records == {}
-
- @mock.patch('osf.models.OSFUser.update_search_nodes')
- def test_posting_to_claim_form_sets_fullname_to_given_name(self, mock_update_search_nodes):
- # User is created with a full name
- original_name = fake.name()
- unreg = UnregUserFactory(fullname=original_name)
- # User invited with a different name
- different_name = fake.name()
- new_user = self.project.add_unregistered_contributor(
- email=unreg.username,
- fullname=different_name,
- auth=Auth(self.project.creator),
- )
- self.project.save()
- # Goes to claim url
- claim_url = new_user.get_claim_url(self.project._id)
- self.app.post(claim_url, data={
- 'username': unreg.username,
- 'password': 'killerqueen',
- 'password2': 'killerqueen'
- })
- unreg.reload()
- # Full name was set correctly
- assert unreg.fullname == different_name
- # CSL names were set correctly
- parsed_name = impute_names_model(different_name)
- assert unreg.given_name == parsed_name['given_name']
- assert unreg.family_name == parsed_name['family_name']
-
- def test_claim_user_post_returns_fullname(self):
- url = f'/api/v1/user/{self.user._primary_key}/{self.project._primary_key}/claim/email/'
- res = self.app.post(
- url,
- auth=self.referrer.auth,
- json={
- 'value': self.given_email,
- 'pk': self.user._primary_key
- },
- )
- assert res.json['fullname'] == self.given_name
- assert self.mock_notification_send.called
-
- def test_claim_user_post_if_email_is_different_from_given_email(self):
- email = fake_email() # email that is different from the one the referrer gave
- url = f'/api/v1/user/{self.user._primary_key}/{self.project._primary_key}/claim/email/'
- self.app.post(url, json={'value': email, 'pk': self.user._primary_key} )
- assert self.mock_notification_send.called
- assert self.mock_notification_send.call_count == 2
- call_to_invited = self.mock_notification_send.mock_calls[0]
- call_to_invited.assert_called_with(to_addr=email)
- call_to_referrer = self.mock_notification_send.mock_calls[1]
- call_to_referrer.assert_called_with(to_addr=self.given_email)
-
- def test_claim_url_with_bad_token_returns_400(self):
- url = self.project.web_url_for(
- 'claim_user_registered',
- uid=self.user._id,
- token='badtoken',
- )
- res = self.app.get(url, auth=self.referrer.auth)
- assert res.status_code == 400
-
- def test_cannot_claim_user_with_user_who_is_already_contributor(self):
- # user who is already a contirbutor to the project
- contrib = AuthUserFactory()
- self.project.add_contributor(contrib, auth=Auth(self.project.creator))
- self.project.save()
- # Claiming user goes to claim url, but contrib is already logged in
- url = self.user.get_claim_url(self.project._primary_key)
- res = self.app.get(
- url,
- auth=contrib.auth, follow_redirects=True)
- # Response is a 400
- assert res.status_code == 400
-
- def test_claim_user_with_project_id_adds_corresponding_claimed_tag_to_user(self):
- assert OsfClaimedTags.Osf.value not in self.user.system_tags
- url = self.user.get_claim_url(self.project_with_source_tag._primary_key)
- res = self.app.post(url, data={
- 'username': self.user.username,
- 'password': 'killerqueen',
- 'password2': 'killerqueen'
- })
-
- assert res.status_code == 302
- self.user.reload()
- assert OsfClaimedTags.Osf.value in self.user.system_tags
-
- def test_claim_user_with_preprint_id_adds_corresponding_claimed_tag_to_user(self):
- assert provider_claimed_tag(self.preprint_with_source_tag.provider._id, 'preprint') not in self.user.system_tags
- url = self.user.get_claim_url(self.preprint_with_source_tag._primary_key)
- res = self.app.post(url, data={
- 'username': self.user.username,
- 'password': 'killerqueen',
- 'password2': 'killerqueen'
- })
-
- assert res.status_code == 302
- self.user.reload()
- assert provider_claimed_tag(self.preprint_with_source_tag.provider._id, 'preprint') in self.user.system_tags
diff --git a/tests/test_claim_views.py b/tests/test_claim_views.py
new file mode 100644
index 00000000000..025aa1a53eb
--- /dev/null
+++ b/tests/test_claim_views.py
@@ -0,0 +1,491 @@
+import pytest
+from flask import g
+
+from http.cookies import SimpleCookie
+from unittest import mock
+
+from framework.auth import Auth, authenticate, cas
+from framework.auth.utils import impute_names_model
+from framework.exceptions import HTTPError
+from framework.flask import redirect
+from osf.models import (
+ OSFUser,
+ Tag, NotificationType,
+)
+from osf_tests.factories import (
+ fake_email,
+ AuthUserFactory,
+ PreprintFactory,
+ ProjectFactory,
+ UserFactory,
+ UnregUserFactory,
+)
+from tests.base import (
+ fake,
+ OsfTestCase,
+)
+from tests.test_cas_authentication import generate_external_user_with_resp
+from tests.utils import capture_notifications
+from website import settings
+from website.project.views.contributor import send_claim_registered_email
+from website.util.metrics import (
+ OsfSourceTags,
+ OsfClaimedTags,
+ provider_source_tag,
+ provider_claimed_tag
+)
+
+
+@pytest.mark.enable_implicit_clean
+class TestClaimViews(OsfTestCase):
+
+ def setUp(self):
+ super().setUp()
+ self.referrer = AuthUserFactory()
+ self.project = ProjectFactory(creator=self.referrer, is_public=True)
+ self.project_with_source_tag = ProjectFactory(creator=self.referrer, is_public=True)
+ self.preprint_with_source_tag = PreprintFactory(creator=self.referrer, is_public=True)
+ osf_source_tag, created = Tag.all_tags.get_or_create(name=OsfSourceTags.Osf.value, system=True)
+ preprint_source_tag, created = Tag.all_tags.get_or_create(name=provider_source_tag(self.preprint_with_source_tag.provider._id, 'preprint'), system=True)
+ self.project_with_source_tag.add_system_tag(osf_source_tag.name)
+ self.preprint_with_source_tag.add_system_tag(preprint_source_tag.name)
+ self.given_name = fake.name()
+ self.given_email = fake_email()
+ self.project_with_source_tag.add_unregistered_contributor(
+ fullname=self.given_name,
+ email=self.given_email,
+ auth=Auth(user=self.referrer)
+ )
+ self.preprint_with_source_tag.add_unregistered_contributor(
+ fullname=self.given_name,
+ email=self.given_email,
+ auth=Auth(user=self.referrer)
+ )
+ self.user = self.project.add_unregistered_contributor(
+ fullname=self.given_name,
+ email=self.given_email,
+ auth=Auth(user=self.referrer)
+ )
+ self.project.save()
+
+ def test_claim_user_already_registered_redirects_to_claim_user_registered(self):
+ name = fake.name()
+ email = fake_email()
+
+ # project contributor adds an unregistered contributor (without an email) on public project
+ unregistered_user = self.project.add_unregistered_contributor(
+ fullname=name,
+ email=None,
+ auth=Auth(user=self.referrer)
+ )
+ assert unregistered_user in self.project.contributors
+
+ # unregistered user comes along and claims themselves on the public project, entering an email
+ invite_url = self.project.api_url_for(
+ 'claim_user_post',
+ uid='undefined'
+ )
+ with capture_notifications() as notifications:
+ self.app.post(
+ invite_url,
+ json={
+ 'pk': unregistered_user._primary_key,
+ 'value': email
+ }
+ )
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.USER_PENDING_VERIFICATION
+ assert notifications[1]['type'] == NotificationType.Type.USER_FORWARD_INVITE
+
+ # set unregistered record email since we are mocking send_claim_email()
+ unclaimed_record = unregistered_user.get_unclaimed_record(self.project._primary_key)
+ unclaimed_record.update({'email': email})
+ unregistered_user.save()
+
+ # unregistered user then goes and makes an account with same email, before claiming themselves as contributor
+ UserFactory(username=email, fullname=name)
+
+ # claim link for the now registered email is accessed while not logged in
+ token = unregistered_user.get_unclaimed_record(self.project._primary_key)['token']
+ claim_url = f'/user/{unregistered_user._id}/{self.project._id}/claim/?token={token}'
+ res = self.app.get(claim_url)
+
+ # should redirect to 'claim_user_registered' view
+ claim_registered_url = f'/user/{unregistered_user._id}/{self.project._id}/claim/verify/{token}/'
+ assert res.status_code == 302
+ assert claim_registered_url in res.headers.get('Location')
+
+ def test_claim_user_already_registered_secondary_email_redirects_to_claim_user_registered(self):
+ name = fake.name()
+ email = fake_email()
+ secondary_email = fake_email()
+
+ # project contributor adds an unregistered contributor (without an email) on public project
+ unregistered_user = self.project.add_unregistered_contributor(
+ fullname=name,
+ email=None,
+ auth=Auth(user=self.referrer)
+ )
+ assert unregistered_user in self.project.contributors
+
+ # unregistered user comes along and claims themselves on the public project, entering an email
+ invite_url = self.project.api_url_for(
+ 'claim_user_post',
+ uid='undefined'
+ )
+ with capture_notifications() as notifications:
+ self.app.post(
+ invite_url,
+ json={
+ 'pk': unregistered_user._primary_key,
+ 'value': secondary_email
+ }
+ )
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.USER_PENDING_VERIFICATION
+ assert notifications[1]['type'] == NotificationType.Type.USER_FORWARD_INVITE
+
+ # set unregistered record email since we are mocking send_claim_email()
+ unclaimed_record = unregistered_user.get_unclaimed_record(self.project._primary_key)
+ unclaimed_record.update({'email': secondary_email})
+ unregistered_user.save()
+
+ # unregistered user then goes and makes an account with same email, before claiming themselves as contributor
+ registered_user = UserFactory(username=email, fullname=name)
+ registered_user.emails.create(address=secondary_email)
+ registered_user.save()
+
+ # claim link for the now registered email is accessed while not logged in
+ token = unregistered_user.get_unclaimed_record(self.project._primary_key)['token']
+ claim_url = f'/user/{unregistered_user._id}/{self.project._id}/claim/?token={token}'
+ res = self.app.get(claim_url)
+
+ # should redirect to 'claim_user_registered' view
+ claim_registered_url = f'/user/{unregistered_user._id}/{self.project._id}/claim/verify/{token}/'
+ assert res.status_code == 302
+ assert claim_registered_url in res.headers.get('Location')
+
+ def test_claim_user_invited_with_no_email_posts_to_claim_form(self):
+ given_name = fake.name()
+ invited_user = self.project.add_unregistered_contributor(
+ fullname=given_name,
+ email=None,
+ auth=Auth(user=self.referrer)
+ )
+ self.project.save()
+
+ url = invited_user.get_claim_url(self.project._primary_key)
+ res = self.app.post(url, data={
+ 'password': 'bohemianrhap',
+ 'password2': 'bohemianrhap'
+ })
+ assert res.status_code == 400
+
+ def test_claim_user_post_with_registered_user_id(self):
+ # registered user who is attempting to claim the unclaimed contributor
+ reg_user = UserFactory()
+ with capture_notifications() as notifications:
+ res = self.app.post(
+ f'/api/v1/user/{self.user._primary_key}/{self.project._primary_key}/claim/email/',
+ json={
+ # pk of unreg user record
+ 'pk': self.user._primary_key,
+ 'claimerId': reg_user._primary_key
+ }
+ )
+
+ # two notifications were emitted
+ assert len(notifications) == 2
+ # ... to the correct recipients
+ assert notifications[0]['kwargs']['user'] == self.referrer
+ assert notifications[1]['kwargs']['user'] == reg_user
+
+ # view returns the correct JSON
+ assert res.json == {
+ 'status': 'success',
+ 'email': reg_user.username,
+ 'fullname': self.given_name,
+ }
+
+ def test_send_claim_registered_email(self):
+ reg_user = UserFactory()
+ with capture_notifications() as notifications:
+ send_claim_registered_email(
+ claimer=reg_user,
+ unclaimed_user=self.user,
+ node=self.project
+ )
+ assert len(notifications) == 2
+ # ... to the correct recipients
+ assert notifications[0]['kwargs']['user'] == self.referrer
+ assert notifications[1]['kwargs']['user'] == reg_user
+
+ def test_send_claim_registered_email_before_throttle_expires(self):
+ reg_user = UserFactory()
+ with capture_notifications() as notifications:
+ send_claim_registered_email(
+ claimer=reg_user,
+ unclaimed_user=self.user,
+ node=self.project,
+ )
+ # second call raises error because it was called before throttle period
+ with pytest.raises(HTTPError):
+ send_claim_registered_email(
+ claimer=reg_user,
+ unclaimed_user=self.user,
+ node=self.project,
+ )
+ assert not notifications
+
+ @mock.patch('website.project.views.contributor.send_claim_registered_email')
+ def test_claim_user_post_with_email_already_registered_sends_correct_email(
+ self, send_claim_registered_email):
+ reg_user = UserFactory()
+ payload = {
+ 'value': reg_user.username,
+ 'pk': self.user._primary_key
+ }
+ url = self.project.api_url_for('claim_user_post', uid=self.user._id)
+ self.app.post(url, json=payload)
+ assert send_claim_registered_email.called
+
+ def test_user_with_removed_unclaimed_url_claiming(self):
+ """ Tests that when an unclaimed user is removed from a project, the
+ unregistered user object does not retain the token.
+ """
+ self.project.remove_contributor(self.user, Auth(user=self.referrer))
+
+ assert self.project._primary_key not in self.user.unclaimed_records.keys()
+
+ def test_user_with_claim_url_cannot_claim_twice(self):
+ """ Tests that when an unclaimed user is replaced on a project with a
+ claimed user, the unregistered user object does not retain the token.
+ """
+ reg_user = AuthUserFactory()
+
+ self.project.replace_contributor(self.user, reg_user)
+
+ assert self.project._primary_key not in self.user.unclaimed_records.keys()
+
+ def test_claim_user_form_redirects_to_password_confirm_page_if_user_is_logged_in(self):
+ reg_user = AuthUserFactory()
+ url = self.user.get_claim_url(self.project._primary_key)
+ res = self.app.get(url, auth=reg_user.auth)
+ assert res.status_code == 302
+ res = self.app.get(url, auth=reg_user.auth, follow_redirects=True)
+ token = self.user.get_unclaimed_record(self.project._primary_key)['token']
+ expected = self.project.web_url_for(
+ 'claim_user_registered',
+ uid=self.user._id,
+ token=token,
+ )
+ assert res.request.path == expected
+
+ @mock.patch('framework.auth.cas.make_response_from_ticket')
+ def test_claim_user_when_user_is_registered_with_orcid(self, mock_response_from_ticket):
+ # TODO: check in qa url encoding
+ token = self.user.get_unclaimed_record(self.project._primary_key)['token']
+ url = f'/user/{self.user._id}/{self.project._id}/claim/verify/{token}/'
+ # logged out user gets redirected to cas login
+ res1 = self.app.get(url)
+ assert res1.status_code == 302
+ res = self.app.resolve_redirect(self.app.get(url))
+ service_url = f'http://localhost{url}'
+ expected = cas.get_logout_url(service_url=cas.get_login_url(service_url=service_url))
+ assert res1.location == expected
+
+ # user logged in with orcid automatically becomes a contributor
+ orcid_user, validated_credentials, cas_resp = generate_external_user_with_resp(url)
+ mock_response_from_ticket.return_value = authenticate(
+ orcid_user,
+ redirect(url)
+ )
+ orcid_user.set_unusable_password()
+ orcid_user.save()
+
+ # The request to OSF with CAS service ticket must not have cookie and/or auth.
+ service_ticket = fake.md5()
+ url_with_service_ticket = f'{url}?ticket={service_ticket}'
+ res = self.app.get(url_with_service_ticket)
+ # The response of this request is expected to be a 302 with `Location`.
+ # And the redirect URL must equal the original service URL
+ assert res.status_code == 302
+ redirect_url = res.headers['Location']
+ assert redirect_url == url
+ # The response of this request is expected to have the `Set-Cookie` header with the OSF cookie.
+ # And the cookie must belong to the ORCiD user.
+ raw_set_cookie = res.headers['Set-Cookie']
+ assert raw_set_cookie
+ simple_cookie = SimpleCookie()
+ simple_cookie.load(raw_set_cookie)
+ cookie_dict = {key: value.value for key, value in simple_cookie.items()}
+ osf_cookie = cookie_dict.get(settings.COOKIE_NAME, None)
+ assert osf_cookie is not None
+ user = OSFUser.from_cookie(osf_cookie)
+ assert user._id == orcid_user._id
+ # The ORCiD user must be different from the unregistered user created when the contributor was added
+ assert user._id != self.user._id
+
+ # Must clear the Flask g context manually and set the OSF cookie on the test client
+ g.current_session = None
+ self.app.set_cookie(settings.COOKIE_NAME, osf_cookie)
+ res = self.app.resolve_redirect(res)
+ assert res.status_code == 302
+ assert self.project.is_contributor(orcid_user)
+ assert self.project.url in res.headers.get('Location')
+
+ def test_get_valid_form(self):
+ url = self.user.get_claim_url(self.project._primary_key)
+ res = self.app.get(url, follow_redirects=True)
+ assert res.status_code == 200
+
+ def test_invalid_claim_form_raise_400(self):
+ uid = self.user._primary_key
+ pid = self.project._primary_key
+ url = f'/user/{uid}/{pid}/claim/?token=badtoken'
+ res = self.app.get(url, follow_redirects=True)
+ assert res.status_code == 400
+
+ @mock.patch('osf.models.OSFUser.update_search_nodes')
+ def test_posting_to_claim_form_with_valid_data(self, mock_update_search_nodes):
+ url = self.user.get_claim_url(self.project._primary_key)
+ res = self.app.post(url, data={
+ 'username': self.user.username,
+ 'password': 'killerqueen',
+ 'password2': 'killerqueen'
+ })
+
+ assert res.status_code == 302
+ location = res.headers.get('Location')
+ assert 'login?service=' in location
+ assert 'username' in location
+ assert 'verification_key' in location
+ assert self.project._primary_key in location
+
+ self.user.reload()
+ assert self.user.is_registered
+ assert self.user.is_active
+ assert self.project._primary_key not in self.user.unclaimed_records
+
+ @mock.patch('osf.models.OSFUser.update_search_nodes')
+ def test_posting_to_claim_form_removes_all_unclaimed_data(self, mock_update_search_nodes):
+ # user has multiple unclaimed records
+ p2 = ProjectFactory(creator=self.referrer)
+ self.user.add_unclaimed_record(p2, referrer=self.referrer,
+ given_name=fake.name())
+ self.user.save()
+ assert len(self.user.unclaimed_records.keys()) > 1 # sanity check
+ url = self.user.get_claim_url(self.project._primary_key)
+ res = self.app.post(url, data={
+ 'username': self.given_email,
+ 'password': 'bohemianrhap',
+ 'password2': 'bohemianrhap'
+ })
+ self.user.reload()
+ assert self.user.unclaimed_records == {}
+
+ @mock.patch('osf.models.OSFUser.update_search_nodes')
+ def test_posting_to_claim_form_sets_fullname_to_given_name(self, mock_update_search_nodes):
+ # User is created with a full name
+ original_name = fake.name()
+ unreg = UnregUserFactory(fullname=original_name)
+ # User invited with a different name
+ different_name = fake.name()
+ new_user = self.project.add_unregistered_contributor(
+ email=unreg.username,
+ fullname=different_name,
+ auth=Auth(self.project.creator),
+ )
+ self.project.save()
+ # Goes to claim url
+ claim_url = new_user.get_claim_url(self.project._id)
+ self.app.post(claim_url, data={
+ 'username': unreg.username,
+ 'password': 'killerqueen',
+ 'password2': 'killerqueen'
+ })
+ unreg.reload()
+ # Full name was set correctly
+ assert unreg.fullname == different_name
+ # CSL names were set correctly
+ parsed_name = impute_names_model(different_name)
+ assert unreg.given_name == parsed_name['given_name']
+ assert unreg.family_name == parsed_name['family_name']
+
+ def test_claim_user_post_returns_fullname(self):
+ with capture_notifications() as notifications:
+ res = self.app.post(
+ f'/api/v1/user/{self.user._primary_key}/{self.project._primary_key}/claim/email/',
+ auth=self.referrer.auth,
+ json={
+ 'value': self.given_email,
+ 'pk': self.user._primary_key
+ },
+ )
+ assert res.json['fullname'] == self.given_name
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_INVITE_DEFAULT
+
+ def test_claim_user_post_if_email_is_different_from_given_email(self):
+ email = fake_email() # email that is different from the one the referrer gave
+ with capture_notifications() as notifications:
+ self.app.post(
+ f'/api/v1/user/{self.user._primary_key}/{self.project._primary_key}/claim/email/',
+ json={
+ 'value': email,
+ 'pk': self.user._primary_key
+ }
+ )
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.USER_PENDING_VERIFICATION
+ assert notifications[0]['kwargs']['user'].username == self.given_email
+ assert notifications[1]['type'] == NotificationType.Type.USER_FORWARD_INVITE
+ assert notifications[1]['kwargs']['destination_address'] == email
+
+ def test_claim_url_with_bad_token_returns_400(self):
+ url = self.project.web_url_for(
+ 'claim_user_registered',
+ uid=self.user._id,
+ token='badtoken',
+ )
+ res = self.app.get(url, auth=self.referrer.auth)
+ assert res.status_code == 400
+
+ def test_cannot_claim_user_with_user_who_is_already_contributor(self):
+ # user who is already a contributor to the project
+ contrib = AuthUserFactory()
+ self.project.add_contributor(contrib, auth=Auth(self.project.creator))
+ self.project.save()
+ # Claiming user goes to claim url, but contrib is already logged in
+ url = self.user.get_claim_url(self.project._primary_key)
+ res = self.app.get(
+ url,
+ auth=contrib.auth, follow_redirects=True)
+ # Response is a 400
+ assert res.status_code == 400
+
+ def test_claim_user_with_project_id_adds_corresponding_claimed_tag_to_user(self):
+ assert OsfClaimedTags.Osf.value not in self.user.system_tags
+ url = self.user.get_claim_url(self.project_with_source_tag._primary_key)
+ res = self.app.post(url, data={
+ 'username': self.user.username,
+ 'password': 'killerqueen',
+ 'password2': 'killerqueen'
+ })
+
+ assert res.status_code == 302
+ self.user.reload()
+ assert OsfClaimedTags.Osf.value in self.user.system_tags
+
+ def test_claim_user_with_preprint_id_adds_corresponding_claimed_tag_to_user(self):
+ assert provider_claimed_tag(self.preprint_with_source_tag.provider._id, 'preprint') not in self.user.system_tags
+ url = self.user.get_claim_url(self.preprint_with_source_tag._primary_key)
+ res = self.app.post(url, data={
+ 'username': self.user.username,
+ 'password': 'killerqueen',
+ 'password2': 'killerqueen'
+ })
+
+ assert res.status_code == 302
+ self.user.reload()
+ assert provider_claimed_tag(self.preprint_with_source_tag.provider._id, 'preprint') in self.user.system_tags
diff --git a/website/project/views/contributor.py b/website/project/views/contributor.py
index 766ffb088e5..f3788f8b0c5 100644
--- a/website/project/views/contributor.py
+++ b/website/project/views/contributor.py
@@ -449,7 +449,8 @@ def send_claim_registered_email(claimer, unclaimed_user, node, throttle=24 * 360
event_context={
'claim_url': claim_url,
'fullname': unclaimed_record['name'],
- 'referrer': referrer.username,
+ 'referrer_username': referrer.username,
+ 'referrer_fullname': referrer.fullname,
'node': node.title,
'can_change_preferences': False,
'osf_contact_email': settings.OSF_CONTACT_EMAIL,
@@ -549,6 +550,7 @@ def send_claim_email(
NotificationType.objects.get(name=notification_type).emit(
user=referrer,
+ destination_address=email,
event_context={
'user': unclaimed_user.id,
'referrer': referrer.id,
@@ -992,7 +994,7 @@ def claim_user_post(node, **kwargs):
claimer = get_user(email=email)
# registered user
if claimer and claimer.is_registered:
- send_claim_registered_email(claimer, unclaimed_user, node)
+ send_claim_registered_email(claimer, unclaimed_user, node, email)
# unregistered user
else:
send_claim_email(email, unclaimed_user, node, notify=True)
From 2ac9ef942fd84246fbd744d03bb9f2fc58fdf568 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 22 Jul 2025 17:02:20 -0400
Subject: [PATCH 106/336] add bulk registration upload notifications to tests
---
api_tests/providers/tasks/test_bulk_upload.py | 66 +++++++++++++------
notifications.yaml | 16 +++++
2 files changed, 62 insertions(+), 20 deletions(-)
diff --git a/api_tests/providers/tasks/test_bulk_upload.py b/api_tests/providers/tasks/test_bulk_upload.py
index 221861ea313..8caf27d89bf 100644
--- a/api_tests/providers/tasks/test_bulk_upload.py
+++ b/api_tests/providers/tasks/test_bulk_upload.py
@@ -4,12 +4,14 @@
from api.providers.tasks import bulk_create_registrations
from osf.exceptions import RegistrationBulkCreationContributorError, RegistrationBulkCreationRowError
-from osf.models import RegistrationBulkUploadJob, RegistrationBulkUploadRow, RegistrationProvider, RegistrationSchema
+from osf.models import RegistrationBulkUploadJob, RegistrationBulkUploadRow, RegistrationProvider, RegistrationSchema, \
+ NotificationType
from osf.models.registration_bulk_upload_job import JobState
from osf.models.registration_bulk_upload_row import RegistrationBulkUploadContributors
from osf.utils.permissions import ADMIN, READ, WRITE
from osf_tests.factories import InstitutionFactory, SubjectFactory, UserFactory
+from tests.utils import capture_notifications
class TestRegistrationBulkUploadContributors:
@@ -317,10 +319,20 @@ def test_bulk_creation_dry_run(self, registration_row_1, registration_row_2, upl
assert upload_job_done_full.state == JobState.PICKED_UP
assert not upload_job_done_full.email_sent
- def test_bulk_creation_done_full(self, mock_send_grid, registration_row_1, registration_row_2,
- upload_job_done_full, provider, initiator, read_contributor, write_contributor):
-
- bulk_create_registrations(upload_job_done_full.id, dry_run=False)
+ def test_bulk_creation_done_full(
+ self,
+ registration_row_1,
+ registration_row_2,
+ upload_job_done_full,
+ provider,
+ initiator,
+ read_contributor,
+ write_contributor
+ ):
+ with capture_notifications() as notifications:
+ bulk_create_registrations(upload_job_done_full.id, dry_run=False)
+ notification_types = [notification['type'] for notification in notifications]
+ assert NotificationType.Type.USER_REGISTRATION_BULK_UPLOAD_SUCCESS_ALL in notification_types
upload_job_done_full.reload()
assert upload_job_done_full.state == JobState.DONE_FULL
assert upload_job_done_full.email_sent
@@ -335,13 +347,20 @@ def test_bulk_creation_done_full(self, mock_send_grid, registration_row_1, regis
assert row.draft_registration.contributor_set.get(user=write_contributor).permission == WRITE
assert row.draft_registration.contributor_set.get(user=read_contributor).permission == READ
- mock_send_grid.assert_called()
-
- def test_bulk_creation_done_partial(self, mock_send_grid, registration_row_3,
- registration_row_invalid_extra_bib_1, upload_job_done_partial,
- provider, initiator, read_contributor, write_contributor):
-
- bulk_create_registrations(upload_job_done_partial.id, dry_run=False)
+ def test_bulk_creation_done_partial(
+ self,
+ registration_row_3,
+ registration_row_invalid_extra_bib_1,
+ upload_job_done_partial,
+ provider,
+ initiator,
+ read_contributor,
+ write_contributor
+ ):
+ with capture_notifications() as notifications:
+ bulk_create_registrations(upload_job_done_partial.id, dry_run=False)
+ notification_types = [notification['type'] for notification in notifications]
+ assert NotificationType.Type.USER_REGISTRATION_BULK_UPLOAD_SUCCESS_PARTIAL in notification_types
upload_job_done_partial.reload()
assert upload_job_done_partial.state == JobState.DONE_PARTIAL
assert upload_job_done_partial.email_sent
@@ -355,16 +374,23 @@ def test_bulk_creation_done_partial(self, mock_send_grid, registration_row_3,
assert registration_row_3.draft_registration.contributor_set.get(user=write_contributor).permission == WRITE
assert registration_row_3.draft_registration.contributor_set.get(user=read_contributor).permission == READ
- mock_send_grid.assert_called()
+ def test_bulk_creation_done_error(
+ self,
+ registration_row_invalid_extra_bib_2,
+ registration_row_invalid_affiliation,
+ upload_job_done_error,
+ provider,
+ initiator,
+ read_contributor,
+ write_contributor,
+ institution
+ ):
+ with capture_notifications() as notifications:
+ bulk_create_registrations(upload_job_done_error.id, dry_run=False)
+ notification_types = [notification['type'] for notification in notifications]
+ assert NotificationType.Type.USER_REGISTRATION_BULK_UPLOAD_FAILURE_ALL in notification_types
- def test_bulk_creation_done_error(self, mock_send_grid, registration_row_invalid_extra_bib_2,
- registration_row_invalid_affiliation, upload_job_done_error,
- provider, initiator, read_contributor, write_contributor, institution):
-
- bulk_create_registrations(upload_job_done_error.id, dry_run=False)
upload_job_done_error.reload()
assert upload_job_done_error.state == JobState.DONE_ERROR
assert upload_job_done_error.email_sent
assert len(RegistrationBulkUploadRow.objects.filter(upload__id=upload_job_done_error.id)) == 0
-
- mock_send_grid.assert_called()
diff --git a/notifications.yaml b/notifications.yaml
index 0a18afdf681..61a146daffa 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -116,6 +116,22 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/forgot_password_institution.html.mako'
+ - name: user_registration_bulk_upload_success_all
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/registration_bulk_upload_success_all.html.mako'
+ - name: user_registration_bulk_upload_failure_all
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/registration_bulk_upload_failure_all.html.mako'
+ - name: user_registration_bulk_upload_success_partial
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/registration_bulk_upload_success_partial.html.mako'
+ - name: user_registration_bulk_upload_failure_duplicates
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/registration_bulk_upload_failure_duplicates.html.mako'
#### PROVIDER
- name: provider_new_pending_submissions
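The YAML names added in this hunk line up with the NotificationType.Type members asserted in the bulk-upload tests above (user_registration_bulk_upload_success_all maps to USER_REGISTRATION_BULK_UPLOAD_SUCCESS_ALL, and so on). A hedged sketch of emitting one of them once the rows are loaded from notifications.yaml; initiator mirrors the test fixture name and the event_context keys are illustrative only:

    from osf.models import NotificationType

    NotificationType.objects.get(
        name=NotificationType.Type.USER_REGISTRATION_BULK_UPLOAD_SUCCESS_ALL,
    ).emit(
        user=initiator,  # the user who started the bulk upload (test fixture name)
        event_context={'fullname': initiator.fullname},  # illustrative context only
    )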
From 9815f1f8f6e352f8d3daffa3b2a7f3d06b7546f4 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 22 Jul 2025 17:14:42 -0400
Subject: [PATCH 107/336] fix issues with notifications when adding nodes to
 institutions
---
.../test_institution_relationship_nodes.py | 71 +++++++++++--------
osf/utils/notifications.py | 1 +
website/reviews/listeners.py | 3 +-
3 files changed, 44 insertions(+), 31 deletions(-)
diff --git a/api_tests/institutions/views/test_institution_relationship_nodes.py b/api_tests/institutions/views/test_institution_relationship_nodes.py
index c62d760710d..c025407ab78 100644
--- a/api_tests/institutions/views/test_institution_relationship_nodes.py
+++ b/api_tests/institutions/views/test_institution_relationship_nodes.py
@@ -1,6 +1,7 @@
import pytest
from api.base.settings.defaults import API_BASE
+from osf.models import NotificationType
from osf_tests.factories import (
RegistrationFactory,
InstitutionFactory,
@@ -8,6 +9,7 @@
NodeFactory,
)
from osf.utils import permissions
+from tests.utils import capture_notifications
def make_payload(*node_ids):
@@ -372,45 +374,56 @@ def test_add_non_node(self, app, user, institution, url_institution_nodes):
assert res.status_code == 404
- def test_email_sent_on_affiliation_addition(self, app, user, institution, node_without_institution,
- url_institution_nodes, mock_send_grid):
+ def test_email_sent_on_affiliation_addition(
+ self,
+ app,
+ user,
+ institution,
+ node_without_institution,
+ url_institution_nodes,
+ ):
node_without_institution.add_contributor(user, permissions='admin')
current_institution = InstitutionFactory()
node_without_institution.affiliated_institutions.add(current_institution)
-
- res = app.post_json_api(
- url_institution_nodes,
- {
- 'data': [
- {
- 'type': 'nodes', 'id': node_without_institution._id
- }
- ]
- },
- auth=user.auth
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url_institution_nodes,
+ {
+ 'data': [
+ {
+ 'type': 'nodes', 'id': node_without_institution._id
+ }
+ ]
+ },
+ auth=user.auth
+ )
assert res.status_code == 201
- mock_send_grid.assert_called_once()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_AFFILIATION_CHANGED
- def test_email_sent_on_affiliation_removal(self, app, admin, institution, node_public, url_institution_nodes, mock_send_grid):
+ def test_email_sent_on_affiliation_removal(self, app, admin, institution, node_public, url_institution_nodes):
current_institution = InstitutionFactory()
node_public.affiliated_institutions.add(current_institution)
- res = app.delete_json_api(
- url_institution_nodes,
- {
- 'data': [
- {
- 'type': 'nodes', 'id': node_public._id
- }
- ]
- },
- auth=admin.auth
- )
+ with capture_notifications() as notifications:
+ res = app.delete_json_api(
+ url_institution_nodes,
+ {
+ 'data': [
+ {
+ 'type': 'nodes', 'id': node_public._id
+ }
+ ]
+ },
+ auth=admin.auth
+ )
# Assert response is successful
assert res.status_code == 204
- call_args = mock_send_grid.call_args[1]
- assert call_args['to_addr'] == admin.email
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.NODE_AFFILIATION_CHANGED
+ assert notifications[0]['kwargs']['user'] == node_public.creator
+ assert notifications[1]['type'] == NotificationType.Type.NODE_AFFILIATION_CHANGED
+ assert notifications[1]['kwargs']['user'] == admin
diff --git a/osf/utils/notifications.py b/osf/utils/notifications.py
index 78a422e4451..4f0c1a0dc05 100644
--- a/osf/utils/notifications.py
+++ b/osf/utils/notifications.py
@@ -120,6 +120,7 @@ def notify_moderator_registration_requests_withdrawal(resource, user, *args, **k
reviews_signals.reviews_withdraw_requests_notification_moderators.send(
timestamp=timezone.now(),
context=context,
+ resource=resource,
user=user
)
diff --git a/website/reviews/listeners.py b/website/reviews/listeners.py
index b00548b326b..3b6feeec3fc 100644
--- a/website/reviews/listeners.py
+++ b/website/reviews/listeners.py
@@ -135,14 +135,13 @@ def reviews_submit_notification_moderators(self, timestamp, resource, context, u
# Handle email notifications to notify moderators of new submissions.
@reviews_signals.reviews_withdraw_requests_notification_moderators.connect
-def reviews_withdraw_requests_notification_moderators(self, timestamp, context, user):
+def reviews_withdraw_requests_notification_moderators(self, timestamp, context, user, resource):
# imports moved here to avoid AppRegistryNotReady error
from osf.models import NotificationSubscriptionLegacy
from website.profile.utils import get_profile_image_url
from website.notifications.emails import store_emails
context['referrer_fullname'] = user.fullname
- resource = context['reviewable']
provider = resource.provider
# Get NotificationSubscription instance, which contains reference to all subscribers
From 7c08559ff5fed2265ab2474bbd74e77d8bc2282a Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 23 Jul 2025 10:23:14 -0400
Subject: [PATCH 108/336] fix issues with collection submissions notifications
---
osf/models/sanctions.py | 16 +--
osf_tests/test_collection_submission.py | 128 ++++++++++++++----------
2 files changed, 83 insertions(+), 61 deletions(-)
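The sanctions hunk below replaces mails.send_mail with the NotificationType emit API. A minimal sketch of that calling pattern, with an illustrative wrapper name (send_approval_request) and placeholder context; this is not code from the patch:

    from osf.models import NotificationType

    def send_approval_request(user, template, context):
        # 'template' is now a NotificationType.Type enum value rather than a
        # website.mails template; emit() delivers the notification to the given
        # user with the supplied event context.
        NotificationType.objects.get(name=template).emit(
            user=user,
            event_context=context,
        )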
diff --git a/osf/models/sanctions.py b/osf/models/sanctions.py
index a4fcfe17396..a5b19f3a917 100644
--- a/osf/models/sanctions.py
+++ b/osf/models/sanctions.py
@@ -8,7 +8,6 @@
from framework.auth import Auth
from framework.exceptions import PermissionsError
from website import settings as osf_settings
-from website import mails
from osf.exceptions import (
InvalidSanctionRejectionToken,
InvalidSanctionApprovalToken,
@@ -404,7 +403,12 @@ def _rejection_url_context(self, user_id):
return None
def _send_approval_request_email(self, user, template, context):
- mails.send_mail(user.username, template, user=user, can_change_preferences=False, **context)
+ NotificationType.objects.get(
+ name=template
+ ).emit(
+ user=user,
+ event_context=context
+ )
def _email_template_context(self, user, node, is_authorizer=False):
return {}
@@ -781,8 +785,8 @@ class RegistrationApproval(SanctionCallbackMixin, EmailApprovableSanction):
DISPLAY_NAME = 'Approval'
SHORT_NAME = 'registration_approval'
- AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = mails.PENDING_REGISTRATION_ADMIN
- NON_AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = mails.PENDING_REGISTRATION_NON_ADMIN
+ AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = NotificationType.Type.NODE_PENDING_REGISTRATION_ADMIN
+ NON_AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = NotificationType.Type.NODE_PENDING_REGISTRATION_NON_ADMIN
AUTHORIZER_NOTIFY_EMAIL_TYPE = 'node_pending_registration_admin'
NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = 'node_pending_registration_non_admin'
@@ -957,8 +961,8 @@ class EmbargoTerminationApproval(EmailApprovableSanction):
DISPLAY_NAME = 'Embargo Termination Request'
SHORT_NAME = 'embargo_termination_approval'
- AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = mails.PENDING_EMBARGO_TERMINATION_ADMIN
- NON_AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = mails.PENDING_EMBARGO_TERMINATION_NON_ADMIN
+ AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = NotificationType.Type.NODE_PENDING_EMBARGO_TERMINATION_ADMIN
+ NON_AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = NotificationType.Type.NODE_PENDING_EMBARGO_TERMINATION_NON_ADMIN
VIEW_URL_TEMPLATE = VIEW_PROJECT_URL_TEMPLATE
APPROVE_URL_TEMPLATE = osf_settings.DOMAIN + 'token_action/{node_id}/?token={token}'
diff --git a/osf_tests/test_collection_submission.py b/osf_tests/test_collection_submission.py
index 2ff2b279a6b..76baa2de752 100644
--- a/osf_tests/test_collection_submission.py
+++ b/osf_tests/test_collection_submission.py
@@ -1,4 +1,3 @@
-from unittest import mock
import pytest
from osf_tests.factories import (
@@ -9,13 +8,16 @@
from osf_tests.factories import NodeFactory, CollectionFactory, CollectionProviderFactory
-from osf.models import CollectionSubmission
+from osf.models import CollectionSubmission, NotificationType
from osf.utils.workflows import CollectionSubmissionStates
from framework.exceptions import PermissionsError
from api_tests.utils import UserRoles
from osf.management.commands.populate_collection_provider_notification_subscriptions import populate_collection_provider_notification_subscriptions
from django.utils import timezone
+from tests.utils import capture_notifications
+
+
@pytest.fixture
def user():
return AuthUserFactory()
@@ -144,7 +146,6 @@ def configure_test_auth(node, user_role, provider=None):
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestModeratedCollectionSubmission:
MOCK_NOW = timezone.now()
@@ -152,28 +153,27 @@ class TestModeratedCollectionSubmission:
@pytest.fixture(autouse=True)
def setup(self):
populate_collection_provider_notification_subscriptions()
- with mock.patch('osf.utils.machines.timezone.now', return_value=self.MOCK_NOW):
- yield
def test_submit(self, moderated_collection_submission):
# .submit on post_save
assert moderated_collection_submission.state == CollectionSubmissionStates.PENDING
- def test_notify_contributors_pending(self, node, moderated_collection, mock_send_grid):
- collection_submission = CollectionSubmission(
- guid=node.guids.first(),
- collection=moderated_collection,
- creator=node.creator,
- )
- collection_submission.save()
- assert mock_send_grid.called
+ def test_notify_contributors_pending(self, node, moderated_collection):
+ with capture_notifications() as notifications:
+ collection_submission = CollectionSubmission(
+ guid=node.guids.first(),
+ collection=moderated_collection,
+ creator=node.creator,
+ )
+ collection_submission.save()
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_SUBMITTED
+ assert notifications[1]['type'] == NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
assert collection_submission.state == CollectionSubmissionStates.PENDING
def test_notify_moderators_pending(self, node, moderated_collection):
- from website.notifications import emails
- store_emails = emails.store_emails
- with mock.patch('website.notifications.emails.store_emails') as mock_store_emails:
- mock_store_emails.side_effect = store_emails # implicitly test rendering
+
+ with capture_notifications() as notifications:
collection_submission = CollectionSubmission(
guid=node.guids.first(),
collection=moderated_collection,
@@ -181,18 +181,10 @@ def test_notify_moderators_pending(self, node, moderated_collection):
)
populate_collection_provider_notification_subscriptions()
collection_submission.save()
- assert mock_store_emails.called
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_SUBMITTED
+ assert notifications[1]['type'] == NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
assert collection_submission.state == CollectionSubmissionStates.PENDING
- email_call = mock_store_emails.call_args_list[0][0]
- moderator = moderated_collection.moderators.get()
- assert email_call == (
- [moderator._id],
- 'email_transactional',
- 'new_pending_submissions',
- collection_submission.creator,
- node,
- self.MOCK_NOW,
- )
@pytest.mark.parametrize('user_role', [UserRoles.UNAUTHENTICATED, UserRoles.NONCONTRIB])
def test_accept_fails(self, user_role, moderated_collection_submission):
@@ -206,10 +198,13 @@ def test_accept_success(self, node, moderated_collection_submission):
moderated_collection_submission.accept(user=moderator, comment='Test Comment')
assert moderated_collection_submission.state == CollectionSubmissionStates.ACCEPTED
- def test_notify_moderated_accepted(self, node, moderated_collection_submission, mock_send_grid):
+ def test_notify_moderated_accepted(self, node, moderated_collection_submission):
moderator = configure_test_auth(node, UserRoles.MODERATOR)
- moderated_collection_submission.accept(user=moderator, comment='Test Comment')
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ moderated_collection_submission.accept(user=moderator, comment='Test Comment')
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_ACCEPTED
+
assert moderated_collection_submission.state == CollectionSubmissionStates.ACCEPTED
@pytest.mark.parametrize('user_role', [UserRoles.UNAUTHENTICATED, UserRoles.NONCONTRIB])
@@ -224,11 +219,14 @@ def test_reject_success(self, node, moderated_collection_submission):
moderated_collection_submission.reject(user=moderator, comment='Test Comment')
assert moderated_collection_submission.state == CollectionSubmissionStates.REJECTED
- def test_notify_moderated_rejected(self, node, moderated_collection_submission, mock_send_grid):
+ def test_notify_moderated_rejected(self, node, moderated_collection_submission):
moderator = configure_test_auth(node, UserRoles.MODERATOR)
- moderated_collection_submission.reject(user=moderator, comment='Test Comment')
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ moderated_collection_submission.reject(user=moderator, comment='Test Comment')
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REJECTED
+
assert moderated_collection_submission.state == CollectionSubmissionStates.REJECTED
@pytest.mark.parametrize('user_role', UserRoles.excluding(*[UserRoles.ADMIN_USER, UserRoles.MODERATOR]))
@@ -248,20 +246,27 @@ def test_remove_success(self, node, user_role, moderated_collection_submission):
moderated_collection_submission.remove(user=user, comment='Test Comment')
assert moderated_collection_submission.state == CollectionSubmissionStates.REMOVED
- def test_notify_moderated_removed_moderator(self, node, moderated_collection_submission, mock_send_grid):
+ def test_notify_moderated_removed_moderator(self, node, moderated_collection_submission):
moderated_collection_submission.state_machine.set_state(CollectionSubmissionStates.ACCEPTED)
moderator = configure_test_auth(node, UserRoles.MODERATOR)
- moderated_collection_submission.remove(user=moderator, comment='Test Comment')
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ moderated_collection_submission.remove(user=moderator, comment='Test Comment')
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REMOVED_MODERATOR
+
assert moderated_collection_submission.state == CollectionSubmissionStates.REMOVED
- def test_notify_moderated_removed_admin(self, node, moderated_collection_submission, mock_send_grid):
+ def test_notify_moderated_removed_admin(self, node, moderated_collection_submission):
moderated_collection_submission.state_machine.set_state(CollectionSubmissionStates.ACCEPTED)
moderator = configure_test_auth(node, UserRoles.ADMIN_USER)
- moderated_collection_submission.remove(user=moderator, comment='Test Comment')
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ moderated_collection_submission.remove(user=moderator, comment='Test Comment')
+ assert len(notifications) == 2
+        assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REMOVED_ADMIN
+        assert notifications[1]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REMOVED_ADMIN
+
assert moderated_collection_submission.state == CollectionSubmissionStates.REMOVED
def test_resubmit_success(self, node, moderated_collection_submission):
@@ -336,12 +341,15 @@ def test_remove_success(self, user_role, node, unmoderated_collection_submission
unmoderated_collection_submission.remove(user=user, comment='Test Comment')
assert unmoderated_collection_submission.state == CollectionSubmissionStates.REMOVED
- def test_notify_moderated_removed_admin(self, node, unmoderated_collection_submission, mock_send_grid):
+ def test_notify_moderated_removed_admin(self, node, unmoderated_collection_submission):
unmoderated_collection_submission.state_machine.set_state(CollectionSubmissionStates.ACCEPTED)
moderator = configure_test_auth(node, UserRoles.ADMIN_USER)
- unmoderated_collection_submission.remove(user=moderator, comment='Test Comment')
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ unmoderated_collection_submission.remove(user=moderator, comment='Test Comment')
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REMOVED_ADMIN
+ assert notifications[1]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REMOVED_ADMIN
assert unmoderated_collection_submission.state == CollectionSubmissionStates.REMOVED
def test_resubmit_success(self, node, unmoderated_collection_submission):
@@ -434,11 +442,13 @@ def test_accept_success(self, node, hybrid_moderated_collection_submission):
hybrid_moderated_collection_submission.accept(user=moderator, comment='Test Comment')
assert hybrid_moderated_collection_submission.state == CollectionSubmissionStates.ACCEPTED
- def test_notify_moderated_accepted(self, node, hybrid_moderated_collection_submission, mock_send_grid):
+ def test_notify_moderated_accepted(self, node, hybrid_moderated_collection_submission):
moderator = configure_test_auth(node, UserRoles.MODERATOR)
- hybrid_moderated_collection_submission.accept(user=moderator, comment='Test Comment')
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ hybrid_moderated_collection_submission.accept(user=moderator, comment='Test Comment')
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_ACCEPTED
assert hybrid_moderated_collection_submission.state == CollectionSubmissionStates.ACCEPTED
@pytest.mark.parametrize('user_role', [UserRoles.UNAUTHENTICATED, UserRoles.NONCONTRIB])
@@ -453,11 +463,13 @@ def test_reject_success(self, node, hybrid_moderated_collection_submission):
hybrid_moderated_collection_submission.reject(user=moderator, comment='Test Comment')
assert hybrid_moderated_collection_submission.state == CollectionSubmissionStates.REJECTED
- def test_notify_moderated_rejected(self, node, hybrid_moderated_collection_submission, mock_send_grid):
+ def test_notify_moderated_rejected(self, node, hybrid_moderated_collection_submission):
moderator = configure_test_auth(node, UserRoles.MODERATOR)
- hybrid_moderated_collection_submission.reject(user=moderator, comment='Test Comment')
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ hybrid_moderated_collection_submission.reject(user=moderator, comment='Test Comment')
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REJECTED
assert hybrid_moderated_collection_submission.state == CollectionSubmissionStates.REJECTED
@pytest.mark.parametrize('user_role', UserRoles.excluding(*[UserRoles.ADMIN_USER, UserRoles.MODERATOR]))
@@ -477,20 +489,26 @@ def test_remove_success(self, node, user_role, hybrid_moderated_collection_submi
hybrid_moderated_collection_submission.remove(user=user, comment='Test Comment')
assert hybrid_moderated_collection_submission.state == CollectionSubmissionStates.REMOVED
- def test_notify_moderated_removed_moderator(self, node, hybrid_moderated_collection_submission, mock_send_grid):
+ def test_notify_moderated_removed_moderator(self, node, hybrid_moderated_collection_submission):
hybrid_moderated_collection_submission.state_machine.set_state(CollectionSubmissionStates.ACCEPTED)
moderator = configure_test_auth(node, UserRoles.MODERATOR)
- hybrid_moderated_collection_submission.remove(user=moderator, comment='Test Comment')
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ hybrid_moderated_collection_submission.remove(user=moderator, comment='Test Comment')
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REMOVED_MODERATOR
assert hybrid_moderated_collection_submission.state == CollectionSubmissionStates.REMOVED
- def test_notify_moderated_removed_admin(self, node, hybrid_moderated_collection_submission, mock_send_grid):
+ def test_notify_moderated_removed_admin(self, node, hybrid_moderated_collection_submission):
hybrid_moderated_collection_submission.state_machine.set_state(CollectionSubmissionStates.ACCEPTED)
moderator = configure_test_auth(node, UserRoles.ADMIN_USER)
- hybrid_moderated_collection_submission.remove(user=moderator, comment='Test Comment')
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ hybrid_moderated_collection_submission.remove(user=moderator, comment='Test Comment')
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REMOVED_ADMIN
+ assert notifications[1]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REMOVED_ADMIN
+
assert hybrid_moderated_collection_submission.state == CollectionSubmissionStates.REMOVED
def test_resubmit_success(self, node, hybrid_moderated_collection_submission):
From 86a94666d3378760508bf31d2cb1c03f1b276add Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 23 Jul 2025 11:16:34 -0400
Subject: [PATCH 109/336] fix reviewable and contributor notifications
---
...est_collections_provider_moderator_list.py | 29 ++++++++++++-------
notifications.yaml | 8 +++++
osf/utils/notifications.py | 2 +-
osf_tests/test_reviewable.py | 19 +++++++-----
tests/test_adding_contributor_views.py | 19 +++++++-----
website/project/views/contributor.py | 7 ++++-
6 files changed, 56 insertions(+), 28 deletions(-)
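The test changes below drop the mock_send_grid fixture in favor of the capture_notifications context manager. A minimal sketch of the assertion style these tests converge on; the fixture names (app, url, admin, payload) and the expected notification type are illustrative:

    from osf.models import NotificationType
    from tests.utils import capture_notifications

    def test_adding_a_moderator_emits_one_notification(app, url, admin, payload):
        # Record every notification emitted while the request is handled,
        # then assert on the recorded types instead of an outbound email mock.
        with capture_notifications() as notifications:
            res = app.post_json_api(url, payload, auth=admin.auth)
        assert res.status_code == 201
        assert len(notifications) == 1
        assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED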
diff --git a/api_tests/providers/collections/views/test_collections_provider_moderator_list.py b/api_tests/providers/collections/views/test_collections_provider_moderator_list.py
index 20d081e8709..5a7275158f2 100644
--- a/api_tests/providers/collections/views/test_collections_provider_moderator_list.py
+++ b/api_tests/providers/collections/views/test_collections_provider_moderator_list.py
@@ -1,12 +1,14 @@
import pytest
from api.base.settings.defaults import API_BASE
+from osf.models import NotificationType
from osf_tests.factories import (
AuthUserFactory,
CollectionProviderFactory,
)
from osf.utils import permissions
from osf_tests.utils import _ensure_subscriptions
+from tests.utils import capture_notifications
@pytest.fixture()
@@ -112,11 +114,13 @@ def test_POST_forbidden(self, mock_send_grid, app, url, nonmoderator, moderator,
def test_POST_admin_success_existing_user(self, mock_send_grid, app, url, nonmoderator, moderator, admin, provider):
payload = make_payload(user_id=nonmoderator._id, permission_group='moderator')
- res = app.post_json_api(url, payload, auth=admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
assert res.status_code == 201
assert res.json['data']['id'] == nonmoderator._id
assert res.json['data']['attributes']['permission_group'] == 'moderator'
- assert mock_send_grid.call_count == 1
def test_POST_admin_failure_existing_moderator(self, mock_send_grid, app, url, moderator, admin, provider):
payload = make_payload(user_id=moderator._id, permission_group='moderator')
@@ -124,21 +128,24 @@ def test_POST_admin_failure_existing_moderator(self, mock_send_grid, app, url, m
assert res.status_code == 400
assert mock_send_grid.call_count == 0
- def test_POST_admin_failure_unreg_moderator(self, mock_send_grid, app, url, moderator, nonmoderator, admin, provider):
+ def test_POST_admin_failure_unreg_moderator(self, app, url, moderator, nonmoderator, admin, provider):
unreg_user = {'full_name': 'Jalen Hurts', 'email': '1eagles@allbatman.org'}
# test_user_with_no_moderator_admin_permissions
payload = make_payload(permission_group='moderator', **unreg_user)
- res = app.post_json_api(url, payload, auth=nonmoderator.auth, expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=nonmoderator.auth, expect_errors=True)
+ assert not notifications
assert res.status_code == 403
- assert mock_send_grid.call_count == 0
# test_user_with_moderator_admin_permissions
payload = make_payload(permission_group='moderator', **unreg_user)
- res = app.post_json_api(url, payload, auth=admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth)
assert res.status_code == 201
- assert mock_send_grid.call_count == 1
- assert mock_send_grid.call_args[1]['to_addr'] == unreg_user['email']
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONFIRM_EMAIL_MODERATION
+ assert notifications[0]['kwargs']['user'].username == unreg_user['email']
def test_POST_admin_failure_invalid_group(self, mock_send_grid, app, url, nonmoderator, moderator, admin, provider):
payload = make_payload(user_id=nonmoderator._id, permission_group='citizen')
@@ -148,12 +155,14 @@ def test_POST_admin_failure_invalid_group(self, mock_send_grid, app, url, nonmod
def test_POST_admin_success_email(self, mock_send_grid, app, url, nonmoderator, moderator, admin, provider):
payload = make_payload(email='somenewuser@gmail.com', full_name='Some User', permission_group='moderator')
- res = app.post_json_api(url, payload, auth=admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONFIRM_EMAIL_MODERATION
assert res.status_code == 201
assert len(res.json['data']['id']) == 5
assert res.json['data']['attributes']['permission_group'] == 'moderator'
assert 'email' not in res.json['data']['attributes']
- assert mock_send_grid.call_count == 1
def test_moderators_alphabetically(self, app, url, admin, moderator, provider):
admin.fullname = 'Flecher Cox'
diff --git a/notifications.yaml b/notifications.yaml
index 61a146daffa..c5a3d7a6cb5 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -158,6 +158,14 @@ notification_types:
__docs__: ...
object_content_type_model_name: abstractprovider
template: 'website/templates/emails/moderator_added.html.mako'
+ - name: provider_reviews_submission_confirmation
+ __docs__: ...
+ object_content_type_model_name: abstractprovider
+ template: 'website/templates/emails/reviews_submission_confirmation.html.mako'
+ - name: provider_reviews_resubmission_confirmation
+ __docs__: ...
+ object_content_type_model_name: abstractprovider
+ template: 'website/templates/emails/reviews_resubmission_confirmation.html.mako'
#### NODE
- name: node_file_updated
diff --git a/osf/utils/notifications.py b/osf/utils/notifications.py
index 4f0c1a0dc05..b85db6532ac 100644
--- a/osf/utils/notifications.py
+++ b/osf/utils/notifications.py
@@ -60,7 +60,7 @@ def notify_resubmit(resource, user, *args, **kwargs):
reviews_signals.reviews_email_submit.send(
recipients=recipients,
context=context,
- template=mails.REVIEWS_RESUBMISSION_CONFIRMATION,
+ template=NotificationType.Type.PROVIDER_REVIEWS_RESUBMISSION_CONFIRMATION,
resource=resource,
)
reviews_signals.reviews_email_submit_moderators_notifications.send(
diff --git a/osf_tests/test_reviewable.py b/osf_tests/test_reviewable.py
index e3bc0b3d709..eb3783b71bc 100644
--- a/osf_tests/test_reviewable.py
+++ b/osf_tests/test_reviewable.py
@@ -1,13 +1,13 @@
from unittest import mock
import pytest
-from osf.models import Preprint
+from osf.models import Preprint, NotificationType
from osf.utils.workflows import DefaultStates
from osf_tests.factories import PreprintFactory, AuthUserFactory
+from tests.utils import capture_notifications
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestReviewable:
@mock.patch('website.identifiers.utils.request_identifiers')
@@ -34,23 +34,26 @@ def test_state_changes(self, _):
from_db.refresh_from_db()
assert from_db.machine_state == DefaultStates.ACCEPTED.value
- def test_reject_resubmission_sends_emails(self, mock_send_grid):
+ def test_reject_resubmission_sends_emails(self):
user = AuthUserFactory()
preprint = PreprintFactory(
reviews_workflow='pre-moderation',
is_published=False
)
assert preprint.machine_state == DefaultStates.INITIAL.value
- assert not mock_send_grid.call_count
- preprint.run_submit(user)
- assert mock_send_grid.call_count == 1
+ with capture_notifications() as notifications:
+ preprint.run_submit(user)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_REVIEWS_SUBMISSION_CONFIRMATION
assert preprint.machine_state == DefaultStates.PENDING.value
assert not user.notification_subscriptions.exists()
preprint.run_reject(user, 'comment')
assert preprint.machine_state == DefaultStates.REJECTED.value
- preprint.run_submit(user) # Resubmission alerts users and moderators
+ with capture_notifications() as notifications:
+ preprint.run_submit(user) # Resubmission alerts users and moderators
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_REVIEWS_RESUBMISSION_CONFIRMATION
assert preprint.machine_state == DefaultStates.PENDING.value
- assert mock_send_grid.call_count == 2
diff --git a/tests/test_adding_contributor_views.py b/tests/test_adding_contributor_views.py
index 6bbd70681b6..5825a0b42b5 100644
--- a/tests/test_adding_contributor_views.py
+++ b/tests/test_adding_contributor_views.py
@@ -13,7 +13,7 @@
from framework import auth
from framework.auth import Auth
from framework.exceptions import HTTPError
-from osf.models import NodeRelation
+from osf.models import NodeRelation, NotificationType
from osf.utils import permissions
from osf_tests.factories import (
fake_email,
@@ -30,6 +30,7 @@
get_default_metaschema,
OsfTestCase,
)
+from tests.utils import capture_notifications
from website.profile.utils import add_contributor_json, serialize_unregistered
from website.project.signals import contributor_added
from website.project.views.contributor import (
@@ -171,11 +172,10 @@ def test_add_contributor_with_unreg_contribs_and_reg_contribs(self):
assert rec['email'] == email
@mock.patch('website.project.views.contributor.send_claim_email')
- def test_add_contributors_post_only_sends_one_email_to_unreg_user(
- self, mock_send_claim_email):
+ def test_add_contributors_post_only_sends_one_email_to_unreg_user(self, mock_send_claim_email):
# Project has components
- comp1, comp2 = NodeFactory(
- creator=self.creator), NodeFactory(creator=self.creator)
+ comp1 = NodeFactory(creator=self.creator)
+ comp2 = NodeFactory(creator=self.creator)
NodeRelation.objects.create(parent=self.project, child=comp1)
NodeRelation.objects.create(parent=self.project, child=comp2)
self.project.save()
@@ -224,10 +224,13 @@ def test_add_contributors_post_only_sends_one_email_to_registered_user(self):
# send request
url = self.project.api_url_for('project_contributors_post')
assert self.project.can_edit(user=self.creator)
- self.app.post(url, json=payload, auth=self.creator.auth)
+ with capture_notifications() as notifications:
+ self.app.post(url, json=payload, auth=self.creator.auth)
+ assert len(notifications) == 3
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ assert notifications[1]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ assert notifications[2]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
- # send_mail should only have been called once
- assert self.mock_notification_send.call_count == 1
def test_add_contributors_post_sends_email_if_user_not_contributor_on_parent_node(self):
# Project has a component with a sub-component
diff --git a/website/project/views/contributor.py b/website/project/views/contributor.py
index f3788f8b0c5..ea4ec0f67be 100644
--- a/website/project/views/contributor.py
+++ b/website/project/views/contributor.py
@@ -471,7 +471,12 @@ def check_email_throttle_claim_email(node, contributor):
contributor.contributor_added_email_records[node._id] = {}
def send_claim_email(
- email, unclaimed_user, node, notify=True, throttle=24 * 3600, email_template='default'
+ email,
+ unclaimed_user,
+ node,
+ notify=True,
+ throttle=24 * 3600,
+ email_template='default'
):
"""
Send a claim email to an unregistered contributor or the referrer, depending on the scenario.
From 0e2f169d7e9dc3031fe82847feac06166067b656 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Thu, 24 Jul 2025 10:05:38 -0400
Subject: [PATCH 110/336] fix preprint versioning tests
---
framework/auth/views.py | 2 +-
osf/models/collection_submission.py | 1 +
osf/models/preprint.py | 6 +-
tests/test_misc_views.py | 9 +-
website/notifications/constants.py | 2 +-
website/notifications/emails.py | 19 +-
website/notifications/events/files.py | 26 +-
website/notifications/utils.py | 48 ++--
website/notifications/views.py | 48 ++--
website/reviews/listeners.py | 243 +++---------------
.../emails/new_pending_submissions.html.mako | 2 +-
11 files changed, 104 insertions(+), 302 deletions(-)
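Several hunks below swap NotificationSubscriptionLegacy.load('<guid>_<event>') lookups for the new NotificationSubscription model, which ties a user and a NotificationType to a subscribed object through a generic foreign key. A minimal sketch of that lookup, assuming the field names used in the patch (user, notification_type, object_id, content_type); unlike the patch, it resolves the NotificationType up front so get_or_create can also create the row:

    from django.contrib.contenttypes.models import ContentType
    from osf.models import NotificationSubscription, NotificationType

    def subscription_for(user, node, event_name):
        # One subscription row per (user, notification type, subscribed object).
        notification_type = NotificationType.objects.get(name=event_name)
        subscription, _ = NotificationSubscription.objects.get_or_create(
            user=user,
            notification_type=notification_type,
            object_id=node.id,
            content_type=ContentType.objects.get_for_model(node),
        )
        return subscription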
diff --git a/framework/auth/views.py b/framework/auth/views.py
index e00df8679cd..7e4cd6ad234 100644
--- a/framework/auth/views.py
+++ b/framework/auth/views.py
@@ -666,7 +666,7 @@ def external_login_confirm_email_get(auth, uid, token):
).emit(
user=user,
event_context={
- 'external_id_provider': provider.name,
+ 'external_id_provider': provider,
'can_change_preferences': False,
'osf_contact_email': settings.OSF_CONTACT_EMAIL,
},
diff --git a/osf/models/collection_submission.py b/osf/models/collection_submission.py
index 41b05862a02..c7f5e93b3e9 100644
--- a/osf/models/collection_submission.py
+++ b/osf/models/collection_submission.py
@@ -129,6 +129,7 @@ def _notify_moderators_pending(self, event_data):
subscribed_object=self.guid.referent,
event_context={
'submitter': self.creator.id,
+                'requester_contributor_names': ', '.join(self.guid.referent.contributors.values_list('fullname', flat=True))
},
)
diff --git a/osf/models/preprint.py b/osf/models/preprint.py
index 2dd469c5fa9..17e792e15aa 100644
--- a/osf/models/preprint.py
+++ b/osf/models/preprint.py
@@ -35,8 +35,6 @@
from osf.utils import sanitize
from osf.utils.permissions import ADMIN, WRITE
from osf.utils.requests import get_request_and_user_id, string_type_request_headers
-from website.notifications.emails import get_user_subscriptions
-from website.notifications import utils
from website.identifiers.clients import CrossRefClient, ECSArXivCrossRefClient
from website.project.licenses import set_license
from website.util import api_v2_url, api_url_for, web_url_for
@@ -1032,8 +1030,6 @@ def _add_creator_as_contributor(self):
def _send_preprint_confirmation(self, auth):
# Send creator confirmation email
recipient = self.creator
- event_type = utils.find_subscription_type('global_reviews')
- user_subscriptions = get_user_subscriptions(recipient, event_type)
if self.provider._id == 'osf':
logo = settings.OSF_PREPRINTS_LOGO
else:
@@ -1050,7 +1046,7 @@ def _send_preprint_confirmation(self, auth):
provider_id=self.provider._id if not self.provider.domain else '').strip('/'),
'provider_contact_email': self.provider.email_contact or settings.OSF_CONTACT_EMAIL,
'provider_support_email': self.provider.email_support or settings.OSF_SUPPORT_EMAIL,
- 'no_future_emails': user_subscriptions['none'],
+ 'no_future_emails': False,
'is_creator': True,
'provider_name': 'OSF Preprints' if self.provider.name == 'Open Science Framework' else self.provider.name,
'logo': logo,
diff --git a/tests/test_misc_views.py b/tests/test_misc_views.py
index 814ab0556f1..27c2a3e383c 100644
--- a/tests/test_misc_views.py
+++ b/tests/test_misc_views.py
@@ -21,7 +21,7 @@
Comment,
OSFUser,
SpamStatus,
- NodeRelation,
+ NodeRelation, NotificationType,
)
from osf.utils import permissions
from osf_tests.factories import (
@@ -50,6 +50,7 @@
from website.util import web_url_for
from website.util import rubeus
from conftest import start_mock_send_grid
+from tests.utils import capture_notifications
pytestmark = pytest.mark.django_db
@@ -426,13 +427,15 @@ def test_external_login_confirm_email_get_link(self):
self.user.save()
assert not self.user.is_registered
url = self.user.get_confirmation_url(self.user.username, external_id_provider='orcid', destination='dashboard')
- res = self.app.get(url)
+        with capture_notifications() as notifications:
+            res = self.app.get(url)
+        assert len(notifications) == 1
+        assert notifications[0]['type'] == NotificationType.Type.USER_EXTERNAL_LOGIN_LINK_SUCCESS
assert res.status_code == 302, 'redirects to cas login'
assert 'You should be redirected automatically' in str(res.html)
assert '/login?service=' in res.location
assert 'new=true' not in parse.unquote(res.location)
- assert self.mock_send_grid.call_count == 1
self.user.reload()
assert self.user.external_identity['orcid'][self.provider_id] == 'VERIFIED'
diff --git a/website/notifications/constants.py b/website/notifications/constants.py
index ce3c9db4315..66bb575b765 100644
--- a/website/notifications/constants.py
+++ b/website/notifications/constants.py
@@ -1,5 +1,5 @@
NODE_SUBSCRIPTIONS_AVAILABLE = {
- 'file_updated': 'Files updated'
+ 'node_file_updated': 'Files updated'
}
# Note: if the subscription starts with 'global_', it will be treated like a default
diff --git a/website/notifications/emails.py b/website/notifications/emails.py
index d28352b2bdd..da2024e8e31 100644
--- a/website/notifications/emails.py
+++ b/website/notifications/emails.py
@@ -2,7 +2,7 @@
from babel import dates, core, Locale
-from osf.models import AbstractNode, NotificationSubscriptionLegacy
+from osf.models import AbstractNode, NotificationSubscription
from osf.models.notifications import NotificationDigest
from osf.utils.permissions import ADMIN, READ
from website import mails
@@ -13,7 +13,7 @@
def notify(event, user, node, timestamp, **context):
"""Retrieve appropriate ***subscription*** and passe user list
:param event: event that triggered the notification
:param user: user who triggered notification
:param node: instance of Node
@@ -160,7 +160,10 @@ def check_node(node, event):
"""Return subscription for a particular node and event."""
node_subscriptions = {key: [] for key in constants.NOTIFICATION_TYPES}
if node:
- subscription = NotificationSubscriptionLegacy.load(utils.to_subscription_key(node._id, event))
+ subscription = NotificationSubscription.objects.filter(
+ node=node,
+ notification_type__name=event
+ )
for notification_type in node_subscriptions:
users = getattr(subscription, notification_type, [])
if users:
@@ -173,11 +176,11 @@ def check_node(node, event):
def get_user_subscriptions(user, event):
if user.is_disabled:
return {}
- user_subscription = NotificationSubscriptionLegacy.load(utils.to_subscription_key(user._id, event))
- if user_subscription:
- return {key: list(getattr(user_subscription, key).all().values_list('guids___id', flat=True)) for key in constants.NOTIFICATION_TYPES}
- else:
- return {key: [user._id] if (event in constants.USER_SUBSCRIPTIONS_AVAILABLE and key == 'email_transactional') else [] for key in constants.NOTIFICATION_TYPES}
+ user_subscription, _ = NotificationSubscription.objects.get_or_create(
+ user=user,
+ notification_type__name=event
+ )
+ return user_subscription
def get_node_lineage(node):
diff --git a/website/notifications/events/files.py b/website/notifications/events/files.py
index fdaabad0426..db8a9c91fdc 100644
--- a/website/notifications/events/files.py
+++ b/website/notifications/events/files.py
@@ -238,8 +238,13 @@ def perform(self):
return
# File
if self.payload['destination']['kind'] != 'folder':
- moved, warn, rm_users = event_utils.categorize_users(self.user, self.event_type, self.source_node,
- self.event_type, self.node)
+ moved, warn, rm_users = event_utils.categorize_users(
+ self.user,
+ self.event_type,
+ self.source_node,
+ self.event_type,
+ self.node
+ )
warn_message = f'{self.html_message} You are no longer tracking that file based on the settings you selected for the component.'
remove_message = (
f'{self.html_message} Your subscription has been removed due to '
@@ -248,11 +253,20 @@ def perform(self):
# Folder
else:
# Gets all the files in a folder to look for permissions conflicts
- files = event_utils.get_file_subs_from_folder(self.addon, self.user, self.payload['destination']['kind'],
- self.payload['destination']['path'],
- self.payload['destination']['name'])
+ files = event_utils.get_file_subs_from_folder(
+ self.addon,
+ self.user,
+ self.payload['destination']['kind'],
+ self.payload['destination']['path'],
+ self.payload['destination']['name']
+ )
# Bins users into different permissions
- moved, warn, rm_users = event_utils.compile_user_lists(files, self.user, self.source_node, self.node)
+ moved, warn, rm_users = event_utils.compile_user_lists(
+ files,
+ self.user,
+ self.source_node,
+ self.node
+ )
# For users that don't have individual file subscription but has permission on the new node
warn_message = f'{self.html_message} You are no longer tracking that folder or files within based on the settings you selected for the component.'
diff --git a/website/notifications/utils.py b/website/notifications/utils.py
index 51d487ff67a..e64d76c258f 100644
--- a/website/notifications/utils.py
+++ b/website/notifications/utils.py
@@ -1,9 +1,11 @@
import collections
from django.apps import apps
+from django.contrib.contenttypes.models import ContentType
from django.db.models import Q
from framework.postcommit_tasks.handlers import run_postcommit
+from osf.models import NotificationSubscription
from osf.utils.permissions import READ
from website.notifications import constants
from website.notifications.exceptions import InvalidSubscriptionError
@@ -144,22 +146,17 @@ def users_to_remove(source_event, source_node, new_node):
:param new_node: Node instance where a sub or new sub will be.
:return: Dict of notification type lists with user_ids
"""
- NotificationSubscriptionLegacy = apps.get_model('osf.NotificationSubscriptionLegacy')
removed_users = {key: [] for key in constants.NOTIFICATION_TYPES}
if source_node == new_node:
return removed_users
- old_sub = NotificationSubscriptionLegacy.load(to_subscription_key(source_node._id, source_event))
- old_node_sub = NotificationSubscriptionLegacy.load(to_subscription_key(source_node._id,
- '_'.join(source_event.split('_')[-2:])))
- if not old_sub and not old_node_sub:
- return removed_users
+ old_sub = NotificationSubscription.objects.get(
+ subscribed_object=source_node,
+ notification_type__name=source_event
+ )
for notification_type in constants.NOTIFICATION_TYPES:
users = []
if hasattr(old_sub, notification_type):
users += list(getattr(old_sub, notification_type).values_list('guids___id', flat=True))
- if hasattr(old_node_sub, notification_type):
- users += list(getattr(old_node_sub, notification_type).values_list('guids___id', flat=True))
- subbed, removed_users[notification_type] = separate_users(new_node, users)
return removed_users
@@ -449,7 +446,6 @@ def subscribe_user_to_notifications(node, user):
""" Update the notification settings for the creator or contributors
:param user: User to subscribe to notifications
"""
- NotificationSubscriptionLegacy = apps.get_model('osf.NotificationSubscriptionLegacy')
Preprint = apps.get_model('osf.Preprint')
DraftRegistration = apps.get_model('osf.DraftRegistration')
if isinstance(node, Preprint):
@@ -468,31 +464,19 @@ def subscribe_user_to_notifications(node, user):
raise InvalidSubscriptionError('Registrations are invalid targets for subscriptions')
events = constants.NODE_SUBSCRIPTIONS_AVAILABLE
- notification_type = 'email_transactional'
- target_id = node._id
if user.is_registered:
for event in events:
- event_id = to_subscription_key(target_id, event)
- global_event_id = to_subscription_key(user._id, 'global_' + event)
- global_subscription = NotificationSubscriptionLegacy.load(global_event_id)
-
- subscription = NotificationSubscriptionLegacy.load(event_id)
-
- # If no subscription for component and creator is the user, do not create subscription
- # If no subscription exists for the component, this means that it should adopt its
- # parent's settings
- if not (node and node.parent_node and not subscription and node.creator == user):
- if not subscription:
- subscription = NotificationSubscriptionLegacy(_id=event_id, owner=node, event_name=event)
- # Need to save here in order to access m2m fields
- subscription.save()
- if global_subscription:
- global_notification_type = get_global_notification_type(global_subscription, user)
- subscription.add_user_to_subscription(user, global_notification_type)
- else:
- subscription.add_user_to_subscription(user, notification_type)
- subscription.save()
+ subscription, _ = NotificationSubscription.objects.get_or_create(
+ user=user,
+ notification_type__name=event
+ )
+ subscription, _ = NotificationSubscription.objects.get_or_create(
+ user=user,
+ notification_type__name=event,
+ object_id=node.id,
+ content_type=ContentType.objects.get_for_model(node)
+ )
def format_user_and_project_subscriptions(user):
diff --git a/website/notifications/views.py b/website/notifications/views.py
index 1cbb62ee08d..09fb59a1260 100644
--- a/website/notifications/views.py
+++ b/website/notifications/views.py
@@ -6,8 +6,7 @@
from framework.auth.decorators import must_be_logged_in
from framework.exceptions import HTTPError
-from osf.models import AbstractNode, Registration
-from osf.models.notifications import NotificationSubscriptionLegacy
+from osf.models import AbstractNode, Registration, NotificationSubscription
from osf.utils.permissions import READ
from website.notifications import utils
from website.notifications.constants import NOTIFICATION_TYPES
@@ -69,7 +68,6 @@ def configure_subscription(auth):
f'{user!r} attempted to adopt_parent of a none node id, {target_id}'
)
raise HTTPError(http_status.HTTP_400_BAD_REQUEST)
- owner = user
else:
if not node.has_permission(user, READ):
sentry.log_message(f'{user!r} attempted to subscribe to private node, {target_id}')
@@ -81,40 +79,28 @@ def configure_subscription(auth):
)
raise HTTPError(http_status.HTTP_400_BAD_REQUEST)
- if notification_type != 'adopt_parent':
- owner = node
+ if 'file_updated' in event and len(event) > len('file_updated'):
+ pass
else:
- if 'file_updated' in event and len(event) > len('file_updated'):
- pass
- else:
- parent = node.parent_node
- if not parent:
- sentry.log_message(
- '{!r} attempted to adopt_parent of '
- 'the parentless project, {!r}'.format(user, node)
- )
- raise HTTPError(http_status.HTTP_400_BAD_REQUEST)
-
- # If adopt_parent make sure that this subscription is None for the current User
- subscription = NotificationSubscriptionLegacy.load(event_id)
- if not subscription:
- return {} # We're done here
-
- subscription.remove_user_from_subscription(user)
- return {}
-
- subscription = NotificationSubscriptionLegacy.load(event_id)
-
- if not subscription:
- subscription = NotificationSubscriptionLegacy(_id=event_id, owner=owner, event_name=event)
- subscription.save()
+ parent = node.parent_node
+ if not parent:
+ sentry.log_message(
+ '{!r} attempted to adopt_parent of '
+ 'the parentless project, {!r}'.format(user, node)
+ )
+ raise HTTPError(http_status.HTTP_400_BAD_REQUEST)
+
+ subscription, _ = NotificationSubscription.objects.get_or_create(
+ user=user,
+ subscribed_object=node,
+ notification_type__name=event
+ )
+ subscription.save()
if node and node._id not in user.notifications_configured:
user.notifications_configured[node._id] = True
user.save()
- subscription.add_user_to_subscription(user, notification_type)
-
subscription.save()
return {'message': f'Successfully subscribed to {notification_type} list on {event_id}'}
diff --git a/website/reviews/listeners.py b/website/reviews/listeners.py
index 3b6feeec3fc..616c95b4b2c 100644
--- a/website/reviews/listeners.py
+++ b/website/reviews/listeners.py
@@ -1,238 +1,53 @@
-from django.utils import timezone
-
-from osf.models import NotificationType
-from website.notifications import utils
+from django.contrib.contenttypes.models import ContentType
+from website.profile.utils import get_profile_image_url
+from osf.models import NotificationSubscription, NotificationType
+from website.settings import DOMAIN
from website.reviews import signals as reviews_signals
-from website.settings import OSF_PREPRINTS_LOGO, OSF_REGISTRIES_LOGO, DOMAIN
-
-
-@reviews_signals.reviews_email.connect
-def reviews_notification(self, creator, template, context, action):
- """
- Handle email notifications including: update comment, accept, and reject of submission, but not initial submission
- or resubmission.
- """
- # Avoid AppRegistryNotReady error
- from website.notifications.emails import notify_global_event
- recipients = list(action.target.contributors)
- time_now = action.created if action is not None else timezone.now()
- node = action.target
- notify_global_event(
- event='global_reviews',
- sender_user=creator,
- node=node,
- timestamp=time_now,
- recipients=recipients,
- template=template,
- context=context
- )
-
-
-@reviews_signals.reviews_email_submit.connect
-def reviews_submit_notification(self, recipients, context, resource, template=None):
- """
- Handle email notifications for a new submission or a resubmission
- """
- if not template:
- template = NotificationType.Type.PROVIDER_REVIEWS_SUBMISSION_CONFIRMATION
-
- # Avoid AppRegistryNotReady error
- from website.notifications.emails import get_user_subscriptions
- event_type = utils.find_subscription_type('global_reviews')
-
- provider = resource.provider
- if provider._id == 'osf':
- if provider.type == 'osf.preprintprovider':
- context['logo'] = OSF_PREPRINTS_LOGO
- elif provider.type == 'osf.registrationprovider':
- context['logo'] = OSF_REGISTRIES_LOGO
- else:
- raise NotImplementedError()
- else:
- context['logo'] = resource.provider._id
-
- for recipient in recipients:
- user_subscriptions = get_user_subscriptions(recipient, event_type)
- context['no_future_emails'] = user_subscriptions['none']
- context['is_creator'] = recipient == resource.creator
- context['provider_name'] = resource.provider.name
- NotificationType.objects.get(
- name=template
- ).emit(
- user=recipient,
- event_context=context
- )
-
-
-@reviews_signals.reviews_email_submit_moderators_notifications.connect
-def reviews_submit_notification_moderators(self, timestamp, resource, context, user):
- """
- Handle email notifications to notify moderators of new submissions or resubmission.
- """
- # imports moved here to avoid AppRegistryNotReady error
- from osf.models import NotificationSubscriptionLegacy
- from website.profile.utils import get_profile_image_url
- from website.notifications.emails import store_emails
-
- provider = resource.provider
-
- # Set submission url
- if provider.type == 'osf.preprintprovider':
- context['reviews_submission_url'] = (
- f'{DOMAIN}reviews/preprints/{provider._id}/{resource._id}'
- )
- elif provider.type == 'osf.registrationprovider':
- context['reviews_submission_url'] = f'{DOMAIN}{resource._id}?mode=moderator'
- else:
- raise NotImplementedError(f'unsupported provider type {provider.type}')
-
- # Set url for profile image of the submitter
- context['profile_image_url'] = get_profile_image_url(user)
-
- # Set message
- revision_id = context.get('revision_id')
- if revision_id:
- context['message'] = f'submitted updates to "{resource.title}".'
- context['reviews_submission_url'] += f'&revisionId={revision_id}'
- else:
- if context.get('resubmission'):
- context['message'] = f'resubmitted "{resource.title}".'
- else:
- context['message'] = f'submitted "{resource.title}".'
-
- # Get NotificationSubscription instance, which contains reference to all subscribers
- provider_subscription, created = NotificationSubscriptionLegacy.objects.get_or_create(
- _id=f'{provider._id}_new_pending_submissions',
- provider=provider
- )
- # "transactional" subscribers receive notifications "Immediately" (i.e. at 5 minute intervals)
- # "digest" subscribers receive emails daily
- recipients_per_subscription_type = {
- 'email_transactional': list(
- provider_subscription.email_transactional.all().values_list('guids___id', flat=True)
- ),
- 'email_digest': list(
- provider_subscription.email_digest.all().values_list('guids___id', flat=True)
- )
- }
-
- for subscription_type, recipient_ids in recipients_per_subscription_type.items():
- if not recipient_ids:
- continue
-
- store_emails(
- recipient_ids,
- subscription_type,
- 'new_pending_submissions',
- user,
- resource,
- timestamp,
- abstract_provider=provider,
- **context
- )
-
-# Handle email notifications to notify moderators of new submissions.
@reviews_signals.reviews_withdraw_requests_notification_moderators.connect
def reviews_withdraw_requests_notification_moderators(self, timestamp, context, user, resource):
- # imports moved here to avoid AppRegistryNotReady error
- from osf.models import NotificationSubscriptionLegacy
- from website.profile.utils import get_profile_image_url
- from website.notifications.emails import store_emails
context['referrer_fullname'] = user.fullname
-
provider = resource.provider
- # Get NotificationSubscription instance, which contains reference to all subscribers
- provider_subscription, created = NotificationSubscriptionLegacy.objects.get_or_create(
- _id=f'{provider._id}_new_pending_withdraw_requests',
- provider=provider
+ provider_subscription, _ = NotificationSubscription.objects.get_or_create(
+ notification_type__name=NotificationType.Type.PROVIDER_NEW_PENDING_WITHDRAW_REQUESTS,
+ object_id=provider.id,
+ content_type=ContentType.objects.get_for_model(provider.__class__),
)
- # Set message
context['message'] = f'has requested withdrawal of "{resource.title}".'
- # Set url for profile image of the submitter
context['profile_image_url'] = get_profile_image_url(user)
- # Set submission url
context['reviews_submission_url'] = f'{DOMAIN}reviews/registries/{provider._id}/{resource._id}'
- email_transactional_ids = list(provider_subscription.email_transactional.all().values_list('guids___id', flat=True))
- email_digest_ids = list(provider_subscription.email_digest.all().values_list('guids___id', flat=True))
-
- # Store emails to be sent to subscribers instantly (at a 5 min interval)
- store_emails(
- email_transactional_ids,
- 'email_transactional',
- 'new_pending_withdraw_requests',
- user,
- resource,
- timestamp,
- abstract_provider=provider,
- template='new_pending_submissions',
- **context
- )
+    for recipient in provider.get_group('moderator').user_set.all():
+ NotificationType.objects.get(
+ name=NotificationType.Type.PROVIDER_NEW_PENDING_WITHDRAW_REQUESTS
+ ).emit(
+ user=recipient,
+ event_context=context,
+ )
- # Store emails to be sent to subscribers daily
- store_emails(
- email_digest_ids,
- 'email_digest',
- 'new_pending_withdraw_requests',
- user,
- resource,
- timestamp,
- abstract_provider=provider,
- template='new_pending_submissions',
- **context
- )
-# Handle email notifications to notify moderators of new withdrawal requests
@reviews_signals.reviews_email_withdrawal_requests.connect
def reviews_withdrawal_requests_notification(self, timestamp, context):
- # imports moved here to avoid AppRegistryNotReady error
- from osf.models import NotificationSubscriptionLegacy
- from website.notifications.emails import store_emails
- from website.profile.utils import get_profile_image_url
- from website import settings
-
- # Get NotificationSubscription instance, which contains reference to all subscribers
- provider_subscription = NotificationSubscriptionLegacy.load(
- '{}_new_pending_submissions'.format(context['reviewable'].provider._id))
preprint = context['reviewable']
preprint_word = preprint.provider.preprint_word
- # Set message
+ provider_subscription, _ = NotificationSubscription.objects.get_or_create(
+ notification_type__name=NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS,
+ object_id=preprint.provider.id,
+ content_type=ContentType.objects.get_for_model(preprint.provider.__class__),
+ )
+
context['message'] = f'has requested withdrawal of the {preprint_word} "{preprint.title}".'
- # Set url for profile image of the submitter
context['profile_image_url'] = get_profile_image_url(context['requester'])
- # Set submission url
- context['reviews_submission_url'] = '{}reviews/preprints/{}/{}'.format(settings.DOMAIN,
- preprint.provider._id,
- preprint._id)
-
- email_transactional_ids = list(provider_subscription.email_transactional.all().values_list('guids___id', flat=True))
- email_digest_ids = list(provider_subscription.email_digest.all().values_list('guids___id', flat=True))
-
- # Store emails to be sent to subscribers instantly (at a 5 min interval)
- store_emails(
- email_transactional_ids,
- 'email_transactional',
- 'new_pending_submissions',
- context['requester'],
- preprint,
- timestamp,
- abstract_provider=preprint.provider,
- **context
- )
+ context['reviews_submission_url'] = f'{DOMAIN}reviews/preprints/{preprint.provider._id}/{preprint._id}'
- # Store emails to be sent to subscribers daily
- store_emails(
- email_digest_ids,
- 'email_digest',
- 'new_pending_submissions',
- context['requester'],
- preprint,
- timestamp,
- abstract_provider=preprint.provider,
- **context
- )
+    for recipient in preprint.contributors.all():
+ NotificationType.objects.get(
+ name=NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
+ ).emit(
+ user=recipient,
+ event_context=context,
+ )
diff --git a/website/templates/emails/new_pending_submissions.html.mako b/website/templates/emails/new_pending_submissions.html.mako
index 067148e2437..46f6094276b 100644
--- a/website/templates/emails/new_pending_submissions.html.mako
+++ b/website/templates/emails/new_pending_submissions.html.mako
@@ -7,7 +7,7 @@
% if is_request_email:
${requester_fullname}
% else:
- ${', '.join(reviewable.contributors.values_list('fullname', flat=True))}
+ ${requester_contributor_names}
% endif
${message}
From 69b2d9024197ca6162cd5f3b8d6d99b749677b9c Mon Sep 17 00:00:00 2001
From: antkryt
Date: Thu, 24 Jul 2025 17:16:50 +0300
Subject: [PATCH 111/336] [ENG-7979] Registrations pending moderation that have
components also pending moderation do not display the children (#11222)
* show only public nodes for non-authorized users in the node queryset
* add custom filters to can_view(); include pending nodes for moderators in get_node_count()
* add test
---
api/base/views.py | 6 +--
api/nodes/serializers.py | 15 +++++++-
api/registrations/views.py | 14 +++++++
.../test_registrations_childrens_list.py | 38 +++++++++++++++++--
osf/models/node.py | 12 +++---
osf/models/provider.py | 6 +++
osf/models/registrations.py | 9 +----
osf/utils/workflows.py | 9 +++++
8 files changed, 88 insertions(+), 21 deletions(-)
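The core of this patch is the moderator-aware children queryset: RegistrationChildrenList.get_queryset builds extra filters and passes them through BaseChildrenList.get_queryset into the can_view() queryset method. A condensed sketch of that decision, with an illustrative helper name (moderation_filters); the diff builds the equivalent dict inline and forwards it via super().get_queryset(**custom_filters):

    from osf.utils.workflows import RegistrationModerationStates

    def moderation_filters(user, provider):
        # Moderators of a reviewed provider may also see children that are still
        # in a moderation state; everyone else falls back to the default
        # public/contributor visibility enforced by can_view().
        if getattr(provider, 'is_reviewed', False) and user and user.is_authenticated and provider.is_moderator(user):
            return {'moderation_state__in': RegistrationModerationStates.in_moderation_states()}
        return {}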
diff --git a/api/base/views.py b/api/base/views.py
index aed2a033e0a..e0f5873fe2b 100644
--- a/api/base/views.py
+++ b/api/base/views.py
@@ -481,7 +481,7 @@ def get_ordering(self):
return self.default_ordering
# overrides GenericAPIView
- def get_queryset(self):
+ def get_queryset(self, *args, **kwargs):
"""
Returns non-deleted children of the current resource that the user has permission to view -
Children could be public, viewable through a view-only link (if provided), or the user
@@ -494,8 +494,8 @@ def get_queryset(self):
if self.request.query_params.get('sort', None) == '_order':
# Order by the order of the node_relations
order = Case(*[When(pk=pk, then=pos) for pos, pk in enumerate(node_pks)])
- return self.get_queryset_from_request().filter(pk__in=node_pks).can_view(auth.user, auth.private_link).order_by(order)
- return self.get_queryset_from_request().filter(pk__in=node_pks).can_view(auth.user, auth.private_link)
+ return self.get_queryset_from_request().filter(pk__in=node_pks).can_view(auth.user, auth.private_link, *args, **kwargs).order_by(order)
+ return self.get_queryset_from_request().filter(pk__in=node_pks).can_view(auth.user, auth.private_link, *args, **kwargs)
class BaseContributorDetail(JSONAPIBaseView, generics.RetrieveAPIView):
diff --git a/api/nodes/serializers.py b/api/nodes/serializers.py
index 341c589d8aa..f9e0aeed7a9 100644
--- a/api/nodes/serializers.py
+++ b/api/nodes/serializers.py
@@ -681,9 +681,22 @@ def get_node_count(self, obj):
AND UG.osfuser_id = %s)
)
)
+ OR (
+ osf_abstractnode.type = 'osf.registration'
+ AND osf_abstractnode.moderation_state IN ('pending', 'pending_withdraw', 'embargo', 'pending_embargo_termination')
+ AND EXISTS (
+ SELECT 1
+ FROM auth_permission AS P2
+ INNER JOIN osf_abstractprovidergroupobjectpermission AS G2 ON (P2.id = G2.permission_id)
+ INNER JOIN osf_osfuser_groups AS UG2 ON (G2.group_id = UG2.group_id)
+ WHERE P2.codename = 'view_submissions'
+ AND G2.content_object_id = osf_abstractnode.provider_id
+ AND UG2.osfuser_id = %s
+ )
+ )
OR (osf_privatelink.key = %s AND osf_privatelink.is_deleted = FALSE)
);
- """, [obj.id, obj.id, user_id, obj.id, user_id, auth.private_key],
+ """, [obj.id, obj.id, user_id, obj.id, user_id, user_id, auth.private_key],
)
return int(cursor.fetchone()[0])
diff --git a/api/registrations/views.py b/api/registrations/views.py
index a8d10d0602b..b2026d5f4b8 100644
--- a/api/registrations/views.py
+++ b/api/registrations/views.py
@@ -407,6 +407,20 @@ class RegistrationChildrenList(BaseChildrenList, generics.ListAPIView, Registrat
model_class = Registration
+ def get_queryset(self):
+ node = self.get_node()
+ auth = get_user_auth(self.request)
+ user = auth.user
+ provider = getattr(node, 'provider', None)
+ is_moderated = getattr(provider, 'is_reviewed', False)
+ custom_filters = {}
+
+ if is_moderated and user and user.is_authenticated and provider.is_moderator(user):
+ from osf.utils.workflows import RegistrationModerationStates
+ custom_filters['moderation_state__in'] = RegistrationModerationStates.in_moderation_states()
+
+ return super().get_queryset(**custom_filters)
+
class RegistrationCitationDetail(NodeCitationDetail, RegistrationMixin):
"""The documentation for this endpoint can be found [here](https://developer.osf.io/#operation/registrations_citations_list).
diff --git a/api_tests/registrations/views/test_registrations_childrens_list.py b/api_tests/registrations/views/test_registrations_childrens_list.py
index 67ff993fa2a..8c6646bdb80 100644
--- a/api_tests/registrations/views/test_registrations_childrens_list.py
+++ b/api_tests/registrations/views/test_registrations_childrens_list.py
@@ -5,9 +5,11 @@
NodeFactory,
ProjectFactory,
RegistrationFactory,
+ RegistrationProviderFactory,
AuthUserFactory,
PrivateLinkFactory,
)
+from osf.utils.workflows import RegistrationModerationStates
@pytest.fixture()
@@ -69,15 +71,13 @@ def test_registrations_children_list(self, user, app, registration_with_children
assert component_two._id in ids
def test_return_registrations_list_no_auth_approved(self, user, app, registration_with_children_approved, registration_with_children_approved_url):
- component_one, component_two, component_three, component_four = registration_with_children_approved.nodes
-
res = app.get(registration_with_children_approved_url)
ids = [node['id'] for node in res.json['data']]
assert res.status_code == 200
assert res.content_type == 'application/vnd.api+json'
- assert component_one._id in ids
- assert component_two._id in ids
+ for component in registration_with_children_approved.nodes:
+ assert component._id in ids
def test_registrations_list_no_auth_unapproved(self, user, app, registration_with_children, registration_with_children_url):
res = app.get(registration_with_children_url, expect_errors=True)
@@ -138,6 +138,36 @@ def test_registration_children_no_auth_vol(self, user, app, registration_with_ch
res = app.get(view_only_link_url, expect_errors=True)
assert res.status_code == 401
+ def test_registration_children_count_and_visibility_for_moderator(self, app, user):
+ non_contrib_moderator = AuthUserFactory()
+
+ # Setup provider and assign moderator permission
+ provider = RegistrationProviderFactory(reviews_workflow='pre-moderation')
+ provider.add_to_group(non_contrib_moderator, 'admin')
+ provider.save()
+
+ project = ProjectFactory(creator=user)
+ child = NodeFactory(parent=project, creator=user)
+
+ registration = RegistrationFactory(project=project, provider=provider)
+ registration.moderation_state = RegistrationModerationStates.PENDING.db_name
+ registration.save()
+
+ pending_child = RegistrationFactory(project=child, parent=registration, provider=provider)
+ pending_child.moderation_state = RegistrationModerationStates.PENDING.db_name
+ pending_child.save()
+
+ url = f'/v2/registrations/{registration._id}/children/'
+
+ res = app.get(url, auth=non_contrib_moderator.auth)
+ ids = [node['id'] for node in res.json['data']]
+ assert pending_child._id in ids
+
+ # Count should be 1
+ node_url = f'/v2/registrations/{registration._id}/?related_counts=children'
+ res = app.get(node_url, auth=non_contrib_moderator.auth)
+ assert res.json['data']['relationships']['children']['links']['related']['meta']['count'] == 1
+
@pytest.mark.django_db
class TestRegistrationChildrenListFiltering:
diff --git a/osf/models/node.py b/osf/models/node.py
index 34fa14f1f03..51fc26af43a 100644
--- a/osf/models/node.py
+++ b/osf/models/node.py
@@ -145,9 +145,7 @@ def get_children(self, root, active=False, include_root=False):
row.append(root.pk)
return AbstractNode.objects.filter(id__in=row)
- def can_view(self, user=None, private_link=None):
- qs = self.filter(is_public=True)
-
+ def can_view(self, user=None, private_link=None, **custom_filters):
if private_link is not None:
if isinstance(private_link, PrivateLink):
private_link = private_link.key
@@ -157,9 +155,12 @@ def can_view(self, user=None, private_link=None):
return self.filter(private_links__is_deleted=False, private_links__key=private_link).filter(
is_deleted=False)
+ # By default, only public nodes are shown. However, custom filters can be provided.
+ # This is useful when you want to display a specific subset of nodes unrelated to
+ # the current user (e.g. only `pending` nodes for moderators).
+ qs = self.filter(is_public=True) if not custom_filters else self.filter(**custom_filters)
if user is not None and not isinstance(user, AnonymousUser):
- read_user_query = get_objects_for_user(user, READ_NODE, self, with_superuser=False)
- qs |= read_user_query
+ qs |= get_objects_for_user(user, READ_NODE, self, with_superuser=False)
qs |= self.extra(where=["""
"osf_abstractnode".id in (
WITH RECURSIVE implicit_read AS (
@@ -179,6 +180,7 @@ def can_view(self, user=None, private_link=None):
) SELECT * FROM implicit_read
)
"""], params=(user.id,))
+
return qs.filter(is_deleted=False)
diff --git a/osf/models/provider.py b/osf/models/provider.py
index c78e2f52c94..aee5ae8fa56 100644
--- a/osf/models/provider.py
+++ b/osf/models/provider.py
@@ -352,6 +352,12 @@ def validate_schema(self, schema):
if not self.schemas.filter(id=schema.id).exists():
raise ValidationError('Invalid schema for provider.')
+ def is_moderator(self, user):
+ """Return True if the user is a moderator for this provider"""
+ if user and user.is_authenticated:
+ return user.has_perm('osf.view_submissions', self)
+ return False
+
class PreprintProvider(AbstractProvider):
"""
diff --git a/osf/models/registrations.py b/osf/models/registrations.py
index 3d3e967be30..d74260358f4 100644
--- a/osf/models/registrations.py
+++ b/osf/models/registrations.py
@@ -452,14 +452,7 @@ def can_view(self, auth):
if not auth or not auth.user or not self.is_moderated:
return False
- moderator_viewable_states = {
- RegistrationModerationStates.PENDING.db_name,
- RegistrationModerationStates.PENDING_WITHDRAW.db_name,
- RegistrationModerationStates.EMBARGO.db_name,
- RegistrationModerationStates.PENDING_EMBARGO_TERMINATION.db_name,
- }
- user_is_moderator = auth.user.has_perm('view_submissions', self.provider)
- if self.moderation_state in moderator_viewable_states and user_is_moderator:
+ if self.moderation_state in RegistrationModerationStates.in_moderation_states() and self.provider.is_moderator(auth.user):
return True
return False
diff --git a/osf/utils/workflows.py b/osf/utils/workflows.py
index b054de25452..f562ff0aab3 100644
--- a/osf/utils/workflows.py
+++ b/osf/utils/workflows.py
@@ -121,6 +121,15 @@ def from_sanction(cls, sanction):
return new_state
+ @classmethod
+ def in_moderation_states(cls):
+ return [
+ cls.PENDING.db_name,
+ cls.EMBARGO.db_name,
+ cls.PENDING_EMBARGO_TERMINATION.db_name,
+ cls.PENDING_WITHDRAW.db_name,
+ ]
+
class RegistrationModerationTriggers(ModerationEnum):
'''The acceptable 'triggers' to describe a moderated action on a Registration.'''
From 0139a5f0b794394d9a131ce36fc291f4fbb7c807 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Thu, 24 Jul 2025 10:17:12 -0400
Subject: [PATCH 112/336] update withdraw request declined notification
---
osf/utils/notifications.py | 17 +++++++++--------
website/reviews/listeners.py | 2 +-
.../withdrawal_request_declined.html.mako | 8 ++++----
3 files changed, 14 insertions(+), 13 deletions(-)
diff --git a/osf/utils/notifications.py b/osf/utils/notifications.py
index b85db6532ac..910421ab476 100644
--- a/osf/utils/notifications.py
+++ b/osf/utils/notifications.py
@@ -1,7 +1,6 @@
from django.utils import timezone
from osf.models.notification_type import NotificationType
-from website.mails import mails
from website.reviews import signals as reviews_signals
from website.settings import DOMAIN, OSF_SUPPORT_EMAIL, OSF_CONTACT_EMAIL
from osf.utils.workflows import RegistrationModerationTriggers
@@ -104,17 +103,19 @@ def notify_reject_withdraw_request(resource, action, *args, **kwargs):
context['requester_fullname'] = action.creator.fullname
for contributor in resource.contributors.all():
- context['contributor'] = contributor
+ context['contributor_fullname'] = contributor.fullname
context['requester_fullname'] = action.creator.fullname
context['is_requester'] = action.creator == contributor
-
- mails.send_mail(
- contributor.username,
- mails.WITHDRAWAL_REQUEST_DECLINED,
- **context
+ NotificationType.objects.get(
+ name=NotificationType.Type.PREPRINT_REQUEST_WITHDRAWAL_DECLINED
+ ).emit(
+ user=contributor,
+ event_context={
+ 'is_requester': contributor,
+ **context
+ },
)
-
def notify_moderator_registration_requests_withdrawal(resource, user, *args, **kwargs):
context = get_email_template_context(resource)
reviews_signals.reviews_withdraw_requests_notification_moderators.send(
diff --git a/website/reviews/listeners.py b/website/reviews/listeners.py
index 616c95b4b2c..52caa5fb3b0 100644
--- a/website/reviews/listeners.py
+++ b/website/reviews/listeners.py
@@ -11,7 +11,7 @@ def reviews_withdraw_requests_notification_moderators(self, timestamp, context,
provider = resource.provider
provider_subscription, _ = NotificationSubscription.objects.get_or_create(
- notification_type__name=NotificationType.Type.PROVIDER_NEW_PENDING_WITHDRAW_REQUESTS,
+ notification_type__name=NotificationType.Type.PROVIDER_REVIEWS_WITHDRAWAL_REQUESTED,
object_id=provider.id,
content_type=ContentType.objects.get_for_model(provider.__class__),
)
diff --git a/website/templates/emails/withdrawal_request_declined.html.mako b/website/templates/emails/withdrawal_request_declined.html.mako
index b24ddd861a1..4e63eed1b22 100644
--- a/website/templates/emails/withdrawal_request_declined.html.mako
+++ b/website/templates/emails/withdrawal_request_declined.html.mako
@@ -7,9 +7,9 @@
from website import settings
%>
% if document_type == 'registration':
- Dear ${contributor.fullname},
+ Dear ${contributor_fullname},
- Your request to withdraw your registration "${reviewable_title}" from ${reviewable.provider.name} has been declined by the service moderators. The registration is still publicly available on ${reviewable.provider.name}.
+ Your request to withdraw your registration "${reviewable_title}" from ${reviewable_provider_name} has been declined by the service moderators. The registration is still publicly available on ${reviewable_provider_name}.
% if notify_comment:
The moderator has provided the following comment:
@@ -18,10 +18,10 @@
% else:
Dear ${requester_fullname},
- Your request to withdraw your ${document_type} "${reviewable_title}" from ${reviewable.provider.name} has been declined by the service moderators. Login and visit your ${document_type} to view their feedback. The ${document_type} is still publicly available on ${reviewable.provider.name}.
+ Your request to withdraw your ${document_type} "${reviewable_title}" from ${reviewable_provider_name} has been declined by the service moderators. Login and visit your ${document_type} to view their feedback. The ${document_type} is still publicly available on ${reviewable_provider_name}.
% endif
Sincerely,
- The ${reviewable.provider.name} and OSF Teams
+ The ${reviewable_provider_name} and OSF Teams
%def>
From a8084f1a02a4d1cca6d619abff14c759c0080e7c Mon Sep 17 00:00:00 2001
From: ihorsokhanexoft
Date: Thu, 24 Jul 2025 20:35:35 +0300
Subject: [PATCH 113/336] added academiaInstitution in social-schema, fixed
True value of 'ongoing', fixed/added tests (#11239)
## Purpose
The V2 API doesn't allow setting a `True` value for the `ongoing` property in the employment/education tabs, nor setting the `academiaInstitution` property in the social tab.
## Changes
Added the `academiaInstitution` field to social-schema.
Fixed the required-property rules for the `ongoing` property, which were previously ignored, in the employment/education schema files.
Added new tests and fixed the old ones.
## QA Notes
Can be tested only via API
Updates can be viewed in user settings
## Ticket
https://openscience.atlassian.net/browse/ENG-8455
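Illustrative only (not the verbatim schema): the boolean `oneOf` rule is replaced with an `if`/`then`/`else` rule, so `ongoing: true` requires `startYear` and forbids `endYear`, while `ongoing: false` requires both. A trimmed-down version can be exercised with the third-party Python `jsonschema` package, assuming a draft-07-capable validator:
```python
from jsonschema import Draft7Validator  # third-party: pip install jsonschema

# Simplified employment/education entry schema mirroring the new `ongoing` rule.
schema = {
    "type": "object",
    "properties": {
        "startYear": {"type": "integer", "minimum": 1900},
        "endYear": {"type": "integer", "minimum": 1900},
        "ongoing": {"type": "boolean"},
    },
    "if": {"properties": {"ongoing": {"const": False}}},
    "then": {"required": ["startYear", "endYear"]},
    "else": {"required": ["startYear"], "not": {"required": ["endYear"]}},
}

validator = Draft7Validator(schema)
assert validator.is_valid({"ongoing": True, "startYear": 2020})                       # now accepted
assert not validator.is_valid({"ongoing": True, "startYear": 2020, "endYear": 2021})  # no endYear while ongoing
assert not validator.is_valid({"ongoing": False, "startYear": 2020})                  # endYear required when finished
assert validator.is_valid({"ongoing": False, "startYear": 2019, "endYear": 2021})
```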
---
api/base/schemas/education-schema.json | 32 ++++++++------
api/base/schemas/employment-schema.json | 32 ++++++++------
api/base/schemas/social-schema.json | 4 ++
api_tests/users/views/test_user_detail.py | 54 +++++++++++++++--------
4 files changed, 78 insertions(+), 44 deletions(-)
diff --git a/api/base/schemas/education-schema.json b/api/base/schemas/education-schema.json
index 3dda2a3481f..f917e411629 100644
--- a/api/base/schemas/education-schema.json
+++ b/api/base/schemas/education-schema.json
@@ -25,18 +25,6 @@
"minimum": 1900
},
"ongoing": {
- "oneOf": [{
- "enum": [false],
- "required": ["startYear", "endYear"]
- },
- {
- "enum": [true],
- "required": ["startYear"],
- "not": {
- "required": ["endYear"]
- }
- }
- ],
"type": "boolean"
},
"department": {
@@ -56,7 +44,25 @@
"startMonth": ["startYear"],
"startYear": ["ongoing"],
"endYear": ["ongoing"],
- "endYear": ["startYear"]
+ "endYear": ["startYear"],
+ "ongoing": {
+ "if": {
+ "properties": {
+ "ongoing": {
+ "const": false
+ }
+ }
+ },
+ "then": {
+ "required": ["startYear", "endYear"]
+ },
+ "else": {
+ "required": ["startYear"],
+ "not": {
+ "required": ["endYear"]
+ }
+ }
+ }
}
}
}
\ No newline at end of file
diff --git a/api/base/schemas/employment-schema.json b/api/base/schemas/employment-schema.json
index f2e77fb5096..619e59b726f 100644
--- a/api/base/schemas/employment-schema.json
+++ b/api/base/schemas/employment-schema.json
@@ -25,18 +25,6 @@
"minimum": 1900
},
"ongoing": {
- "oneOf": [{
- "enum": [false],
- "required": ["startYear", "endYear"]
- },
- {
- "enum": [true],
- "required": ["startYear"],
- "not": {
- "required": ["endYear"]
- }
- }
- ],
"type": "boolean"
},
"department": {
@@ -56,7 +44,25 @@
"startMonth": ["startYear"],
"startYear": ["ongoing"],
"endYear": ["ongoing"],
- "endYear": ["startYear"]
+ "endYear": ["startYear"],
+ "ongoing": {
+ "if": {
+ "properties": {
+ "ongoing": {
+ "const": false
+ }
+ }
+ },
+ "then": {
+ "required": ["startYear", "endYear"]
+ },
+ "else": {
+ "required": ["startYear"],
+ "not": {
+ "required": ["endYear"]
+ }
+ }
+ }
}
}
}
\ No newline at end of file
diff --git a/api/base/schemas/social-schema.json b/api/base/schemas/social-schema.json
index 2e520c40a76..97b9360698d 100644
--- a/api/base/schemas/social-schema.json
+++ b/api/base/schemas/social-schema.json
@@ -64,6 +64,10 @@
"description": "The academiaProfileID for the given user",
"type": "string"
},
+ "academiaInstitution": {
+ "description": "The academiaInstitution for the given user",
+ "type": "string"
+ },
"orcid": {
"description": "The orcid for the given user",
"type": "string"
diff --git a/api_tests/users/views/test_user_detail.py b/api_tests/users/views/test_user_detail.py
index 02a616bc4c4..cdfc5599ddd 100644
--- a/api_tests/users/views/test_user_detail.py
+++ b/api_tests/users/views/test_user_detail.py
@@ -935,7 +935,8 @@ def test_patch_all_social_fields(self, app, user_one, url_user_one, mock_spam_he
'impactStory': 'why not',
'orcid': 'ork-id',
'researchGate': 'Why are there so many of these',
- 'researcherId': 'ok-lastone'
+ 'researcherId': 'ok-lastone',
+ 'academiaInstitution': 'Center for Open Science'
}
fake_fields = {
@@ -1354,30 +1355,47 @@ def test_user_put_profile_date_validate_end_date(self, app, user_one, user_one_u
assert res.status_code == 400
assert res.json['errors'][0]['detail'] == 'End date must be greater than or equal to the start date.'
- def test_user_put_profile_date_validate_end_month_dependency(self, app, user_one, user_one_url, end_month_dependency_payload):
- # No endMonth with endYear
- res = app.put_json_api(user_one_url, end_month_dependency_payload, auth=user_one.auth, expect_errors=True)
+ def test_user_put_profile_date_validate_end_month_dependency_ongoing(self, app, user_one, user_attr, user_one_url, start_dates_no_end_dates_payload, request_key):
+        # endMonth is set without endYear while ongoing is True
+ start_dates_no_end_dates_payload['data']['attributes'][request_key][0]['ongoing'] = True
+ start_dates_no_end_dates_payload['data']['attributes'][request_key][0]['endMonth'] = 3
+
+ res = app.put_json_api(user_one_url, start_dates_no_end_dates_payload, auth=user_one.auth, expect_errors=True)
+ user_one.reload()
assert res.status_code == 400
assert res.json['errors'][0]['detail'] == "'endYear' is a dependency of 'endMonth'"
- def test_user_put_profile_date_validate_start_month_dependency(self, app, user_one, user_one_url, start_month_dependency_payload):
- # No endMonth with endYear
- res = app.put_json_api(user_one_url, start_month_dependency_payload, auth=user_one.auth, expect_errors=True)
- assert res.status_code == 400
- assert res.json['errors'][0]['detail'] == "'startYear' is a dependency of 'startMonth'"
+ def test_false_ongoing_without_start_date_should_fail(self, app, request_payload, user_one_url, user_one, request_key, user_attr):
+ request_payload['data']['attributes'][request_key][0].pop('startYear')
+ res = app.put_json_api(user_one_url, request_payload, auth=user_one.auth, expect_errors=True)
+ user_one.reload()
+ assert res.json['errors'][0]['detail'] == "'startYear' is a required property"
+ assert not getattr(user_one, user_attr)
- def test_user_put_profile_date_validate_start_date_no_end_date_not_ongoing(self, app, user_one, user_attr, user_one_url, start_dates_no_end_dates_payload, request_key):
- # End date is greater then start date
- res = app.put_json_api(user_one_url, start_dates_no_end_dates_payload, auth=user_one.auth, expect_errors=True)
+ def test_false_ongoing_without_end_date_should_fail(self, app, request_payload, user_one_url, user_one, request_key, user_attr):
+ request_payload['data']['attributes'][request_key][0].pop('endYear')
+ res = app.put_json_api(user_one_url, request_payload, auth=user_one.auth, expect_errors=True)
user_one.reload()
- assert res.status_code == 400
+ assert res.json['errors'][0]['detail'] == "'endYear' is a required property"
+ assert not getattr(user_one, user_attr)
- def test_user_put_profile_date_validate_end_date_no_start_date(self, app, user_one, user_attr, user_one_url, end_dates_no_start_dates_payload, request_key):
- # End dates, but no start dates
- res = app.put_json_api(user_one_url, end_dates_no_start_dates_payload, auth=user_one.auth, expect_errors=True)
+ def test_true_ongoing_without_start_date_should_fail(self, app, request_payload, user_one_url, user_one, request_key, user_attr):
+ request_payload['data']['attributes'][request_key][0].pop('startYear')
+ res = app.put_json_api(user_one_url, request_payload, auth=user_one.auth, expect_errors=True)
user_one.reload()
- assert res.status_code == 400
- assert res.json['errors'][0]['detail'] == "'startYear' is a dependency of 'endYear'"
+ assert res.json['errors'][0]['detail'] == "'startYear' is a required property"
+ assert not getattr(user_one, user_attr)
+
+ def test_true_ongoing_without_end_date_should_succeed(self, app, request_payload, user_one_url, user_one, request_key, user_attr):
+ request_payload['data']['attributes'][request_key][0]['ongoing'] = True
+ request_payload['data']['attributes'][request_key][0].pop('endYear')
+ # to avoid dependency error
+ request_payload['data']['attributes'][request_key][0].pop('endMonth')
+
+ res = app.put_json_api(user_one_url, request_payload, auth=user_one.auth, expect_errors=True)
+ user_one.reload()
+ assert res.status_code == 200
+ assert getattr(user_one, user_attr)[0]['startYear'] == request_payload['data']['attributes'][request_key][0]['startYear']
@pytest.mark.django_db
From b21bf71a46d04065386e784df6345dead77faccc Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Thu, 24 Jul 2025 14:32:42 -0400
Subject: [PATCH 114/336] update registration detail
---
.../registrations/views/test_registration_detail.py | 13 ++++++-------
1 file changed, 6 insertions(+), 7 deletions(-)
diff --git a/api_tests/registrations/views/test_registration_detail.py b/api_tests/registrations/views/test_registration_detail.py
index 39348e1f3c4..68222090042 100644
--- a/api_tests/registrations/views/test_registration_detail.py
+++ b/api_tests/registrations/views/test_registration_detail.py
@@ -10,10 +10,8 @@
from api_tests.subjects.mixins import UpdateSubjectsMixin
from osf.utils import permissions
from osf.utils.workflows import ApprovalStates
-from osf.models import Registration, NodeLog, NodeLicense, SchemaResponse
+from osf.models import Registration, NodeLog, NodeLicense, SchemaResponse, NotificationType
from framework.auth import Auth
-from website.project.signals import contributor_added
-from api_tests.utils import disconnected_from_listeners
from api.registrations.serializers import RegistrationSerializer, RegistrationDetailSerializer
from addons.wiki.tests.factories import WikiFactory, WikiVersionFactory
from osf.migrations import update_provider_auth_groups
@@ -32,7 +30,7 @@
from osf_tests.utils import get_default_test_schema
from api_tests.nodes.views.test_node_detail_license import TestNodeUpdateLicense
-from tests.utils import assert_latest_log
+from tests.utils import assert_latest_log, capture_notifications
from api_tests.utils import create_test_file
@@ -786,9 +784,9 @@ def test_initiate_withdrawal_with_embargo_ends_embargo(
assert not public_registration.is_pending_embargo
def test_withdraw_request_does_not_send_email_to_unregistered_admins(
- self, mock_notification_send, app, user, public_registration, public_url, public_payload):
+ self, app, user, public_registration, public_url, public_payload):
unreg = UnregUserFactory()
- with disconnected_from_listeners(contributor_added):
+ with capture_notifications() as notifications:
public_registration.add_unregistered_contributor(
unreg.fullname,
unreg.email,
@@ -802,7 +800,8 @@ def test_withdraw_request_does_not_send_email_to_unregistered_admins(
# Only the creator gets an email; the unreg user does not get emailed
assert public_registration._contributors.count() == 2
- assert mock_notification_send.call_count == 3
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
@pytest.mark.django_db
From 7e96d6f7349905e9879aaec31dbe4b477efb2907 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Fri, 25 Jul 2025 09:21:54 -0400
Subject: [PATCH 115/336] fix invite and institutional admin contributor tests
---
osf/models/notification_type.py | 1 +
osf/utils/notifications.py | 7 +---
.../test_institutional_admin_contributors.py | 6 +--
tests/test_adding_contributor_views.py | 40 ++++++++++---------
website/reviews/listeners.py | 2 +-
5 files changed, 29 insertions(+), 27 deletions(-)
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 19fee3e10e8..1944ba8f923 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -111,6 +111,7 @@ class Type(str, Enum):
# Provider notifications
PROVIDER_NEW_PENDING_SUBMISSIONS = 'provider_new_pending_submissions'
+ PROVIDER_NEW_PENDING_WITHDRAW_REQUESTS = 'provider_new_pending_withdraw_requests'
PROVIDER_REVIEWS_SUBMISSION_CONFIRMATION = 'provider_reviews_submission_confirmation'
PROVIDER_REVIEWS_MODERATOR_SUBMISSION_CONFIRMATION = 'provider_reviews_moderator_submission_confirmation'
PROVIDER_REVIEWS_WITHDRAWAL_REQUESTED = 'preprint_request_withdrawal_requested'
diff --git a/osf/utils/notifications.py b/osf/utils/notifications.py
index 910421ab476..8e432af12a5 100644
--- a/osf/utils/notifications.py
+++ b/osf/utils/notifications.py
@@ -135,14 +135,11 @@ def notify_withdraw_registration(resource, action, *args, **kwargs):
context['notify_comment'] = not resource.provider.reviews_comments_private and action.comment
for contributor in resource.contributors.all():
- context['contributor'] = contributor
+ context['contributor_fullname'] = contributor.fullname
context['is_requester'] = resource.retraction.initiated_by == contributor
NotificationType.objects.get(
name=NotificationType.Type.PREPRINT_REQUEST_WITHDRAWAL_APPROVED
).emit(
user=contributor,
- event_context={
- 'is_requester': contributor,
-
- },
+ event_context=context
)
diff --git a/osf_tests/test_institutional_admin_contributors.py b/osf_tests/test_institutional_admin_contributors.py
index 93ba0ac1305..62d4205eeb2 100644
--- a/osf_tests/test_institutional_admin_contributors.py
+++ b/osf_tests/test_institutional_admin_contributors.py
@@ -142,7 +142,7 @@ def test_requested_permissions_or_default(self, app, project, institutional_admi
auth=mock.ANY,
permissions=permissions.ADMIN, # `requested_permissions` should take precedence
visible=True,
- send_email='access_request',
+ send_email='access',
make_curator=False,
)
@@ -168,7 +168,7 @@ def test_permissions_override_requested_permissions(self, app, project, institut
auth=mock.ANY,
permissions=permissions.ADMIN, # `requested_permissions` should take precedence
visible=True,
- send_email='access_request',
+ send_email='access',
make_curator=False,
)
@@ -194,6 +194,6 @@ def test_requested_permissions_is_used(self, app, project, institutional_admin):
auth=mock.ANY,
permissions=permissions.ADMIN, # `requested_permissions` should take precedence
visible=True,
- send_email='access_request',
+ send_email='access',
make_curator=False,
)
diff --git a/tests/test_adding_contributor_views.py b/tests/test_adding_contributor_views.py
index 5825a0b42b5..0d67e246010 100644
--- a/tests/test_adding_contributor_views.py
+++ b/tests/test_adding_contributor_views.py
@@ -1,8 +1,6 @@
-
from unittest.mock import ANY
import time
-from http.cookies import SimpleCookie
from unittest import mock
import pytest
@@ -197,10 +195,9 @@ def test_add_contributors_post_only_sends_one_email_to_unreg_user(self, mock_sen
# send request
url = self.project.api_url_for('project_contributors_post')
assert self.project.can_edit(user=self.creator)
- self.app.post(url, json=payload, auth=self.creator.auth)
-
- # finalize_invitation should only have been called once
- assert mock_send_claim_email.call_count == 1
+        with capture_notifications() as notifications:
+            self.app.post(url, json=payload, auth=self.creator.auth)
+        assert len(notifications) == 1
def test_add_contributors_post_only_sends_one_email_to_registered_user(self):
# Project has components
@@ -506,22 +503,28 @@ def test_send_claim_email_to_given_email(self):
auth=Auth(project.creator),
)
project.save()
- send_claim_email(email=given_email, unclaimed_user=unreg_user, node=project)
+ with capture_notifications() as notifications:
+ send_claim_email(email=given_email, unclaimed_user=unreg_user, node=project)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_INVITE_DEFAULT
- self.mock_notification_send.assert_called()
def test_send_claim_email_to_referrer(self):
project = ProjectFactory()
referrer = project.creator
given_email, real_email = fake_email(), fake_email()
- unreg_user = project.add_unregistered_contributor(fullname=fake.name(),
- email=given_email, auth=Auth(
- referrer)
- )
+ unreg_user = project.add_unregistered_contributor(
+ fullname=fake.name(),
+ email=given_email,
+ auth=Auth(referrer)
+ )
project.save()
- send_claim_email(email=real_email, unclaimed_user=unreg_user, node=project)
+ with capture_notifications() as notifications:
+ send_claim_email(email=real_email, unclaimed_user=unreg_user, node=project)
- assert self.mock_notification_send.called
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.USER_PENDING_VERIFICATION
+ assert notifications[1]['type'] == NotificationType.Type.USER_FORWARD_INVITE
def test_send_claim_email_before_throttle_expires(self):
project = ProjectFactory()
@@ -533,10 +536,11 @@ def test_send_claim_email_before_throttle_expires(self):
)
project.save()
send_claim_email(email=fake_email(), unclaimed_user=unreg_user, node=project)
- self.mock_notification_send.reset_mock()
# 2nd call raises error because throttle hasn't expired
- with pytest.raises(HTTPError):
- send_claim_email(email=fake_email(), unclaimed_user=unreg_user, node=project)
- assert not self.mock_notification_send.called
+
+ with capture_notifications() as notifications:
+ with pytest.raises(HTTPError):
+ send_claim_email(email=fake_email(), unclaimed_user=unreg_user, node=project)
+ assert not notifications
diff --git a/website/reviews/listeners.py b/website/reviews/listeners.py
index 52caa5fb3b0..a48d601e071 100644
--- a/website/reviews/listeners.py
+++ b/website/reviews/listeners.py
@@ -20,7 +20,7 @@ def reviews_withdraw_requests_notification_moderators(self, timestamp, context,
context['profile_image_url'] = get_profile_image_url(user)
context['reviews_submission_url'] = f'{DOMAIN}reviews/registries/{provider._id}/{resource._id}'
- for recipient in provider_subscription.preorint.moderators.all():
+ for recipient in provider_subscription.subscribed_object.get_group('moderator').user_set.all():
NotificationType.objects.get(
name=NotificationType.Type.PROVIDER_NEW_PENDING_WITHDRAW_REQUESTS
).emit(
From 7dc8375a91a0e46d2548b52363ba0bd8ea6d4f10 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Fri, 25 Jul 2025 12:11:14 -0400
Subject: [PATCH 116/336] clean up test_addons code
---
scripts/add_global_subscriptions.py | 60 ------------
tests/test_adding_contributor_views.py | 126 ++++++++++++++-----------
website/notifications/emails.py | 45 +++------
website/notifications/events/files.py | 2 +-
website/project/views/contributor.py | 6 +-
5 files changed, 87 insertions(+), 152 deletions(-)
delete mode 100644 scripts/add_global_subscriptions.py
diff --git a/scripts/add_global_subscriptions.py b/scripts/add_global_subscriptions.py
deleted file mode 100644
index 52746875d79..00000000000
--- a/scripts/add_global_subscriptions.py
+++ /dev/null
@@ -1,60 +0,0 @@
-"""
-This migration subscribes each user to USER_SUBSCRIPTIONS_AVAILABLE if a subscription
-does not already exist.
-"""
-
-import logging
-import sys
-
-from osf.models.notifications import NotificationSubscriptionLegacy
-from website.app import setup_django
-setup_django()
-
-from django.apps import apps
-from django.db import transaction
-from website.app import init_app
-from website.notifications import constants
-from website.notifications.utils import to_subscription_key
-
-from scripts import utils as scripts_utils
-
-logger = logging.getLogger(__name__)
-
-def add_global_subscriptions(dry=True):
- OSFUser = apps.get_model('osf.OSFUser')
- notification_type = 'email_transactional'
- user_events = constants.USER_SUBSCRIPTIONS_AVAILABLE
-
- count = 0
-
- with transaction.atomic():
- for user in OSFUser.objects.filter(is_registered=True, date_confirmed__isnull=False):
- changed = False
- if not user.is_active:
- continue
- for user_event in user_events:
- user_event_id = to_subscription_key(user._id, user_event)
-
- subscription = NotificationSubscriptionLegacy.load(user_event_id)
- if not subscription:
- logger.info(f'No {user_event} subscription found for user {user._id}. Subscribing...')
- subscription = NotificationSubscriptionLegacy(_id=user_event_id, owner=user, event_name=user_event)
- subscription.save() # Need to save in order to access m2m fields
- subscription.add_user_to_subscription(user, notification_type)
- subscription.save()
- changed = True
- else:
- logger.info(f'User {user._id} already has a {user_event} subscription')
- if changed:
- count += 1
-
- logger.info(f'Added subscriptions for {count} users')
- if dry:
- raise RuntimeError('Dry mode -- rolling back transaction')
-
-if __name__ == '__main__':
- dry = '--dry' in sys.argv
- init_app(routes=False)
- if not dry:
- scripts_utils.add_file_logger(logger, __file__)
- add_global_subscriptions(dry=dry)
diff --git a/tests/test_adding_contributor_views.py b/tests/test_adding_contributor_views.py
index 0d67e246010..30e38b3425a 100644
--- a/tests/test_adding_contributor_views.py
+++ b/tests/test_adding_contributor_views.py
@@ -1,5 +1,3 @@
-from unittest.mock import ANY
-
import time
from unittest import mock
@@ -30,13 +28,11 @@
)
from tests.utils import capture_notifications
from website.profile.utils import add_contributor_json, serialize_unregistered
-from website.project.signals import contributor_added
from website.project.views.contributor import (
deserialize_contributors,
notify_added_contributor,
send_claim_email,
)
-from conftest import start_mock_notification_send
@pytest.mark.enable_implicit_clean
class TestAddingContributorViews(OsfTestCase):
@@ -46,10 +42,6 @@ def setUp(self):
self.creator = AuthUserFactory()
self.project = ProjectFactory(creator=self.creator)
self.auth = Auth(self.project.creator)
- # Authenticate all requests
- contributor_added.connect(notify_added_contributor)
-
- self.mock_notification_send = start_mock_notification_send(self)
def test_serialize_unregistered_without_record(self):
name, email = fake.name(), fake_email()
@@ -197,7 +189,10 @@ def test_add_contributors_post_only_sends_one_email_to_unreg_user(self, mock_sen
assert self.project.can_edit(user=self.creator)
        with capture_notifications() as notifications:
            self.app.post(url, json=payload, auth=self.creator.auth)
-        assert len(notifications) == 1
+        assert len(notifications) == 3
+        assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+        assert notifications[1]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+        assert notifications[2]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
def test_add_contributors_post_only_sends_one_email_to_registered_user(self):
# Project has components
@@ -251,10 +246,14 @@ def test_add_contributors_post_sends_email_if_user_not_contributor_on_parent_nod
# send request
url = self.project.api_url_for('project_contributors_post')
assert self.project.can_edit(user=self.creator)
- self.app.post(url, json=payload, auth=self.creator.auth)
+ with capture_notifications() as notifications:
+ self.app.post(url, json=payload, auth=self.creator.auth)
# send_mail is called for both the project and the sub-component
- assert self.mock_notification_send.call_count == 2
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ assert notifications[1]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+
@mock.patch('website.project.views.contributor.send_claim_email')
def test_email_sent_when_unreg_user_is_added(self, send_mail):
@@ -272,8 +271,9 @@ def test_email_sent_when_unreg_user_is_added(self, send_mail):
'node_ids': []
}
url = self.project.api_url_for('project_contributors_post')
- self.app.post(url, json=payload, follow_redirects=True, auth=self.creator.auth)
- send_mail.assert_called_with(email, ANY,ANY,notify=True, email_template='default')
+ with capture_notifications() as notifications:
+ self.app.post(url, json=payload, follow_redirects=True, auth=self.creator.auth)
+ assert len(notifications) == 1
def test_email_sent_when_reg_user_is_added(self):
contributor = UserFactory()
@@ -283,52 +283,61 @@ def test_email_sent_when_reg_user_is_added(self):
'permissions': permissions.WRITE
}]
project = ProjectFactory(creator=self.auth.user)
- project.add_contributors(contributors, auth=self.auth)
- project.save()
- assert self.mock_notification_send.called
+ with capture_notifications() as notifications:
+ project.add_contributors(contributors, auth=self.auth)
+ project.save()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
contributor.refresh_from_db()
assert contributor.contributor_added_email_records[project._id]['last_sent'] == approx(int(time.time()), rel=1)
def test_contributor_added_email_sent_to_unreg_user(self):
unreg_user = UnregUserFactory()
project = ProjectFactory()
- project.add_unregistered_contributor(fullname=unreg_user.fullname, email=unreg_user.email, auth=Auth(project.creator))
- project.save()
- assert self.mock_notification_send.called
+ with capture_notifications() as notifications:
+ project.add_unregistered_contributor(fullname=unreg_user.fullname, email=unreg_user.email, auth=Auth(project.creator))
+ project.save()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
def test_forking_project_does_not_send_contributor_added_email(self):
project = ProjectFactory()
- project.fork_node(auth=Auth(project.creator))
- assert not self.mock_notification_send.called
+ with capture_notifications() as notifications:
+ project.fork_node(auth=Auth(project.creator))
+ assert not notifications
def test_templating_project_does_not_send_contributor_added_email(self):
project = ProjectFactory()
- project.use_as_template(auth=Auth(project.creator))
- assert not self.mock_notification_send.called
+ with capture_notifications() as notifications:
+ project.use_as_template(auth=Auth(project.creator))
+ assert not notifications
@mock.patch('website.archiver.tasks.archive')
def test_registering_project_does_not_send_contributor_added_email(self, mock_archive):
project = ProjectFactory()
provider = RegistrationProviderFactory()
- project.register_node(
- get_default_metaschema(),
- Auth(user=project.creator),
- DraftRegistrationFactory(branched_from=project),
- None,
- provider=provider
- )
- assert not self.mock_notification_send.called
+ with capture_notifications() as notifications:
+ project.register_node(
+ get_default_metaschema(),
+ Auth(user=project.creator),
+ DraftRegistrationFactory(branched_from=project),
+ None,
+ provider=provider
+ )
+ assert not notifications
def test_notify_contributor_email_does_not_send_before_throttle_expires(self):
contributor = UserFactory()
project = ProjectFactory()
auth = Auth(project.creator)
- notify_added_contributor(project, contributor, auth)
- assert self.mock_notification_send.called
+ with capture_notifications() as notifications:
+ notify_added_contributor(project, contributor, 'default', auth)
+ assert len(notifications) == 1
# 2nd call does not send email because throttle period has not expired
- notify_added_contributor(project, contributor, auth)
- assert self.mock_notification_send.call_count == 1
+ with capture_notifications() as notifications:
+ notify_added_contributor(project, contributor, 'default', auth)
+ assert not notifications
def test_notify_contributor_email_sends_after_throttle_expires(self):
throttle = 0.5
@@ -336,38 +345,45 @@ def test_notify_contributor_email_sends_after_throttle_expires(self):
contributor = UserFactory()
project = ProjectFactory()
auth = Auth(project.creator)
- notify_added_contributor(project, contributor, auth, throttle=throttle)
- assert self.mock_notification_send.called
+ with capture_notifications() as notifications:
+ notify_added_contributor(project, contributor, 'default', auth, throttle=throttle)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
time.sleep(1) # throttle period expires
- notify_added_contributor(project, contributor, auth, throttle=throttle)
- assert self.mock_notification_send.call_count == 2
+ with capture_notifications() as notifications:
+ notify_added_contributor(project, contributor, 'default', auth, throttle=throttle)
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ assert notifications[1]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
def test_add_contributor_to_fork_sends_email(self):
contributor = UserFactory()
- fork = self.project.fork_node(auth=Auth(self.creator))
- fork.add_contributor(contributor, auth=Auth(self.creator))
- fork.save()
- assert self.mock_notification_send.called
- assert self.mock_notification_send.call_count == 1
+ with capture_notifications() as notifications:
+ fork = self.project.fork_node(auth=Auth(self.creator))
+ fork.add_contributor(contributor, auth=Auth(self.creator))
+ fork.save()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
def test_add_contributor_to_template_sends_email(self):
contributor = UserFactory()
- template = self.project.use_as_template(auth=Auth(self.creator))
- template.add_contributor(contributor, auth=Auth(self.creator))
- template.save()
- assert self.mock_notification_send.called
- assert self.mock_notification_send.call_count == 1
+ with capture_notifications() as notifications:
+ template = self.project.use_as_template(auth=Auth(self.creator))
+ template.add_contributor(contributor, auth=Auth(self.creator))
+ template.save()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
def test_creating_fork_does_not_email_creator(self):
- contributor = UserFactory()
- fork = self.project.fork_node(auth=Auth(self.creator))
- assert not self.mock_notification_send.called
+ with capture_notifications() as notifications:
+ self.project.fork_node(auth=Auth(self.creator))
+ assert not notifications
def test_creating_template_does_not_email_creator(self):
- contributor = UserFactory()
- template = self.project.use_as_template(auth=Auth(self.creator))
- assert not self.mock_notification_send.called
+ with capture_notifications() as notifications:
+ self.project.use_as_template(auth=Auth(self.creator))
+ assert not notifications
def test_add_multiple_contributors_only_adds_one_log(self):
n_logs_pre = self.project.logs.count()
diff --git a/website/notifications/emails.py b/website/notifications/emails.py
index da2024e8e31..9c34867ad3a 100644
--- a/website/notifications/emails.py
+++ b/website/notifications/emails.py
@@ -1,8 +1,9 @@
from django.apps import apps
from babel import dates, core, Locale
+from django.contrib.contenttypes.models import ContentType
-from osf.models import AbstractNode, NotificationSubscription
+from osf.models import AbstractNode, NotificationSubscription, NotificationType
from osf.models.notifications import NotificationDigest
from osf.utils.permissions import ADMIN, READ
from website import mails
@@ -22,37 +23,14 @@ def notify(event, user, node, timestamp, **context):
target_user: used with comment_replies
:return: List of user ids notifications were sent to
"""
- sent_users = []
- # The user who the current comment is a reply to
- target_user = context.get('target_user', None)
- exclude = context.get('exclude', [])
- # do not notify user who initiated the emails
- exclude.append(user._id)
-
- event_type = utils.find_subscription_type(event)
- if target_user and event_type in constants.USER_SUBSCRIPTIONS_AVAILABLE:
- # global user
- subscriptions = get_user_subscriptions(target_user, event_type)
- else:
- # local project user
- subscriptions = compile_subscriptions(node, event_type, event)
-
- for notification_type in subscriptions:
- if notification_type == 'none' or not subscriptions[notification_type]:
- continue
- # Remove excluded ids from each notification type
- subscriptions[notification_type] = [guid for guid in subscriptions[notification_type] if guid not in exclude]
-
- # If target, they get a reply email and are removed from the general email
- if target_user and target_user._id in subscriptions[notification_type]:
- subscriptions[notification_type].remove(target_user._id)
- store_emails([target_user._id], notification_type, 'comment_replies', user, node, timestamp, **context)
- sent_users.append(target_user._id)
-
- if subscriptions[notification_type]:
- store_emails(subscriptions[notification_type], notification_type, event_type, user, node, timestamp, **context)
- sent_users.extend(subscriptions[notification_type])
- return sent_users
+ if event.endswith('_file_updated'):
+ NotificationType.objects.get(
+ name=NotificationType.Type.NODE_FILE_ADDED
+ ).emit(
+ user=user,
+ subscribed_object=node,
+ event_context=context
+ )
def notify_mentions(event, user, node, timestamp, **context):
OSFUser = apps.get_model('osf', 'OSFUser')
@@ -161,7 +139,8 @@ def check_node(node, event):
node_subscriptions = {key: [] for key in constants.NOTIFICATION_TYPES}
if node:
subscription = NotificationSubscription.objects.filter(
- node=node,
+ object_id=node.id,
+ content_type=ContentType.objects.get_for_model(node),
notification_type__name=event
)
for notification_type in node_subscriptions:
diff --git a/website/notifications/events/files.py b/website/notifications/events/files.py
index db8a9c91fdc..6a7c7cab3d9 100644
--- a/website/notifications/events/files.py
+++ b/website/notifications/events/files.py
@@ -68,7 +68,7 @@ def text_message(self):
@property
def event_type(self):
"""Most basic event type."""
- return 'file_updated'
+ return 'node_file_updated'
@property
def waterbutler_id(self):
diff --git a/website/project/views/contributor.py b/website/project/views/contributor.py
index ea4ec0f67be..0800afaf8ca 100644
--- a/website/project/views/contributor.py
+++ b/website/project/views/contributor.py
@@ -588,7 +588,7 @@ def check_email_throttle(node, contributor, throttle=None):
try:
notification_type = NotificationType.objects.get(
- name=NotificationType.Type.NODE_COMMENT.value # or whatever event type you're using for 'contributor added'
+ name=NotificationType.Type.NODE_COMMENT.value
)
except NotificationType.DoesNotExist:
return False # Fail-safe: if the notification type isn't set up, don't throttle
@@ -600,7 +600,7 @@ def check_email_throttle(node, contributor, throttle=None):
user=contributor,
notification_type=notification_type,
content_type=ContentType.objects.get_for_model(node),
- object_id=str(node.id)
+ object_id=node.id
).first()
if not subscription:
@@ -619,7 +619,7 @@ def check_email_throttle(node, contributor, throttle=None):
return False # No previous sent notification, not throttled
@contributor_added.connect
-def notify_added_contributor(node, contributor, auth=None, email_template=None, *args, **kwargs):
+def notify_added_contributor(node, contributor, email_template, auth=None, *args, **kwargs):
"""Send a notification to a contributor who was just added to a node.
Handles:
From fb6086bc22dbd21825ec20531143c2db74259d14 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Fri, 25 Jul 2025 12:56:45 -0400
Subject: [PATCH 117/336] fix up auth_views tests
---
api/users/views.py | 2 +-
framework/auth/views.py | 15 ++++++++-------
notifications.yaml | 20 ++++++++++++--------
osf/models/notification_type.py | 2 +-
tests/test_auth_views.py | 17 +++++++++++------
tests/test_events.py | 17 +++++++++++------
website/notifications/utils.py | 9 +++++----
7 files changed, 49 insertions(+), 33 deletions(-)
diff --git a/api/users/views.py b/api/users/views.py
index 590216ade10..3c7f16e17fb 100644
--- a/api/users/views.py
+++ b/api/users/views.py
@@ -786,7 +786,7 @@ def post(self, request, *args, **kwargs):
# Don't go anywhere
return JsonResponse(
{
- 'external_id_provider': external_id_provider.name,
+ 'external_id_provider': external_id_provider,
'auth_user_fullname': fullname,
},
status=status.HTTP_200_OK,
diff --git a/framework/auth/views.py b/framework/auth/views.py
index 7e4cd6ad234..81b362532e9 100644
--- a/framework/auth/views.py
+++ b/framework/auth/views.py
@@ -841,23 +841,24 @@ def send_confirm_email(user, email, renew=False, external_id_provider=None, exte
if external_id_provider and external_id:
# First time login through external identity provider, link or create an OSF account confirmation
if user.external_identity[external_id_provider][external_id] == 'CREATE':
- notificaton_type = NotificationType.Type.USER_EXTERNAL_LOGIN_CONFIRM_EMAIL_CREATE
+ notification_type = NotificationType.Type.USER_EXTERNAL_LOGIN_CONFIRM_EMAIL_CREATE
elif user.external_identity[external_id_provider][external_id] == 'LINK':
- notificaton_type = NotificationType.Type.USER_EXTERNAL_LOGIN_CONFIRM_EMAIL_LINK
+ notification_type = NotificationType.Type.USER_EXTERNAL_LOGIN_CONFIRM_EMAIL_LINK
elif merge_target:
# Merge account confirmation
- notificaton_type = NotificationType.Type.USER_CONFIRM_MERGE
+ notification_type = NotificationType.Type.USER_CONFIRM_MERGE
elif user.is_active:
# Add email confirmation
- notificaton_type = NotificationType.Type.USER_CONFIRM_EMAIL
+ notification_type = NotificationType.Type.USER_CONFIRM_EMAIL
elif campaign:
# Account creation confirmation: from campaign
- notificaton_type = campaigns.email_template_for_campaign(campaign)
+ notification_type = campaigns.email_template_for_campaign(campaign)
else:
# Account creation confirmation: from OSF
- notificaton_type = NotificationType.Type.USER_INITIAL_CONFIRM_EMAIL
+ notification_type = NotificationType.Type.USER_INITIAL_CONFIRM_EMAIL
- NotificationType.objects.get(name=notificaton_type).emit(
+    NotificationType.objects.get(name=notification_type).emit(
user=user,
event_context={
'user_fullname': user.fullname,
diff --git a/notifications.yaml b/notifications.yaml
index c5a3d7a6cb5..8ff1a5683d8 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -1,15 +1,11 @@
# This file contains the configuration for our notification system using the NotificationType object, this is intended to
-# exist as a simple declarative list of NotificationTypes and their attributes.
+# exist as a simple declarative list of NotificationTypes and their attributes to populate the type data.
-# Workflow:
-# 1. Add a new notification template
-# 2. Add a entry here with the desired notification types
-# 3. Add name tp Enum osf.notification.NotificationType.Type
-# 4. Use the emit method to send or subscribe the notification for immediate deliver or periodic digest.
notification_types:
- #### GLOBAL (User Notifications)
+ #### User Notifications
- name: user_pending_verification_registered
- __docs__: ...
+    __docs__: This email is sent when a user requests access to a node and has confirmed their identity;
+      the `referrer` is sent an email asking them to forward the confirmation link.
object_content_type_model_name: osfuser
template: 'website/templates/emails/pending_registered.html.mako'
- name: user_pending_verification
@@ -132,6 +128,14 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/registration_bulk_upload_failure_duplicates.html.mako'
+ - name: user_external_login_email_confirm_link
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/external_confirm_link.html.mako'
+ - name: user_external_login_confirm_email_create
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/external_confirm_create.html.mako'
#### PROVIDER
- name: provider_new_pending_submissions
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 1944ba8f923..7c651c511b5 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -66,7 +66,7 @@ class Type(str, Enum):
USER_PASSWORD_RESET = 'user_password_reset'
USER_CONTRIBUTOR_ADDED_DRAFT_REGISTRATION = 'user_contributor_added_draft_registration'
USER_EXTERNAL_LOGIN_CONFIRM_EMAIL_CREATE = 'user_external_login_confirm_email_create'
- USER_EXTERNAL_LOGIN_CONFIRM_EMAIL_LINK = 'user_external_login_confirm_email_link'
+ USER_EXTERNAL_LOGIN_CONFIRM_EMAIL_LINK = 'user_external_login_email_confirm_link'
USER_CONFIRM_MERGE = 'user_confirm_merge'
USER_CONFIRM_EMAIL = 'user_confirm_email'
USER_INITIAL_CONFIRM_EMAIL = 'user_initial_confirm_email'
diff --git a/tests/test_auth_views.py b/tests/test_auth_views.py
index 31445da2c8d..4d385b68dd6 100644
--- a/tests/test_auth_views.py
+++ b/tests/test_auth_views.py
@@ -12,7 +12,7 @@
from django.utils import timezone
from flask import request
from rest_framework import status as http_status
-from tests.utils import run_celery_tasks
+from tests.utils import run_celery_tasks, capture_notifications
from framework import auth
from framework.auth import Auth, cas
@@ -25,7 +25,7 @@
)
from framework.auth.exceptions import InvalidTokenError
from framework.auth.views import login_and_register_handler
-from osf.models import OSFUser, NotableDomain
+from osf.models import OSFUser, NotableDomain, NotificationType
from osf_tests.factories import (
fake_email,
AuthUserFactory,
@@ -320,8 +320,11 @@ def test_resend_confirmation(self):
self.user.save()
url = api_url_for('resend_confirmation')
header = {'address': email, 'primary': False, 'confirmed': False}
- self.app.put(url, json={'id': self.user._id, 'email': header}, auth=self.user.auth)
- assert self.mock_send_grid.called
+ with capture_notifications() as notifications:
+ self.app.put(url, json={'id': self.user._id, 'email': header}, auth=self.user.auth)
+
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
self.user.reload()
assert token != self.user.get_confirmation_token(email)
@@ -497,8 +500,10 @@ def test_resend_confirmation_does_not_send_before_throttle_expires(self):
self.user.save()
url = api_url_for('resend_confirmation')
header = {'address': email, 'primary': False, 'confirmed': False}
- self.app.put(url, json={'id': self.user._id, 'email': header}, auth=self.user.auth)
- assert self.mock_send_grid.called
+ with capture_notifications() as notifications:
+ self.app.put(url, json={'id': self.user._id, 'email': header}, auth=self.user.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
# 2nd call does not send email because throttle period has not expired
res = self.app.put(url, json={'id': self.user._id, 'email': header}, auth=self.user.auth)
assert res.status_code == 400
diff --git a/tests/test_events.py b/tests/test_events.py
index e06559ebbb4..ca8793da6da 100644
--- a/tests/test_events.py
+++ b/tests/test_events.py
@@ -1,7 +1,11 @@
from collections import OrderedDict
from unittest import mock
+
+from django.contrib.contenttypes.models import ContentType
from pytest import raises
+
+from osf.models import NotificationType
from website.notifications.events.base import Event, register, event_registry
from website.notifications.events.files import (
FileAdded, FileRemoved, FolderCreated, FileUpdated,
@@ -184,11 +188,12 @@ def setUp(self):
self.user = factories.UserFactory()
self.consolidate_auth = Auth(user=self.user)
self.project = factories.ProjectFactory()
- self.project_subscription = factories.NotificationSubscriptionLegacyFactory(
- _id=self.project._id + '_file_updated',
- owner=self.project,
- event_name='file_updated'
+ self.project_subscription = factories.NotificationSubscription(
+ user=self.user,
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_ADDED),
)
+ self.project_subscription.object_id = self.project.id
+ self.project_subscription.content_type = ContentType.objects.get_for_model(self.project)
self.project_subscription.save()
self.user2 = factories.UserFactory()
self.event = event_registry['file_removed'](
@@ -196,12 +201,12 @@ def setUp(self):
)
def test_info_formed_correct_file(self):
- assert 'file_updated' == self.event.event_type
+ assert NotificationType.Type.NODE_FILE_UPDATED == self.event.event_type
assert f'removed file "{materialized.lstrip("/")}".' == self.event.html_message
assert f'removed file "{materialized.lstrip("/")}".' == self.event.text_message
def test_info_formed_correct_folder(self):
- assert 'file_updated' == self.event.event_type
+ assert NotificationType.Type.NODE_FILE_UPDATED == self.event.event_type
self.event.payload['metadata']['materialized'] += '/'
assert f'removed folder "{materialized.lstrip("/")}/".' == self.event.html_message
assert f'removed folder "{materialized.lstrip("/")}/".' == self.event.text_message
diff --git a/website/notifications/utils.py b/website/notifications/utils.py
index e64d76c258f..38707ac24a6 100644
--- a/website/notifications/utils.py
+++ b/website/notifications/utils.py
@@ -149,14 +149,15 @@ def users_to_remove(source_event, source_node, new_node):
removed_users = {key: [] for key in constants.NOTIFICATION_TYPES}
if source_node == new_node:
return removed_users
- old_sub = NotificationSubscription.objects.get(
- subscribed_object=source_node,
+ sub = NotificationSubscription.objects.get(
+ object_id=source_node.id,
+ content_type=ContentType.objects.get_for_model(source_node),
notification_type__name=source_event
)
for notification_type in constants.NOTIFICATION_TYPES:
users = []
- if hasattr(old_sub, notification_type):
- users += list(getattr(old_sub, notification_type).values_list('guids___id', flat=True))
+ if hasattr(sub, notification_type):
+ users += list(getattr(sub, notification_type).values_list('guids___id', flat=True))
return removed_users
From 53b89ac2851f0b53e49f0932bb8ee34ec29d436a Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Fri, 25 Jul 2025 13:32:06 -0400
Subject: [PATCH 118/336] fix up contributor and desk notifications
---
api/users/views.py | 13 ++++++++-----
notifications.yaml | 8 ++++++++
tests/test_user_profile_view.py | 17 ++++++++++-------
3 files changed, 26 insertions(+), 12 deletions(-)
diff --git a/api/users/views.py b/api/users/views.py
index 3c7f16e17fb..df2d2a215e6 100644
--- a/api/users/views.py
+++ b/api/users/views.py
@@ -103,7 +103,7 @@
)
from osf.utils.tokens import TokenHandler
from osf.utils.tokens.handlers import sanction_handler
-from website import mails, settings, language
+from website import settings, language
from website.project.views.contributor import send_claim_email, send_claim_registered_email
from website.util.metrics import CampaignClaimedTags, CampaignSourceTags
from framework.auth import exceptions
@@ -639,11 +639,14 @@ def create(self, request, *args, **kwargs):
serializer = self.get_serializer(data=request.data)
serializer.is_valid(raise_exception=True)
user = self.get_user()
- mails.send_mail(
- to_addr=settings.OSF_SUPPORT_EMAIL,
- mail=mails.REQUEST_EXPORT,
+ NotificationType.objects.get(
+ name=NotificationType.Type.DESK_REQUEST_EXPORT,
+ ).emit(
user=user,
- can_change_preferences=False,
+ destination_address=settings.OSF_SUPPORT_EMAIL,
+ event_context={
+ 'can_change_preferences': False,
+ },
)
user.email_last_sent = timezone.now()
user.save()
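
The hunk above is representative of the migration pattern this series applies everywhere: a mails.send_mail call is replaced by fetching a NotificationType row by its Type enum value and calling emit on it. Extracted into a standalone helper (the function name is hypothetical; the fields and values come from this diff), the shape is roughly:

    from osf.models import NotificationType
    from website import settings


    def notify_desk_of_export_request(user):
        # Hypothetical helper illustrating the send_mail -> emit migration pattern.
        NotificationType.objects.get(
            name=NotificationType.Type.DESK_REQUEST_EXPORT,
        ).emit(
            user=user,
            destination_address=settings.OSF_SUPPORT_EMAIL,
            event_context={'can_change_preferences': False},
        )
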
diff --git a/notifications.yaml b/notifications.yaml
index 8ff1a5683d8..d1cade2fb9b 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -136,6 +136,10 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/external_confirm_create.html.mako'
+ - name: user_primary_email_changed
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/primary_email_changed.html.mako'
#### PROVIDER
- name: provider_new_pending_submissions
@@ -338,3 +342,7 @@ notification_types:
__docs__: ...
object_content_type_model_name: desk
template: 'website/templates/emails/support_request.html.mako'
+ - name: desk_request_export
+ __docs__: ...
+ object_content_type_model_name: desk
+ template: 'website/templates/emails/support_request.html.mako'
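
For context, entries such as desk_request_export above presumably get synced into NotificationType rows by a loader that reads notifications.yaml. The sketch below is an assumption about that loader's shape (the function name, field mapping, and update_or_create usage are not taken from this diff); it is included only to make the roles of the YAML keys explicit.

    import yaml

    from django.contrib.contenttypes.models import ContentType
    from osf.models import NotificationType


    def sync_notification_types(path='notifications.yaml'):
        with open(path) as fp:
            config = yaml.safe_load(fp)
        for entry in config['notification_types']:
            # object_content_type_model_name may name a model (e.g. osfuser) or a
            # pseudo-target like 'desk'; resolve it to a ContentType when possible.
            content_type = ContentType.objects.filter(
                model=entry['object_content_type_model_name'],
            ).first()
            NotificationType.objects.update_or_create(
                name=entry['name'],
                defaults={
                    'template': entry['template'],
                    'object_content_type': content_type,
                },
            )
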
diff --git a/tests/test_user_profile_view.py b/tests/test_user_profile_view.py
index bb801340423..20095abfba1 100644
--- a/tests/test_user_profile_view.py
+++ b/tests/test_user_profile_view.py
@@ -10,7 +10,7 @@
from framework.celery_tasks import handlers
from osf.external.spam import tasks as spam_tasks
from osf.models import (
- NotableDomain
+ NotableDomain, NotificationType
)
from osf_tests.factories import (
fake_email,
@@ -23,6 +23,7 @@
fake,
OsfTestCase,
)
+from tests.utils import capture_notifications
from website import mailchimp_utils
from website.settings import MAILCHIMP_GENERAL_LIST
from website.util import api_url_for, web_url_for
@@ -720,15 +721,17 @@ def test_password_change_invalid_empty_string_confirm_password(self):
def test_password_change_invalid_blank_confirm_password(self):
self.test_password_change_invalid_blank_password('password', 'new password', ' ')
- @mock.patch('website.mails.settings.USE_EMAIL', True)
- @mock.patch('website.mails.settings.USE_CELERY', False)
def test_user_cannot_request_account_export_before_throttle_expires(self):
url = api_url_for('request_export')
- self.app.post(url, auth=self.user.auth)
- assert self.mock_send_grid.called
- res = self.app.post(url, auth=self.user.auth)
+ with capture_notifications() as notifications:
+ self.app.post(url, auth=self.user.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.DESK_REQUEST_EXPORT
+
+ with capture_notifications() as notifications:
+ res = self.app.post(url, auth=self.user.auth)
assert res.status_code == 400
- assert self.mock_send_grid.call_count == 1
+ assert len(notifications) == 0
def test_get_unconfirmed_emails_exclude_external_identity(self):
external_identity = {
From 46b69441b6031f4359072c20f301554864949a3d Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Fri, 25 Jul 2025 13:39:00 -0400
Subject: [PATCH 119/336] deletes detect_duplicate notifications as that only
applied to legacy notifications
---
admin/nodes/views.py | 5 +----
admin/notifications/views.py | 30 ++-----------------------
admin_tests/notifications/test_views.py | 24 +++++---------------
3 files changed, 9 insertions(+), 50 deletions(-)
diff --git a/admin/nodes/views.py b/admin/nodes/views.py
index 40cf261945d..971b4a8cd6d 100644
--- a/admin/nodes/views.py
+++ b/admin/nodes/views.py
@@ -22,7 +22,7 @@
from admin.base.utils import change_embargo_date
from admin.base.views import GuidView
from admin.base.forms import GuidForm
-from admin.notifications.views import detect_duplicate_notifications, delete_selected_notifications
+from admin.notifications.views import delete_selected_notifications
from api.share.utils import update_share
from api.caching.tasks import update_storage_usage_cache
@@ -101,13 +101,10 @@ def get_context_data(self, **kwargs):
context = super().get_context_data(**kwargs)
node = self.get_object()
- detailed_duplicates = detect_duplicate_notifications(node_id=node.id)
-
context.update({
'SPAM_STATUS': SpamStatus,
'STORAGE_LIMITS': settings.StorageLimits,
'node': node,
- 'duplicates': detailed_duplicates
})
return context
diff --git a/admin/notifications/views.py b/admin/notifications/views.py
index 3546878e9af..6719ac90a8a 100644
--- a/admin/notifications/views.py
+++ b/admin/notifications/views.py
@@ -1,30 +1,4 @@
-from osf.models.notifications import NotificationSubscriptionLegacy
-from django.db.models import Count
+from osf.models.notification_subscription import NotificationSubscription
def delete_selected_notifications(selected_ids):
- NotificationSubscriptionLegacy.objects.filter(id__in=selected_ids).delete()
-
-def detect_duplicate_notifications(node_id=None):
- query = NotificationSubscriptionLegacy.objects.values('_id').annotate(count=Count('_id')).filter(count__gt=1)
- if node_id:
- query = query.filter(node_id=node_id)
-
- detailed_duplicates = []
- for dup in query:
- notifications = NotificationSubscriptionLegacy.objects.filter(
- _id=dup['_id']
- ).order_by('created')
-
- for notification in notifications:
- detailed_duplicates.append({
- 'id': notification.id,
- '_id': notification._id,
- 'event_name': notification.event_name,
- 'created': notification.created,
- 'count': dup['count'],
- 'email_transactional': [u._id for u in notification.email_transactional.all()],
- 'email_digest': [u._id for u in notification.email_digest.all()],
- 'none': [u._id for u in notification.none.all()]
- })
-
- return detailed_duplicates
+ NotificationSubscription.objects.filter(id__in=selected_ids).delete()
diff --git a/admin_tests/notifications/test_views.py b/admin_tests/notifications/test_views.py
index 42d182a77e5..e2003b1cbf8 100644
--- a/admin_tests/notifications/test_views.py
+++ b/admin_tests/notifications/test_views.py
@@ -3,9 +3,8 @@
from osf.models import OSFUser, Node
from admin.notifications.views import (
delete_selected_notifications,
- detect_duplicate_notifications,
)
-from osf.models.notifications import NotificationSubscriptionLegacy
+from osf.models.notification_subscription import NotificationSubscription
from tests.base import AdminTestCase
pytestmark = pytest.mark.django_db
@@ -19,22 +18,11 @@ def setUp(self):
self.request_factory = RequestFactory()
def test_delete_selected_notifications(self):
- notification1 = NotificationSubscriptionLegacy.objects.create(user=self.user, node=self.node, event_name='event1')
- notification2 = NotificationSubscriptionLegacy.objects.create(user=self.user, node=self.node, event_name='event2')
- notification3 = NotificationSubscriptionLegacy.objects.create(user=self.user, node=self.node, event_name='event3')
+ notification1 = NotificationSubscription.objects.create(user=self.user)
+ notification2 = NotificationSubscription.objects.create(user=self.user)
+ notification3 = NotificationSubscription.objects.create(user=self.user)
delete_selected_notifications([notification1.id, notification2.id])
- assert not NotificationSubscriptionLegacy.objects.filter(id__in=[notification1.id, notification2.id]).exists()
- assert NotificationSubscriptionLegacy.objects.filter(id=notification3.id).exists()
-
- def test_detect_duplicate_notifications(self):
- NotificationSubscriptionLegacy.objects.create(user=self.user, node=self.node, event_name='event1')
- NotificationSubscriptionLegacy.objects.create(user=self.user, node=self.node, event_name='event1')
- NotificationSubscriptionLegacy.objects.create(user=self.user, node=self.node, event_name='event2')
-
- duplicates = detect_duplicate_notifications()
-
- print(f"Detected duplicates: {duplicates}")
-
- assert len(duplicates) == 3, f"Expected 3 duplicates, but found {len(duplicates)}"
+ assert not NotificationSubscription.objects.filter(id__in=[notification1.id, notification2.id]).exists()
+ assert NotificationSubscription.objects.filter(id=notification3.id).exists()
From dcdbdc79568cc49c7a128f8214dc3eb1408d62b5 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Fri, 25 Jul 2025 15:25:47 -0400
Subject: [PATCH 120/336] fix spam ban notification type and split up webtests
into smaller files
---
notifications.yaml | 4 +
osf/models/mixins.py | 21 +-
osf/models/preprint.py | 2 +-
osf/models/schema_response.py | 2 +-
tests/test_events.py | 11 +-
tests/test_forgot_password.py | 237 +++++++++++
tests/test_preprints.py | 20 +-
tests/test_resend_confirmation.py | 83 ++++
tests/test_user_claiming.py | 267 +++++++++++++
tests/test_webtests.py | 642 +-----------------------------
10 files changed, 623 insertions(+), 666 deletions(-)
create mode 100644 tests/test_forgot_password.py
create mode 100644 tests/test_resend_confirmation.py
create mode 100644 tests/test_user_claiming.py
diff --git a/notifications.yaml b/notifications.yaml
index d1cade2fb9b..5be39abc492 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -140,6 +140,10 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/primary_email_changed.html.mako'
+ - name: user_spam_banned
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/spam_user_banned.html.mako'
#### PROVIDER
- name: provider_new_pending_submissions
diff --git a/osf/models/mixins.py b/osf/models/mixins.py
index e22fd7b97f5..3cbb2283aab 100644
--- a/osf/models/mixins.py
+++ b/osf/models/mixins.py
@@ -26,6 +26,7 @@
InvalidTagError,
BlockedEmailError,
)
+from osf.models.notification_type import NotificationType
from .node_relation import NodeRelation
from .nodelog import NodeLog
from .subject import Subject
@@ -54,7 +55,7 @@
from osf.utils.requests import get_request_and_user_id
from website.project import signals as project_signals
-from website import settings, mails, language
+from website import settings, language
from website.project.licenses import set_license
logger = logging.getLogger(__name__)
@@ -306,7 +307,7 @@ def add_affiliated_institution(self, inst, user, log=True, ignore_user_affiliati
if not self.is_affiliated_with_institution(inst):
self.affiliated_institutions.add(inst)
self.update_search()
- from . import NotificationType
+ from osf.models.notification_type import NotificationType
if notify and getattr(self, 'type', False) == 'osf.node':
for user, _ in self.get_admin_contributors_recursive(unique_users=True):
@@ -348,7 +349,7 @@ def remove_affiliated_institution(self, inst, user, save=False, log=True, notify
if save:
self.save()
self.update_search()
- from . import NotificationType
+ from osf.models.notification_type import NotificationType
if notify and getattr(self, 'type', False) == 'osf.node':
for user, _ in self.get_admin_contributors_recursive(unique_users=True):
@@ -2272,12 +2273,14 @@ def suspend_spam_user(self, user):
user.flag_spam()
if not user.is_disabled:
user.deactivate_account()
- mails.send_mail(
- to_addr=user.username,
- mail=mails.SPAM_USER_BANNED,
- user=user,
- osf_support_email=settings.OSF_SUPPORT_EMAIL,
- can_change_preferences=False,
+ NotificationType.objects.get(
+ name=NotificationType.Type.USER_SPAM_BANNED
+ ).emit(
+ user,
+ event_context={
+ 'osf_support_email': settings.OSF_SUPPORT_EMAIL,
+ 'can_change_preferences': False
+ }
)
user.save()
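
A quick way to exercise this new code path in a test is the same capture_notifications helper the rest of the series relies on. The assertion below is a sketch under that assumption, not a test taken from this patch:

    from osf.models import NotificationType
    from tests.utils import capture_notifications


    def assert_spam_ban_notification_sent(provider_mixin_obj, user):
        # Suspend the spammer and check that exactly one USER_SPAM_BANNED
        # notification was emitted for them.
        with capture_notifications() as notifications:
            provider_mixin_obj.suspend_spam_user(user)
        assert len(notifications) == 1
        assert notifications[0]['type'] == NotificationType.Type.USER_SPAM_BANNED
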
diff --git a/osf/models/preprint.py b/osf/models/preprint.py
index 17e792e15aa..b6c864bcf83 100644
--- a/osf/models/preprint.py
+++ b/osf/models/preprint.py
@@ -20,7 +20,7 @@
from framework.auth import Auth
from framework.exceptions import PermissionsError, UnpublishedPendingPreprintVersionExists
from framework.auth import oauth_scopes
-from . import NotificationType
+from osf.models.notification_type import NotificationType
from .subject import Subject
from .tag import Tag
diff --git a/osf/models/schema_response.py b/osf/models/schema_response.py
index 84d0a8f46de..3c4f65155fb 100644
--- a/osf/models/schema_response.py
+++ b/osf/models/schema_response.py
@@ -9,7 +9,7 @@
from framework.exceptions import PermissionsError
from osf.exceptions import PreviousSchemaResponseError, SchemaResponseStateError, SchemaResponseUpdateError
-from . import NotificationType
+from osf.models.notification_type import NotificationType
from .base import BaseModel, ObjectIDMixin
from .metaschema import RegistrationSchemaBlock
from .schema_response_block import SchemaResponseBlock
diff --git a/tests/test_events.py b/tests/test_events.py
index ca8793da6da..e98119e61b9 100644
--- a/tests/test_events.py
+++ b/tests/test_events.py
@@ -188,7 +188,7 @@ def setUp(self):
self.user = factories.UserFactory()
self.consolidate_auth = Auth(user=self.user)
self.project = factories.ProjectFactory()
- self.project_subscription = factories.NotificationSubscription(
+ self.project_subscription = factories.NotificationSubscriptionFactory(
user=self.user,
notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_ADDED),
)
@@ -224,10 +224,9 @@ def setUp(self):
self.user = factories.UserFactory()
self.consolidate_auth = Auth(user=self.user)
self.project = factories.ProjectFactory()
- self.project_subscription = factories.NotificationSubscriptionLegacyFactory(
- _id=self.project._id + '_file_updated',
- owner=self.project,
- event_name='file_updated'
+ self.project_subscription = factories.NotificationSubscriptionFactory(
+ user=self.user,
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_ADDED),
)
self.project_subscription.save()
self.user2 = factories.UserFactory()
@@ -236,7 +235,7 @@ def setUp(self):
)
def test_info_formed_correct(self):
- assert 'file_updated' == self.event.event_type
+ assert NotificationType.Type.NODE_FILE_UPDATED == self.event.event_type
assert 'created folder "Three/".' == self.event.html_message
assert 'created folder "Three/".' == self.event.text_message
diff --git a/tests/test_forgot_password.py b/tests/test_forgot_password.py
new file mode 100644
index 00000000000..9ca6df4fdab
--- /dev/null
+++ b/tests/test_forgot_password.py
@@ -0,0 +1,237 @@
+from urllib.parse import quote_plus
+
+from osf.models import NotificationType
+from tests.base import OsfTestCase
+from osf_tests.factories import (
+ AuthUserFactory,
+ UserFactory,
+)
+from tests.utils import capture_notifications
+from website.util import web_url_for
+from tests.test_webtests import assert_in_html, assert_not_in_html
+
+class TestForgotPassword(OsfTestCase):
+
+ def setUp(self):
+ super().setUp()
+ self.user = UserFactory()
+ self.auth_user = AuthUserFactory()
+ self.get_url = web_url_for('forgot_password_get')
+ self.post_url = web_url_for('forgot_password_post')
+ self.user.verification_key_v2 = {}
+ self.user.save()
+
+
+ # log users out before they land on forgot password page
+ def test_forgot_password_logs_out_user(self):
+ # visit forgot password link while another user is logged in
+ res = self.app.get(self.get_url, auth=self.auth_user.auth)
+ # check redirection to CAS logout
+ assert res.status_code == 302
+ location = res.headers.get('Location')
+ assert 'reauth' not in location
+ assert 'logout?service=' in location
+ assert 'forgotpassword' in location
+
+ # test that forgot password page is loaded correctly
+ def test_get_forgot_password(self):
+ res = self.app.get(self.get_url)
+ assert res.status_code == 200
+ assert 'Forgot Password' in res.text
+ assert res.get_form('forgotPasswordForm')
+
+ # test that existing user can receive reset password email
+ def test_can_receive_reset_password_email(self):
+ # load forgot password page and submit email
+ res = self.app.get(self.get_url)
+ form = res.get_form('forgotPasswordForm')
+ form['forgot_password-email'] = self.user.username
+ with capture_notifications() as notifications:
+ res = form.submit(self.app)
+ # check mail was sent
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
+ # check http 200 response
+ assert res.status_code == 200
+ # check request URL is /forgotpassword
+ assert res.request.path == self.post_url
+ # check push notification
+ assert_in_html('If there is an OSF account', res.text)
+ assert_not_in_html('Please wait', res.text)
+
+ # check verification_key_v2 is set
+ self.user.reload()
+ assert self.user.verification_key_v2 != {}
+
+ # test that non-existing user cannot receive reset password email
+ def test_cannot_receive_reset_password_email(self):
+ # load forgot password page and submit email
+ res = self.app.get(self.get_url)
+ form = res.get_form('forgotPasswordForm')
+ form['forgot_password-email'] = 'fake' + self.user.username
+        with capture_notifications() as notifications:
+ res = form.submit(self.app)
+
+ # check mail was not sent
+        assert not notifications
+ # check http 200 response
+ assert res.status_code == 200
+ # check request URL is /forgotpassword
+ assert res.request.path == self.post_url
+ # check push notification
+ assert_in_html('If there is an OSF account', res.text)
+ assert_not_in_html('Please wait', res.text)
+
+ # check verification_key_v2 is not set
+ self.user.reload()
+ assert self.user.verification_key_v2 == {}
+
+    # test that a deactivated user cannot receive a reset password email
+ def test_not_active_user_no_reset_password_email(self):
+ self.user.deactivate_account()
+ self.user.save()
+
+ # load forgot password page and submit email
+ res = self.app.get(self.get_url)
+ form = res.get_form('forgotPasswordForm')
+ form['forgot_password-email'] = self.user.username
+ with capture_notifications() as notification:
+ res = form.submit(self.app)
+
+ # check mail was not sent
+ assert not notification
+ # check http 200 response
+ assert res.status_code == 200
+ # check request URL is /forgotpassword
+ assert res.request.path == self.post_url
+ # check push notification
+ assert_in_html('If there is an OSF account', res.text)
+ assert_not_in_html('Please wait', res.text)
+
+ # check verification_key_v2 is not set
+ self.user.reload()
+ assert self.user.verification_key_v2 == {}
+
+ # test that user cannot submit forgot password request too quickly
+ def test_cannot_reset_password_twice_quickly(self):
+ # load forgot password page and submit email
+ res = self.app.get(self.get_url)
+ form = res.get_form('forgotPasswordForm')
+ form['forgot_password-email'] = self.user.username
+ res = form.submit(self.app)
+ res = form.submit(self.app)
+
+ # check http 200 response
+ assert res.status_code == 200
+ # check push notification
+ assert_in_html('Please wait', res.text)
+ assert_not_in_html('If there is an OSF account', res.text)
+
+
+class TestForgotPasswordInstitution(OsfTestCase):
+
+ def setUp(self):
+ super().setUp()
+ self.user = UserFactory()
+ self.auth_user = AuthUserFactory()
+ self.get_url = web_url_for('redirect_unsupported_institution')
+ self.post_url = web_url_for('forgot_password_institution_post')
+ self.user.verification_key_v2 = {}
+ self.user.save()
+
+
+ # log users out before they land on institutional forgot password page
+ def test_forgot_password_logs_out_user(self):
+ # TODO: check in qa url encoding
+ # visit forgot password link while another user is logged in
+ res = self.app.get(self.get_url, auth=self.auth_user.auth)
+ # check redirection to CAS logout
+ assert res.status_code == 302
+ location = res.headers.get('Location')
+ assert quote_plus('campaign=unsupportedinstitution') in location
+ assert 'logout?service=' in location
+
+ # test that institutional forgot password page redirects to CAS unsupported
+ # institution page
+ def test_get_forgot_password(self):
+ res = self.app.get(self.get_url)
+ assert res.status_code == 302
+ location = res.headers.get('Location')
+ assert 'campaign=unsupportedinstitution' in location
+
+ # test that user from disabled institution can receive reset password email
+ def test_can_receive_reset_password_email(self):
+ # submit email to institutional forgot-password page
+
+ with capture_notifications() as notifications:
+ res = self.app.post(self.post_url, data={'forgot_password-email': self.user.username})
+
+ # check mail was sent
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
+ # check http 200 response
+ assert res.status_code == 200
+ # check request URL is /forgotpassword
+ assert res.request.path == self.post_url
+ # check push notification
+ assert_in_html('If there is an OSF account', res.text)
+ assert_not_in_html('Please wait', res.text)
+
+ # check verification_key_v2 is set
+ self.user.reload()
+ assert self.user.verification_key_v2 != {}
+
+ # test that non-existing user cannot receive reset password email
+ def test_cannot_receive_reset_password_email(self):
+ # load forgot password page and submit email
+
+        with capture_notifications() as notifications:
+ res = self.app.post(self.post_url, data={'forgot_password-email': 'fake' + self.user.username})
+ # check mail was not sent
+        assert not notifications
+ # check http 200 response
+ assert res.status_code == 200
+ # check request URL is /forgotpassword-institution
+ assert res.request.path == self.post_url
+ # check push notification
+ assert_in_html('If there is an OSF account', res.text)
+ assert_not_in_html('Please wait', res.text)
+
+ # check verification_key_v2 is not set
+ self.user.reload()
+ assert self.user.verification_key_v2 == {}
+
+    # test that a deactivated user cannot receive an institutional reset password email
+ def test_not_active_user_no_reset_password_email(self):
+ self.user.deactivate_account()
+ self.user.save()
+
+ with capture_notifications() as notification:
+ res = self.app.post(self.post_url, data={'forgot_password-email': self.user.username})
+
+ # check mail was not sent
+ assert not notification
+ # check http 200 response
+ assert res.status_code == 200
+ # check request URL is /forgotpassword-institution
+ assert res.request.path == self.post_url
+ # check push notification
+ assert_in_html('If there is an OSF account', res.text)
+ assert_not_in_html('Please wait', res.text)
+
+ # check verification_key_v2 is not set
+ self.user.reload()
+ assert self.user.verification_key_v2 == {}
+
+ # test that user cannot submit forgot password request too quickly
+ def test_cannot_reset_password_twice_quickly(self):
+ # submit institutional forgot-password request in rapid succession
+ res = self.app.post(self.post_url, data={'forgot_password-email': self.user.username})
+ res = self.app.post(self.post_url, data={'forgot_password-email': self.user.username})
+
+ # check http 200 response
+ assert res.status_code == 200
+ # check push notification
+ assert_in_html('Please wait', res.text)
+ assert_not_in_html('If there is an OSF account', res.text)
+
diff --git a/tests/test_preprints.py b/tests/test_preprints.py
index 728fb1fe1c8..df1be915bab 100644
--- a/tests/test_preprints.py
+++ b/tests/test_preprints.py
@@ -26,7 +26,7 @@
from addons.base import views
from admin_tests.utilities import setup_view
from api.preprints.views import PreprintContributorDetail
-from osf.models import Tag, Preprint, PreprintLog, PreprintContributor
+from osf.models import Tag, Preprint, PreprintLog, PreprintContributor, NotificationType
from osf.exceptions import PreprintStateError, ValidationError, ValidationValueError
from osf_tests.factories import (
ProjectFactory,
@@ -43,7 +43,7 @@
from osf.utils.permissions import READ, WRITE, ADMIN
from osf.utils.workflows import DefaultStates, RequestTypes, ReviewStates
from tests.base import assert_datetime_equal, OsfTestCase
-from tests.utils import assert_preprint_logs
+from tests.utils import assert_preprint_logs, capture_notifications
from website import settings, mails
from website.identifiers.clients import CrossRefClient, ECSArXivCrossRefClient, crossref
from website.identifiers.utils import request_identifiers
@@ -1999,13 +1999,15 @@ def setUp(self):
self.mock_send_grid = start_mock_send_grid(self)
def test_creator_gets_email(self):
- self.preprint.set_published(True, auth=Auth(self.user), save=True)
- domain = self.preprint.provider.domain or settings.DOMAIN
- self.mock_send_grid.assert_called()
- assert self.mock_send_grid.call_count == 1
-
- self.preprint_branded.set_published(True, auth=Auth(self.user), save=True)
- assert self.mock_send_grid.call_count == 2
+ with capture_notifications() as notifications:
+ self.preprint.set_published(True, auth=Auth(self.user), save=True)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
+
+ with capture_notifications() as notifications:
+ self.preprint_branded.set_published(True, auth=Auth(self.user), save=True)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
class TestPreprintOsfStorage(OsfTestCase):
diff --git a/tests/test_resend_confirmation.py b/tests/test_resend_confirmation.py
new file mode 100644
index 00000000000..95609e5ad76
--- /dev/null
+++ b/tests/test_resend_confirmation.py
@@ -0,0 +1,83 @@
+from osf.models import NotificationType
+from tests.base import OsfTestCase
+from osf_tests.factories import (
+ UserFactory,
+ UnconfirmedUserFactory,
+)
+from tests.utils import capture_notifications
+from website.util import web_url_for
+from tests.test_webtests import assert_in_html
+
+class TestResendConfirmation(OsfTestCase):
+
+ def setUp(self):
+ super().setUp()
+ self.unconfirmed_user = UnconfirmedUserFactory()
+ self.confirmed_user = UserFactory()
+ self.get_url = web_url_for('resend_confirmation_get')
+ self.post_url = web_url_for('resend_confirmation_post')
+
+    # test that the resend confirmation page loads correctly
+ def test_resend_confirmation_get(self):
+ res = self.app.get(self.get_url)
+ assert res.status_code == 200
+ assert 'Resend Confirmation' in res.text
+ assert res.get_form('resendForm')
+
+ # test that unconfirmed user can receive resend confirmation email
+ def test_can_receive_resend_confirmation_email(self):
+ # load resend confirmation page and submit email
+ res = self.app.get(self.get_url)
+ form = res.get_form('resendForm')
+ form['email'] = self.unconfirmed_user.unconfirmed_emails[0]
+ with capture_notifications() as notifications:
+ res = form.submit(self.app)
+ # check email, request and response
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_INITIAL_CONFIRM_EMAIL
+ assert res.status_code == 200
+ assert res.request.path == self.post_url
+ assert_in_html('If there is an OSF account', res.text)
+
+
+ # test that confirmed user cannot receive resend confirmation email
+ def test_cannot_receive_resend_confirmation_email_1(self):
+ # load resend confirmation page and submit email
+ res = self.app.get(self.get_url)
+ form = res.get_form('resendForm')
+ form['email'] = self.confirmed_user.emails.first().address
+ with capture_notifications() as notifications:
+ res = form.submit(self.app)
+
+ assert not notifications
+ assert res.status_code == 200
+ assert res.request.path == self.post_url
+ assert_in_html('has already been confirmed', res.text)
+
+ # test that non-existing user cannot receive resend confirmation email
+ def test_cannot_receive_resend_confirmation_email_2(self):
+ # load resend confirmation page and submit email
+ res = self.app.get(self.get_url)
+ form = res.get_form('resendForm')
+ form['email'] = 'random@random.com'
+ with capture_notifications() as notifications:
+ res = form.submit(self.app)
+ # check email, request and response
+        assert not notifications
+ assert res.status_code == 200
+ assert res.request.path == self.post_url
+ assert_in_html('If there is an OSF account', res.text)
+
+ # test that user cannot submit resend confirmation request too quickly
+ def test_cannot_resend_confirmation_twice_quickly(self):
+ # load resend confirmation page and submit email
+ res = self.app.get(self.get_url)
+ form = res.get_form('resendForm')
+ form['email'] = self.unconfirmed_user.email
+ form.submit(self.app)
+ res = form.submit(self.app)
+
+ # check request and response
+ assert res.status_code == 200
+ assert_in_html('Please wait', res.text)
+
diff --git a/tests/test_user_claiming.py b/tests/test_user_claiming.py
new file mode 100644
index 00000000000..8174a5600b5
--- /dev/null
+++ b/tests/test_user_claiming.py
@@ -0,0 +1,267 @@
+from rest_framework import status
+import unittest
+
+import pytest
+from framework.auth import exceptions
+from framework.auth.core import Auth
+from tests.base import OsfTestCase
+from tests.base import fake
+from osf_tests.factories import (
+ fake_email,
+ AuthUserFactory,
+ PreprintFactory,
+ ProjectFactory,
+ UserFactory,
+ UnconfirmedUserFactory,
+ UnregUserFactory,
+)
+from tests.test_webtests import assert_in_html
+from website import language
+from website.util import api_url_for
+
+@pytest.mark.enable_bookmark_creation
+@pytest.mark.enable_implicit_clean
+class TestClaiming(OsfTestCase):
+
+ def setUp(self):
+ super().setUp()
+ self.referrer = AuthUserFactory()
+ self.project = ProjectFactory(creator=self.referrer, is_public=True)
+
+ def test_correct_name_shows_in_contributor_list(self):
+ name1, email = fake.name(), fake_email()
+ UnregUserFactory(fullname=name1, email=email)
+ name2, email = fake.name(), fake_email()
+ # Added with different name
+ self.project.add_unregistered_contributor(fullname=name2,
+ email=email, auth=Auth(self.referrer))
+ self.project.save()
+
+ res = self.app.get(self.project.url, auth=self.referrer.auth)
+ # Correct name is shown
+ assert_in_html(name2, res.text)
+ assert name1 not in res.text
+
+ def test_user_can_set_password_on_claim_page(self):
+ name, email = fake.name(), fake_email()
+ new_user = self.project.add_unregistered_contributor(
+ email=email,
+ fullname=name,
+ auth=Auth(self.referrer)
+ )
+ self.project.save()
+ claim_url = new_user.get_claim_url(self.project._primary_key)
+ res = self.app.get(claim_url)
+ self.project.reload()
+ assert 'Set Password' in res.text
+ form = res.get_form('setPasswordForm')
+ #form['username'] = new_user.username #Removed as long as E-mail can't be updated.
+ form['password'] = 'killerqueen'
+ form['password2'] = 'killerqueen'
+ self.app.resolve_redirect(form.submit(self.app))
+ new_user.reload()
+ assert new_user.check_password('killerqueen')
+
+ def test_sees_is_redirected_if_user_already_logged_in(self):
+ name, email = fake.name(), fake_email()
+ new_user = self.project.add_unregistered_contributor(
+ email=email,
+ fullname=name,
+ auth=Auth(self.referrer)
+ )
+ self.project.save()
+ existing = AuthUserFactory()
+ claim_url = new_user.get_claim_url(self.project._primary_key)
+ # a user is already logged in
+ res = self.app.get(claim_url, auth=existing.auth)
+ assert res.status_code == 302
+
+ def test_unregistered_users_names_are_project_specific(self):
+ name1, name2, email = fake.name(), fake.name(), fake_email()
+ project2 = ProjectFactory(creator=self.referrer)
+ # different projects use different names for the same unreg contributor
+ self.project.add_unregistered_contributor(
+ email=email,
+ fullname=name1,
+ auth=Auth(self.referrer)
+ )
+ self.project.save()
+ project2.add_unregistered_contributor(
+ email=email,
+ fullname=name2,
+ auth=Auth(self.referrer)
+ )
+ project2.save()
+ # Each project displays a different name in the contributor list
+ res = self.app.get(self.project.url, auth=self.referrer.auth)
+ assert_in_html(name1, res.text)
+
+ res2 = self.app.get(project2.url, auth=self.referrer.auth)
+ assert_in_html(name2, res2.text)
+
+ @unittest.skip('as long as E-mails cannot be changed')
+ def test_cannot_set_email_to_a_user_that_already_exists(self):
+ reg_user = UserFactory()
+ name, email = fake.name(), fake_email()
+ new_user = self.project.add_unregistered_contributor(
+ email=email,
+ fullname=name,
+ auth=Auth(self.referrer)
+ )
+ self.project.save()
+ # Goes to claim url and successfully claims account
+ claim_url = new_user.get_claim_url(self.project._primary_key)
+ res = self.app.get(claim_url)
+ self.project.reload()
+ assert 'Set Password' in res
+ form = res.get_form('setPasswordForm')
+ # Fills out an email that is the username of another user
+ form['username'] = reg_user.username
+ form['password'] = 'killerqueen'
+ form['password2'] = 'killerqueen'
+ res = form.submit(follow_redirects=True)
+ assert language.ALREADY_REGISTERED.format(email=reg_user.username) in res.text
+
+ def test_correct_display_name_is_shown_at_claim_page(self):
+ original_name = fake.name()
+ unreg = UnregUserFactory(fullname=original_name)
+
+ different_name = fake.name()
+ new_user = self.project.add_unregistered_contributor(
+ email=unreg.username,
+ fullname=different_name,
+ auth=Auth(self.referrer),
+ )
+ self.project.save()
+ claim_url = new_user.get_claim_url(self.project._primary_key)
+ res = self.app.get(claim_url)
+ # Correct name (different_name) should be on page
+ assert_in_html(different_name, res.text)
+
+
+class TestConfirmingEmail(OsfTestCase):
+
+ def setUp(self):
+ super().setUp()
+ self.user = UnconfirmedUserFactory()
+ self.confirmation_url = self.user.get_confirmation_url(
+ self.user.username,
+ external=False,
+ )
+ self.confirmation_token = self.user.get_confirmation_token(
+ self.user.username
+ )
+
+ def test_cannot_remove_another_user_email(self):
+ user1 = AuthUserFactory()
+ user2 = AuthUserFactory()
+ url = api_url_for('update_user')
+ header = {'id': user1.username, 'emails': [{'address': user1.username}]}
+ res = self.app.put(url, json=header, auth=user2.auth)
+ assert res.status_code == 403
+
+    def test_cannot_make_primary_email_for_another_user(self):
+ user1 = AuthUserFactory()
+ user2 = AuthUserFactory()
+ email = 'test@cos.io'
+ user1.emails.create(address=email)
+ user1.save()
+ url = api_url_for('update_user')
+ header = {'id': user1.username,
+ 'emails': [{'address': user1.username, 'primary': False, 'confirmed': True},
+ {'address': email, 'primary': True, 'confirmed': True}
+ ]}
+ res = self.app.put(url, json=header, auth=user2.auth)
+ assert res.status_code == 403
+
+    def test_cannot_add_email_for_another_user(self):
+ user1 = AuthUserFactory()
+ user2 = AuthUserFactory()
+ email = 'test@cos.io'
+ url = api_url_for('update_user')
+ header = {'id': user1.username,
+ 'emails': [{'address': user1.username, 'primary': True, 'confirmed': True},
+ {'address': email, 'primary': False, 'confirmed': False}
+ ]}
+ res = self.app.put(url, json=header, auth=user2.auth)
+ assert res.status_code == 403
+
+ def test_error_page_if_confirm_link_is_used(self):
+ self.user.confirm_email(self.confirmation_token)
+ self.user.save()
+ res = self.app.get(self.confirmation_url)
+
+ assert exceptions.InvalidTokenError.message_short in res.text
+ assert res.status_code == status.HTTP_400_BAD_REQUEST
+
+
+@pytest.mark.enable_implicit_clean
+@pytest.mark.enable_bookmark_creation
+class TestClaimingAsARegisteredUser(OsfTestCase):
+
+ def setUp(self):
+ super().setUp()
+ self.referrer = AuthUserFactory()
+ self.project = ProjectFactory(creator=self.referrer, is_public=True)
+ name, email = fake.name(), fake_email()
+ self.user = self.project.add_unregistered_contributor(
+ fullname=name,
+ email=email,
+ auth=Auth(user=self.referrer)
+ )
+ self.project.save()
+
+ def test_claim_user_registered_with_correct_password(self):
+ reg_user = AuthUserFactory() # NOTE: AuthUserFactory sets password as 'queenfan86'
+ url = self.user.get_claim_url(self.project._primary_key)
+ # Follow to password re-enter page
+ res = self.app.get(url, auth=reg_user.auth, follow_redirects=True)
+
+ # verify that the "Claim Account" form is returned
+ assert 'Claim Contributor' in res.text
+
+ form = res.get_form('claimContributorForm')
+ form['password'] = 'queenfan86'
+ res = form.submit(self.app, auth=reg_user.auth)
+ self.app.resolve_redirect(res)
+ self.project.reload()
+ self.user.reload()
+ # user is now a contributor to the project
+ assert reg_user in self.project.contributors
+
+ # the unregistered user (self.user) is removed as a contributor, and their
+ assert self.user not in self.project.contributors
+
+ # unclaimed record for the project has been deleted
+ assert self.project not in self.user.unclaimed_records
+
+ def test_claim_user_registered_preprint_with_correct_password(self):
+ preprint = PreprintFactory(creator=self.referrer)
+ name, email = fake.name(), fake_email()
+ unreg_user = preprint.add_unregistered_contributor(
+ fullname=name,
+ email=email,
+ auth=Auth(user=self.referrer)
+ )
+ reg_user = AuthUserFactory() # NOTE: AuthUserFactory sets password as 'queenfan86'
+ url = unreg_user.get_claim_url(preprint._id)
+ # Follow to password re-enter page
+ res = self.app.get(url, auth=reg_user.auth, follow_redirects=True)
+
+ # verify that the "Claim Account" form is returned
+ assert 'Claim Contributor' in res.text
+
+ form = res.get_form('claimContributorForm')
+ form['password'] = 'queenfan86'
+ res = form.submit(self.app, auth=reg_user.auth)
+
+ preprint.reload()
+ unreg_user.reload()
+ # user is now a contributor to the project
+ assert reg_user in preprint.contributors
+
+ # the unregistered user (unreg_user) is removed as a contributor, and their
+ assert unreg_user not in preprint.contributors
+
+ # unclaimed record for the project has been deleted
+ assert preprint not in unreg_user.unclaimed_records
diff --git a/tests/test_webtests.py b/tests/test_webtests.py
index c55e6b523f4..92cf8b6f2f5 100644
--- a/tests/test_webtests.py
+++ b/tests/test_webtests.py
@@ -1,8 +1,5 @@
#!/usr/bin/env python3
"""Functional tests using WebTest."""
-from urllib.parse import quote_plus
-
-from rest_framework import status
import logging
import unittest
@@ -13,30 +10,22 @@
from bs4 import BeautifulSoup
from django.utils import timezone
from addons.wiki.utils import to_mongo_key
-from framework.auth import exceptions
from framework.auth.core import Auth
from tests.base import OsfTestCase
-from tests.base import fake
from osf_tests.factories import (
- fake_email,
AuthUserFactory,
NodeFactory,
PreprintFactory,
PreprintProviderFactory,
PrivateLinkFactory,
ProjectFactory,
- RegistrationFactory,
SubjectFactory,
UserFactory,
- UnconfirmedUserFactory,
- UnregUserFactory,
)
from osf.utils import permissions
from addons.wiki.models import WikiPage, WikiVersion
from addons.wiki.tests.factories import WikiFactory, WikiVersionFactory
-from website import language
-from website.util import web_url_for, api_url_for
-from conftest import start_mock_send_grid, start_mock_notification_send
+from website.util import web_url_for
logging.getLogger('website.project.model').setLevel(logging.ERROR)
@@ -205,7 +194,7 @@ def test_wiki_content(self):
user=self.user,
node=project,
)
- wiki = WikiVersionFactory(
+ WikiVersionFactory(
wiki_page=wiki_page,
content=wiki_content
)
@@ -467,633 +456,6 @@ def test_wiki_url(self):
assert self._url_to_body(self.wiki.deep_url) == self._url_to_body(self.wiki.url)
-@pytest.mark.enable_bookmark_creation
-@pytest.mark.enable_implicit_clean
-class TestClaiming(OsfTestCase):
-
- def setUp(self):
- super().setUp()
- self.referrer = AuthUserFactory()
- self.project = ProjectFactory(creator=self.referrer, is_public=True)
-
- def test_correct_name_shows_in_contributor_list(self):
- name1, email = fake.name(), fake_email()
- UnregUserFactory(fullname=name1, email=email)
- name2, email = fake.name(), fake_email()
- # Added with different name
- self.project.add_unregistered_contributor(fullname=name2,
- email=email, auth=Auth(self.referrer))
- self.project.save()
-
- res = self.app.get(self.project.url, auth=self.referrer.auth)
- # Correct name is shown
- assert_in_html(name2, res.text)
- assert name1 not in res.text
-
- def test_user_can_set_password_on_claim_page(self):
- name, email = fake.name(), fake_email()
- new_user = self.project.add_unregistered_contributor(
- email=email,
- fullname=name,
- auth=Auth(self.referrer)
- )
- self.project.save()
- claim_url = new_user.get_claim_url(self.project._primary_key)
- res = self.app.get(claim_url)
- self.project.reload()
- assert 'Set Password' in res.text
- form = res.get_form('setPasswordForm')
- #form['username'] = new_user.username #Removed as long as E-mail can't be updated.
- form['password'] = 'killerqueen'
- form['password2'] = 'killerqueen'
- self.app.resolve_redirect(form.submit(self.app))
- new_user.reload()
- assert new_user.check_password('killerqueen')
-
- def test_sees_is_redirected_if_user_already_logged_in(self):
- name, email = fake.name(), fake_email()
- new_user = self.project.add_unregistered_contributor(
- email=email,
- fullname=name,
- auth=Auth(self.referrer)
- )
- self.project.save()
- existing = AuthUserFactory()
- claim_url = new_user.get_claim_url(self.project._primary_key)
- # a user is already logged in
- res = self.app.get(claim_url, auth=existing.auth)
- assert res.status_code == 302
-
- def test_unregistered_users_names_are_project_specific(self):
- name1, name2, email = fake.name(), fake.name(), fake_email()
- project2 = ProjectFactory(creator=self.referrer)
- # different projects use different names for the same unreg contributor
- self.project.add_unregistered_contributor(
- email=email,
- fullname=name1,
- auth=Auth(self.referrer)
- )
- self.project.save()
- project2.add_unregistered_contributor(
- email=email,
- fullname=name2,
- auth=Auth(self.referrer)
- )
- project2.save()
- # Each project displays a different name in the contributor list
- res = self.app.get(self.project.url, auth=self.referrer.auth)
- assert_in_html(name1, res.text)
-
- res2 = self.app.get(project2.url, auth=self.referrer.auth)
- assert_in_html(name2, res2.text)
-
- @unittest.skip('as long as E-mails cannot be changed')
- def test_cannot_set_email_to_a_user_that_already_exists(self):
- reg_user = UserFactory()
- name, email = fake.name(), fake_email()
- new_user = self.project.add_unregistered_contributor(
- email=email,
- fullname=name,
- auth=Auth(self.referrer)
- )
- self.project.save()
- # Goes to claim url and successfully claims account
- claim_url = new_user.get_claim_url(self.project._primary_key)
- res = self.app.get(claim_url)
- self.project.reload()
- assert 'Set Password' in res
- form = res.get_form('setPasswordForm')
- # Fills out an email that is the username of another user
- form['username'] = reg_user.username
- form['password'] = 'killerqueen'
- form['password2'] = 'killerqueen'
- res = form.submit(follow_redirects=True)
- assert language.ALREADY_REGISTERED.format(email=reg_user.username) in res.text
-
- def test_correct_display_name_is_shown_at_claim_page(self):
- original_name = fake.name()
- unreg = UnregUserFactory(fullname=original_name)
-
- different_name = fake.name()
- new_user = self.project.add_unregistered_contributor(
- email=unreg.username,
- fullname=different_name,
- auth=Auth(self.referrer),
- )
- self.project.save()
- claim_url = new_user.get_claim_url(self.project._primary_key)
- res = self.app.get(claim_url)
- # Correct name (different_name) should be on page
- assert_in_html(different_name, res.text)
-
-
-class TestConfirmingEmail(OsfTestCase):
-
- def setUp(self):
- super().setUp()
- self.user = UnconfirmedUserFactory()
- self.confirmation_url = self.user.get_confirmation_url(
- self.user.username,
- external=False,
- )
- self.confirmation_token = self.user.get_confirmation_token(
- self.user.username
- )
-
- def test_cannot_remove_another_user_email(self):
- user1 = AuthUserFactory()
- user2 = AuthUserFactory()
- url = api_url_for('update_user')
- header = {'id': user1.username, 'emails': [{'address': user1.username}]}
- res = self.app.put(url, json=header, auth=user2.auth)
- assert res.status_code == 403
-
- def test_cannnot_make_primary_email_for_another_user(self):
- user1 = AuthUserFactory()
- user2 = AuthUserFactory()
- email = 'test@cos.io'
- user1.emails.create(address=email)
- user1.save()
- url = api_url_for('update_user')
- header = {'id': user1.username,
- 'emails': [{'address': user1.username, 'primary': False, 'confirmed': True},
- {'address': email, 'primary': True, 'confirmed': True}
- ]}
- res = self.app.put(url, json=header, auth=user2.auth)
- assert res.status_code == 403
-
- def test_cannnot_add_email_for_another_user(self):
- user1 = AuthUserFactory()
- user2 = AuthUserFactory()
- email = 'test@cos.io'
- url = api_url_for('update_user')
- header = {'id': user1.username,
- 'emails': [{'address': user1.username, 'primary': True, 'confirmed': True},
- {'address': email, 'primary': False, 'confirmed': False}
- ]}
- res = self.app.put(url, json=header, auth=user2.auth)
- assert res.status_code == 403
-
- def test_error_page_if_confirm_link_is_used(self):
- self.user.confirm_email(self.confirmation_token)
- self.user.save()
- res = self.app.get(self.confirmation_url)
-
- assert exceptions.InvalidTokenError.message_short in res.text
- assert res.status_code == status.HTTP_400_BAD_REQUEST
-
-
-@pytest.mark.enable_implicit_clean
-@pytest.mark.enable_bookmark_creation
-class TestClaimingAsARegisteredUser(OsfTestCase):
-
- def setUp(self):
- super().setUp()
- self.referrer = AuthUserFactory()
- self.project = ProjectFactory(creator=self.referrer, is_public=True)
- name, email = fake.name(), fake_email()
- self.user = self.project.add_unregistered_contributor(
- fullname=name,
- email=email,
- auth=Auth(user=self.referrer)
- )
- self.project.save()
-
- def test_claim_user_registered_with_correct_password(self):
- reg_user = AuthUserFactory() # NOTE: AuthUserFactory sets password as 'queenfan86'
- url = self.user.get_claim_url(self.project._primary_key)
- # Follow to password re-enter page
- res = self.app.get(url, auth=reg_user.auth, follow_redirects=True)
-
- # verify that the "Claim Account" form is returned
- assert 'Claim Contributor' in res.text
-
- form = res.get_form('claimContributorForm')
- form['password'] = 'queenfan86'
- res = form.submit(self.app, auth=reg_user.auth)
- self.app.resolve_redirect(res)
- self.project.reload()
- self.user.reload()
- # user is now a contributor to the project
- assert reg_user in self.project.contributors
-
- # the unregistered user (self.user) is removed as a contributor, and their
- assert self.user not in self.project.contributors
-
- # unclaimed record for the project has been deleted
- assert self.project not in self.user.unclaimed_records
-
- def test_claim_user_registered_preprint_with_correct_password(self):
- preprint = PreprintFactory(creator=self.referrer)
- name, email = fake.name(), fake_email()
- unreg_user = preprint.add_unregistered_contributor(
- fullname=name,
- email=email,
- auth=Auth(user=self.referrer)
- )
- reg_user = AuthUserFactory() # NOTE: AuthUserFactory sets password as 'queenfan86'
- url = unreg_user.get_claim_url(preprint._id)
- # Follow to password re-enter page
- res = self.app.get(url, auth=reg_user.auth, follow_redirects=True)
-
- # verify that the "Claim Account" form is returned
- assert 'Claim Contributor' in res.text
-
- form = res.get_form('claimContributorForm')
- form['password'] = 'queenfan86'
- res = form.submit(self.app, auth=reg_user.auth)
-
- preprint.reload()
- unreg_user.reload()
- # user is now a contributor to the project
- assert reg_user in preprint.contributors
-
- # the unregistered user (unreg_user) is removed as a contributor, and their
- assert unreg_user not in preprint.contributors
-
- # unclaimed record for the project has been deleted
- assert preprint not in unreg_user.unclaimed_records
-
-
-@mock.patch('website.mails.settings.USE_EMAIL', True)
-@mock.patch('website.mails.settings.USE_CELERY', False)
-class TestResendConfirmation(OsfTestCase):
-
- def setUp(self):
- super().setUp()
- self.unconfirmed_user = UnconfirmedUserFactory()
- self.confirmed_user = UserFactory()
- self.get_url = web_url_for('resend_confirmation_get')
- self.post_url = web_url_for('resend_confirmation_post')
-
- self.mock_send_grid = start_mock_send_grid(self)
-
- # test that resend confirmation page is load correctly
- def test_resend_confirmation_get(self):
- res = self.app.get(self.get_url)
- assert res.status_code == 200
- assert 'Resend Confirmation' in res.text
- assert res.get_form('resendForm')
-
- # test that unconfirmed user can receive resend confirmation email
- def test_can_receive_resend_confirmation_email(self):
- # load resend confirmation page and submit email
- res = self.app.get(self.get_url)
- form = res.get_form('resendForm')
- form['email'] = self.unconfirmed_user.unconfirmed_emails[0]
- res = form.submit(self.app)
-
- # check email, request and response
- assert self.mock_send_grid.called
- assert res.status_code == 200
- assert res.request.path == self.post_url
- assert_in_html('If there is an OSF account', res.text)
-
- # test that confirmed user cannot receive resend confirmation email
- def test_cannot_receive_resend_confirmation_email_1(self):
- # load resend confirmation page and submit email
- res = self.app.get(self.get_url)
- form = res.get_form('resendForm')
- form['email'] = self.confirmed_user.emails.first().address
- res = form.submit(self.app)
-
- # check email, request and response
- assert not self.mock_send_grid.called
- assert res.status_code == 200
- assert res.request.path == self.post_url
- assert_in_html('has already been confirmed', res.text)
-
- # test that non-existing user cannot receive resend confirmation email
- def test_cannot_receive_resend_confirmation_email_2(self):
- # load resend confirmation page and submit email
- res = self.app.get(self.get_url)
- form = res.get_form('resendForm')
- form['email'] = 'random@random.com'
- res = form.submit(self.app)
-
- # check email, request and response
- assert not self.mock_send_grid.called
- assert res.status_code == 200
- assert res.request.path == self.post_url
- assert_in_html('If there is an OSF account', res.text)
-
- # test that user cannot submit resend confirmation request too quickly
- def test_cannot_resend_confirmation_twice_quickly(self):
- # load resend confirmation page and submit email
- res = self.app.get(self.get_url)
- form = res.get_form('resendForm')
- form['email'] = self.unconfirmed_user.email
- res = form.submit(self.app)
- res = form.submit(self.app)
-
- # check request and response
- assert res.status_code == 200
- assert_in_html('Please wait', res.text)
-
-
-@mock.patch('website.mails.settings.USE_EMAIL', True)
-@mock.patch('website.mails.settings.USE_CELERY', False)
-class TestForgotPassword(OsfTestCase):
-
- def setUp(self):
- super().setUp()
- self.user = UserFactory()
- self.auth_user = AuthUserFactory()
- self.get_url = web_url_for('forgot_password_get')
- self.post_url = web_url_for('forgot_password_post')
- self.user.verification_key_v2 = {}
- self.user.save()
-
- self.mock_send_grid = start_mock_send_grid(self)
- self.start_mock_notification_send = start_mock_notification_send(self)
-
- # log users out before they land on forgot password page
- def test_forgot_password_logs_out_user(self):
- # visit forgot password link while another user is logged in
- res = self.app.get(self.get_url, auth=self.auth_user.auth)
- # check redirection to CAS logout
- assert res.status_code == 302
- location = res.headers.get('Location')
- assert 'reauth' not in location
- assert 'logout?service=' in location
- assert 'forgotpassword' in location
-
- # test that forgot password page is loaded correctly
- def test_get_forgot_password(self):
- res = self.app.get(self.get_url)
- assert res.status_code == 200
- assert 'Forgot Password' in res.text
- assert res.get_form('forgotPasswordForm')
-
- # test that existing user can receive reset password email
- def test_can_receive_reset_password_email(self):
- # load forgot password page and submit email
- res = self.app.get(self.get_url)
- form = res.get_form('forgotPasswordForm')
- form['forgot_password-email'] = self.user.username
- res = form.submit(self.app)
-
- # check mail was sent
- assert self.start_mock_notification_send.called
- # check http 200 response
- assert res.status_code == 200
- # check request URL is /forgotpassword
- assert res.request.path == self.post_url
- # check push notification
- assert_in_html('If there is an OSF account', res.text)
- assert_not_in_html('Please wait', res.text)
-
- # check verification_key_v2 is set
- self.user.reload()
- assert self.user.verification_key_v2 != {}
-
- # test that non-existing user cannot receive reset password email
- def test_cannot_receive_reset_password_email(self):
- # load forgot password page and submit email
- res = self.app.get(self.get_url)
- form = res.get_form('forgotPasswordForm')
- form['forgot_password-email'] = 'fake' + self.user.username
- res = form.submit(self.app)
-
- # check mail was not sent
- assert not self.mock_send_grid.called
- # check http 200 response
- assert res.status_code == 200
- # check request URL is /forgotpassword
- assert res.request.path == self.post_url
- # check push notification
- assert_in_html('If there is an OSF account', res.text)
- assert_not_in_html('Please wait', res.text)
-
- # check verification_key_v2 is not set
- self.user.reload()
- assert self.user.verification_key_v2 == {}
-
- # test that non-existing user cannot receive reset password email
- def test_not_active_user_no_reset_password_email(self):
- self.user.deactivate_account()
- self.user.save()
-
- # load forgot password page and submit email
- res = self.app.get(self.get_url)
- form = res.get_form('forgotPasswordForm')
- form['forgot_password-email'] = self.user.username
- res = form.submit(self.app)
-
- # check mail was not sent
- assert not self.mock_send_grid.called
- # check http 200 response
- assert res.status_code == 200
- # check request URL is /forgotpassword
- assert res.request.path == self.post_url
- # check push notification
- assert_in_html('If there is an OSF account', res.text)
- assert_not_in_html('Please wait', res.text)
-
- # check verification_key_v2 is not set
- self.user.reload()
- assert self.user.verification_key_v2 == {}
-
- # test that user cannot submit forgot password request too quickly
- def test_cannot_reset_password_twice_quickly(self):
- # load forgot password page and submit email
- res = self.app.get(self.get_url)
- form = res.get_form('forgotPasswordForm')
- form['forgot_password-email'] = self.user.username
- res = form.submit(self.app)
- res = form.submit(self.app)
-
- # check http 200 response
- assert res.status_code == 200
- # check push notification
- assert_in_html('Please wait', res.text)
- assert_not_in_html('If there is an OSF account', res.text)
-
-
-@mock.patch('website.mails.settings.USE_EMAIL', True)
-@mock.patch('website.mails.settings.USE_CELERY', False)
-class TestForgotPasswordInstitution(OsfTestCase):
-
- def setUp(self):
- super().setUp()
- self.user = UserFactory()
- self.auth_user = AuthUserFactory()
- self.get_url = web_url_for('redirect_unsupported_institution')
- self.post_url = web_url_for('forgot_password_institution_post')
- self.user.verification_key_v2 = {}
- self.user.save()
-
- self.mock_send_grid = start_mock_send_grid(self)
- self.start_mock_notification_send = start_mock_notification_send(self)
-
- # log users out before they land on institutional forgot password page
- def test_forgot_password_logs_out_user(self):
- # TODO: check in qa url encoding
- # visit forgot password link while another user is logged in
- res = self.app.get(self.get_url, auth=self.auth_user.auth)
- # check redirection to CAS logout
- assert res.status_code == 302
- location = res.headers.get('Location')
- assert quote_plus('campaign=unsupportedinstitution') in location
- assert 'logout?service=' in location
-
- # test that institutional forgot password page redirects to CAS unsupported
- # institution page
- def test_get_forgot_password(self):
- res = self.app.get(self.get_url)
- assert res.status_code == 302
- location = res.headers.get('Location')
- assert 'campaign=unsupportedinstitution' in location
-
- # test that user from disabled institution can receive reset password email
- def test_can_receive_reset_password_email(self):
- # submit email to institutional forgot-password page
- res = self.app.post(self.post_url, data={'forgot_password-email': self.user.username})
-
- # check mail was sent
- assert self.start_mock_notification_send.called
- # check http 200 response
- assert res.status_code == 200
- # check request URL is /forgotpassword
- assert res.request.path == self.post_url
- # check push notification
- assert_in_html('If there is an OSF account', res.text)
- assert_not_in_html('Please wait', res.text)
-
- # check verification_key_v2 is set
- self.user.reload()
- assert self.user.verification_key_v2 != {}
-
- # test that non-existing user cannot receive reset password email
- def test_cannot_receive_reset_password_email(self):
- # load forgot password page and submit email
- res = self.app.post(self.post_url, data={'forgot_password-email': 'fake' + self.user.username})
-
- # check mail was not sent
- assert not self.mock_send_grid.called
- # check http 200 response
- assert res.status_code == 200
- # check request URL is /forgotpassword-institution
- assert res.request.path == self.post_url
- # check push notification
- assert_in_html('If there is an OSF account', res.text)
- assert_not_in_html('Please wait', res.text)
-
- # check verification_key_v2 is not set
- self.user.reload()
- assert self.user.verification_key_v2 == {}
-
- # test that a deactivated user cannot receive institutional reset password email
- def test_not_active_user_no_reset_password_email(self):
- self.user.deactivate_account()
- self.user.save()
-
- res = self.app.post(self.post_url, data={'forgot_password-email': self.user.username})
-
- # check mail was not sent
- assert not self.mock_send_grid.called
- # check http 200 response
- assert res.status_code == 200
- # check request URL is /forgotpassword-institution
- assert res.request.path == self.post_url
- # check push notification
- assert_in_html('If there is an OSF account', res.text)
- assert_not_in_html('Please wait', res.text)
-
- # check verification_key_v2 is not set
- self.user.reload()
- assert self.user.verification_key_v2 == {}
-
- # test that user cannot submit forgot password request too quickly
- def test_cannot_reset_password_twice_quickly(self):
- # submit institutional forgot-password request in rapid succession
- res = self.app.post(self.post_url, data={'forgot_password-email': self.user.username})
- res = self.app.post(self.post_url, data={'forgot_password-email': self.user.username})
-
- # check http 200 response
- assert res.status_code == 200
- # check push notification
- assert_in_html('Please wait', res.text)
- assert_not_in_html('If there is an OSF account', res.text)
-
-
-@unittest.skip('Public projects/components are dynamically loaded now.')
-class TestAUserProfile(OsfTestCase):
-
- def setUp(self):
- OsfTestCase.setUp(self)
-
- self.user = AuthUserFactory()
- self.me = AuthUserFactory()
- self.project = ProjectFactory(creator=self.me, is_public=True, title=fake.bs())
- self.component = NodeFactory(creator=self.me, parent=self.project, is_public=True, title=fake.bs())
-
- # regression test for https://github.com/CenterForOpenScience/osf.io/issues/2623
- def test_has_public_projects_and_components(self):
- # I go to my own profile
- url = web_url_for('profile_view_id', uid=self.me._primary_key)
- # I see the title of both my project and component
- res = self.app.get(url, auth=self.me.auth)
- assert_in_html(self.component.title, res)
- assert_in_html(self.project.title, res)
-
- # Another user can also see my public project and component
- url = web_url_for('profile_view_id', uid=self.me._primary_key)
- # I see the title of both my project and component
- res = self.app.get(url, auth=self.user.auth)
- assert_in_html(self.component.title, res)
- assert_in_html(self.project.title, res)
-
- def test_shows_projects_with_many_contributors(self):
- # My project has many contributors
- for _ in range(5):
- user = UserFactory()
- self.project.add_contributor(user, auth=Auth(self.project.creator), save=True)
-
- # I go to my own profile
- url = web_url_for('profile_view_id', uid=self.me._primary_key)
- res = self.app.get(url, auth=self.me.auth)
- # I see '3 more' as a link
- assert '3 more' in res.text
-
- res = res.click('3 more')
- assert res.request.path == self.project.url
-
- def test_has_no_public_projects_or_components_on_own_profile(self):
- # User goes to their profile
- url = web_url_for('profile_view_id', uid=self.user._id)
- res = self.app.get(url, auth=self.user.auth)
-
- # user has no public components/projects
- assert 'You have no public projects' in res
- assert 'You have no public components' in res
-
- def test_user_no_public_projects_or_components(self):
- # I go to other user's profile
- url = web_url_for('profile_view_id', uid=self.user._id)
- # User has no public components/projects
- res = self.app.get(url, auth=self.me.auth)
- assert 'This user has no public projects' in res
- assert 'This user has no public components' in res
-
- # regression test
- def test_does_not_show_registrations(self):
- project = ProjectFactory(creator=self.user)
- component = NodeFactory(parent=project, creator=self.user, is_public=False)
- # User has a registration with public components
- reg = RegistrationFactory(project=component.parent_node, creator=self.user, is_public=True)
- for each in reg.nodes:
- each.is_public = True
- each.save()
- # I go to other user's profile
- url = web_url_for('profile_view_id', uid=self.user._id)
- # Registration does not appear on profile
- res = self.app.get(url, auth=self.me.auth)
- assert 'This user has no public components' in res
- assert reg.title not in res
- assert reg.nodes[0].title not in res
-
-
@pytest.mark.enable_bookmark_creation
class TestPreprintBannerView(OsfTestCase):
def setUp(self):
From 68b04946ef3226c341c3de9bfad55e586f6244cb Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Fri, 25 Jul 2025 15:44:14 -0400
Subject: [PATCH 121/336] more clean-up from self-CR
---
api/nodes/serializers.py | 12 ++++--------
api/nodes/views.py | 4 ++--
api/providers/serializers.py | 11 +++++------
api/providers/tasks.py | 26 ++++++++++++--------------
api/users/views.py | 31 +++++++++++++++++++++----------
tests/test_forgot_password.py | 2 +-
website/language.py | 3 +++
7 files changed, 48 insertions(+), 41 deletions(-)
diff --git a/api/nodes/serializers.py b/api/nodes/serializers.py
index 4154dfbb5f8..8725479456d 100644
--- a/api/nodes/serializers.py
+++ b/api/nodes/serializers.py
@@ -1256,15 +1256,11 @@ def create(self, validated_data):
try:
contributor_dict = {
- 'auth': auth,
- 'user_id': id,
- 'email': email,
- 'full_name': full_name,
- 'send_email': send_email,
- 'bibliographic': bibliographic,
- 'index': index,
- 'permissions': permissions,
+ 'auth': auth, 'user_id': id, 'email': email, 'full_name': full_name, 'send_email': send_email,
+ 'bibliographic': bibliographic, 'index': index,
}
+
+ contributor_dict['permissions'] = permissions
contributor_obj = node.add_contributor_registered_or_not(**contributor_dict)
except ValidationError as e:
raise exceptions.ValidationError(detail=e.messages[0])
diff --git a/api/nodes/views.py b/api/nodes/views.py
index 14e104b4de0..50ba08cb7fe 100644
--- a/api/nodes/views.py
+++ b/api/nodes/views.py
@@ -1051,7 +1051,7 @@ def perform_create(self, serializer):
user=user,
event_context={
'guid': node._id,
- 'title': node._id,
+ 'title': node.title,
'can_change_preferences': False,
},
)
@@ -1063,7 +1063,7 @@ def perform_create(self, serializer):
user=user,
event_context={
'guid': fork._id,
- 'title': node._id,
+ 'title': node.title,
'can_change_preferences': False,
},
)
diff --git a/api/providers/serializers.py b/api/providers/serializers.py
index b10f8290bd8..673e22a1b0e 100644
--- a/api/providers/serializers.py
+++ b/api/providers/serializers.py
@@ -347,10 +347,6 @@ def create(self, validated_data):
if bool(get_perms(user, provider)):
raise ValidationError('Specified user is already a moderator.')
- if 'claim_url' in context:
- template = NotificationType.Type.PROVIDER_CONFIRM_EMAIL_MODERATION
- else:
- template = NotificationType.Type.PROVIDER_MODERATOR_ADDED
perm_group = validated_data.pop('permission_group', '')
if perm_group not in REVIEW_GROUPS:
@@ -362,9 +358,12 @@ def create(self, validated_data):
provider.add_to_group(user, perm_group)
setattr(user, 'permission_group', perm_group) # Allows reserialization
- print(template, context)
+ if 'claim_url' in context:
+ notification_type = NotificationType.Type.PROVIDER_CONFIRM_EMAIL_MODERATION
+ else:
+ notification_type = NotificationType.Type.PROVIDER_MODERATOR_ADDED
NotificationType.objects.get(
- name=template,
+ name=notification_type,
).emit(
user=user,
event_context=context,
diff --git a/api/providers/tasks.py b/api/providers/tasks.py
index b0a39c9c337..5891494cfb2 100644
--- a/api/providers/tasks.py
+++ b/api/providers/tasks.py
@@ -639,18 +639,6 @@ def bulk_upload_finish_job(upload, row_count, success_count, draft_errors, appro
approval_errors.sort()
if not dry_run:
upload.save()
- notification_type = None
- event_context = {
- 'initiator_fullname': initiator.fullname,
- 'auto_approval': auto_approval,
- 'count': row_count,
- 'pending_submissions_url': get_registration_provider_submissions_url(provider),
- 'draft_errors': draft_errors,
- 'approval_errors': approval_errors,
- 'successes': success_count,
- 'failures': len(draft_errors),
- 'osf_support_email': settings.OSF_SUPPORT_EMAIL,
- }
if upload.state == JobState.DONE_FULL:
notification_type = NotificationType.Type.USER_REGISTRATION_BULK_UPLOAD_SUCCESS_ALL
@@ -666,12 +654,22 @@ def bulk_upload_finish_job(upload, row_count, success_count, draft_errors, appro
name=notification_type,
).emit(
user=initiator,
- event_context=event_context,
+ event_context={
+ 'initiator_fullname': initiator.fullname,
+ 'auto_approval': auto_approval,
+ 'count': row_count,
+ 'pending_submissions_url': get_registration_provider_submissions_url(provider),
+ 'draft_errors': draft_errors,
+ 'approval_errors': approval_errors,
+ 'successes': success_count,
+ 'failures': len(draft_errors),
+ 'osf_support_email': settings.OSF_SUPPORT_EMAIL,
+ },
)
upload.email_sent = timezone.now()
upload.save()
- logger.info(f'Notification sent to bulk upload initiator [{initiator._id}]')
+ logger.info(f'Email sent to bulk upload initiator [{initiator._id}]')
def handle_internal_error(initiator=None, provider=None, message=None, dry_run=True):
diff --git a/api/users/views.py b/api/users/views.py
index df2d2a215e6..8fabe5b54ee 100644
--- a/api/users/views.py
+++ b/api/users/views.py
@@ -825,29 +825,33 @@ def get(self, request, *args, **kwargs):
if not email:
raise ValidationError('Request must include email in query params.')
- institutional = bool(request.query_params.get('institutional', None))
- mail_template = 'forgot_password' if not institutional else 'forgot_password_institution'
-
status_message = language.RESET_PASSWORD_SUCCESS_STATUS_MESSAGE.format(email=email)
- kind = 'success'
# check if the user exists
user_obj = get_user(email=email)
+ institutional = bool(request.query_params.get('institutional', None))
if user_obj:
# rate limit forgot_password_post
if not throttle_period_expired(user_obj.email_last_sent, settings.SEND_EMAIL_THROTTLE):
- status_message = 'You have recently requested to change your password. Please wait a few minutes ' \
- 'before trying again.'
- kind = 'error'
- return Response({'message': status_message, 'kind': kind}, status=status.HTTP_429_TOO_MANY_REQUESTS)
+ return Response(
+ {
+ 'message': language.THROTTLE_PASSWORD_CHANGE_ERROR_MESSAGE,
+ 'kind': 'error',
+ },
+ status=status.HTTP_429_TOO_MANY_REQUESTS,
+ )
elif user_obj.is_active:
# new random verification key (v2)
user_obj.verification_key_v2 = generate_verification_key(verification_type='password')
user_obj.email_last_sent = timezone.now()
user_obj.save()
reset_link = f'{settings.RESET_PASSWORD_URL}{user_obj._id}/{user_obj.verification_key_v2['token']}/'
+ if institutional:
+ notification_type = NotificationType.Type.USER_FORGOT_PASSWORD_INSTITUTION
+ else:
+ notification_type = NotificationType.Type.USER_FORGOT_PASSWORD
- NotificationType.objects.get(name=mail_template).emit(
+ NotificationType.objects.get(name=notification_type).emit(
user=user_obj,
message_frequency='instantly',
event_context={
@@ -856,7 +860,14 @@ def get(self, request, *args, **kwargs):
},
)
- return Response(status=status.HTTP_200_OK, data={'message': status_message, 'kind': kind, 'institutional': institutional})
+ return Response(
+ status=status.HTTP_200_OK,
+ data={
+ 'message': status_message,
+ 'kind': 'success',
+ 'institutional': institutional,
+ },
+ )
@method_decorator(csrf_protect)
def post(self, request, *args, **kwargs):
diff --git a/tests/test_forgot_password.py b/tests/test_forgot_password.py
index 9ca6df4fdab..0a383d30fd9 100644
--- a/tests/test_forgot_password.py
+++ b/tests/test_forgot_password.py
@@ -168,7 +168,7 @@ def test_can_receive_reset_password_email(self):
# check mail was sent
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
+ assert notifications[0]['type'] == NotificationType.Type.USER_FORGOT_PASSWORD_INSTITUTION
# check http 200 response
assert res.status_code == 200
# check request URL is /forgotpassword
diff --git a/website/language.py b/website/language.py
index 80936924e6a..605773694d2 100644
--- a/website/language.py
+++ b/website/language.py
@@ -222,6 +222,9 @@
'you should have, please contact OSF Support. '
)
+THROTTLE_PASSWORD_CHANGE_ERROR_MESSAGE = \
+ 'You have recently requested to change your password. Please wait a few minutes before trying again.'
+
SANCTION_STATUS_MESSAGES = {
'registration': {
'approve': 'Your registration approval has been accepted.',
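Taken together, the api/users/views.py and website/language.py hunks in this patch route the forgot-password email through the NotificationType registry instead of a hard-coded mail template name. A minimal sketch of the resulting flow, using only names that appear in the hunks above (the helper name and the event_context keys are illustrative, not taken from the codebase):

    from django.utils import timezone
    from osf.models import NotificationType

    def emit_forgot_password(user_obj, institutional, reset_link):
        # Pick the institution-specific type when the request came from the
        # institutional form, mirroring the branch added in api/users/views.py.
        if institutional:
            name = NotificationType.Type.USER_FORGOT_PASSWORD_INSTITUTION
        else:
            name = NotificationType.Type.USER_FORGOT_PASSWORD
        user_obj.email_last_sent = timezone.now()
        user_obj.save()
        NotificationType.objects.get(name=name).emit(
            user=user_obj,
            message_frequency='instantly',
            # event_context keys below are placeholders; the real view builds
            # its own context (reset link, preference flags, etc.).
            event_context={'reset_link': reset_link},
        )
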
From d65288ecfa9b025fa62114d1cdf41f6a0d5f0118 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Sun, 27 Jul 2025 12:04:34 -0400
Subject: [PATCH 122/336] fix user claim and forgot password email notifications
---
api/users/views.py | 4 +-
api_tests/users/views/test_user_claim.py | 106 +++++++++++++----------
notifications.yaml | 4 +
tests/test_forgot_password.py | 2 +-
4 files changed, 65 insertions(+), 51 deletions(-)
diff --git a/api/users/views.py b/api/users/views.py
index 8fabe5b54ee..7cd8947a79a 100644
--- a/api/users/views.py
+++ b/api/users/views.py
@@ -1085,7 +1085,7 @@ def _process_external_identity(self, user, external_identity, service_url):
message_frequency='instantly',
event_context={
'can_change_preferences': False,
- 'external_id_provider': provider.name,
+ 'external_id_provider': provider,
},
)
enqueue_task(update_affiliation_for_orcid_sso_users.s(user._id, provider_id))
@@ -1408,7 +1408,7 @@ def post(self, request, *args, **kwargs):
message_frequency='instantly',
event_context={
'can_change_preferences': False,
- 'external_id_provider': provider.name,
+ 'external_id_provider': provider,
},
)
diff --git a/api_tests/users/views/test_user_claim.py b/api_tests/users/views/test_user_claim.py
index d5f5967df57..ddd7cfad4e5 100644
--- a/api_tests/users/views/test_user_claim.py
+++ b/api_tests/users/views/test_user_claim.py
@@ -5,14 +5,16 @@
from api.users.views import ClaimUser
from api_tests.utils import only_supports_methods
from framework.auth.core import Auth
+from osf.models import NotificationType
from osf_tests.factories import (
AuthUserFactory,
ProjectFactory,
PreprintFactory,
)
+from tests.utils import capture_notifications
+
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestClaimUser:
@pytest.fixture()
@@ -115,41 +117,47 @@ def test_claim_unauth_failure(self, app, url, unreg_user, project, wrong_preprin
)
assert res.status_code == 401
- def test_claim_unauth_success_with_original_email(self, app, url, project, unreg_user, mock_send_grid):
- mock_send_grid.reset_mock()
- res = app.post_json_api(
- url.format(unreg_user._id),
- self.payload(email='david@david.son', id=project._id),
- )
+ def test_claim_unauth_success_with_original_email(self, app, url, project, unreg_user):
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url.format(unreg_user._id),
+ self.payload(email='david@david.son', id=project._id),
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_INVITE_DEFAULT
assert res.status_code == 204
- assert mock_send_grid.call_count == 1
- def test_claim_unauth_success_with_claimer_email(self, app, url, unreg_user, project, claimer, mock_send_grid):
- mock_send_grid.reset_mock()
- res = app.post_json_api(
- url.format(unreg_user._id),
- self.payload(email=claimer.username, id=project._id)
- )
+ def test_claim_unauth_success_with_claimer_email(self, app, url, unreg_user, project, claimer):
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url.format(unreg_user._id),
+ self.payload(email=claimer.username, id=project._id)
+ )
assert res.status_code == 204
- assert mock_send_grid.call_count == 2
-
- def test_claim_unauth_success_with_unknown_email(self, app, url, project, unreg_user, mock_send_grid):
- mock_send_grid.reset_mock()
- res = app.post_json_api(
- url.format(unreg_user._id),
- self.payload(email='asdf@fdsa.com', id=project._id),
- )
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.USER_FORWARD_INVITE_REGISTERED
+ assert notifications[1]['type'] == NotificationType.Type.USER_PENDING_VERIFICATION_REGISTERED
+
+ def test_claim_unauth_success_with_unknown_email(self, app, url, project, unreg_user):
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url.format(unreg_user._id),
+ self.payload(email='asdf@fdsa.com', id=project._id),
+ )
assert res.status_code == 204
- assert mock_send_grid.call_count == 2
-
- def test_claim_unauth_success_with_preprint_id(self, app, url, preprint, unreg_user, mock_send_grid):
- mock_send_grid.reset_mock()
- res = app.post_json_api(
- url.format(unreg_user._id),
- self.payload(email='david@david.son', id=preprint._id),
- )
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.USER_PENDING_VERIFICATION
+ assert notifications[1]['type'] == NotificationType.Type.USER_FORWARD_INVITE
+
+ def test_claim_unauth_success_with_preprint_id(self, app, url, preprint, unreg_user):
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url.format(unreg_user._id),
+ self.payload(email='david@david.son', id=preprint._id),
+ )
assert res.status_code == 204
- assert mock_send_grid.call_count == 1
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_INVITE_DEFAULT
def test_claim_auth_failure(self, app, url, claimer, wrong_preprint, project, unreg_user, referrer):
_url = url.format(unreg_user._id)
@@ -208,26 +216,28 @@ def test_claim_auth_failure(self, app, url, claimer, wrong_preprint, project, un
)
assert res.status_code == 403
- def test_claim_auth_throttle_error(self, app, url, claimer, unreg_user, project, mock_send_grid):
+ def test_claim_auth_throttle_error(self, app, url, claimer, unreg_user, project):
unreg_user.unclaimed_records[project._id]['last_sent'] = timezone.now()
unreg_user.save()
- mock_send_grid.reset_mock()
- res = app.post_json_api(
- url.format(unreg_user._id),
- self.payload(id=project._id),
- auth=claimer.auth,
- expect_errors=True
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url.format(unreg_user._id),
+ self.payload(id=project._id),
+ auth=claimer.auth,
+ expect_errors=True
+ )
+ assert not notifications
assert res.status_code == 400
assert res.json['errors'][0]['detail'] == 'User account can only be claimed with an existing user once every 24 hours'
- assert mock_send_grid.call_count == 0
- def test_claim_auth_success(self, app, url, claimer, unreg_user, project, mock_send_grid):
- mock_send_grid.reset_mock()
- res = app.post_json_api(
- url.format(unreg_user._id),
- self.payload(id=project._id),
- auth=claimer.auth
- )
+ def test_claim_auth_success(self, app, url, claimer, unreg_user, project):
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url.format(unreg_user._id),
+ self.payload(id=project._id),
+ auth=claimer.auth
+ )
assert res.status_code == 204
- assert mock_send_grid.call_count == 2
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.USER_FORWARD_INVITE_REGISTERED
+ assert notifications[1]['type'] == NotificationType.Type.USER_PENDING_VERIFICATION_REGISTERED
diff --git a/notifications.yaml b/notifications.yaml
index 5be39abc492..8b3e1fc7ea3 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -108,6 +108,10 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/duplicate_accounts_sso_osf4i.html.mako'
+ - name: user_forgot_password
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/forgot_password.html.mako'
- name: user_forgot_password_institution
__docs__: ...
object_content_type_model_name: osfuser
diff --git a/tests/test_forgot_password.py b/tests/test_forgot_password.py
index 0a383d30fd9..4d00f70d688 100644
--- a/tests/test_forgot_password.py
+++ b/tests/test_forgot_password.py
@@ -50,7 +50,7 @@ def test_can_receive_reset_password_email(self):
res = form.submit(self.app)
# check mail was sent
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
+ assert notifications[0]['type'] == NotificationType.Type.USER_FORGOT_PASSWORD
# check http 200 response
assert res.status_code == 200
# check request URL is /forgotpassword
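
The tests touched in this patch all follow the same capture_notifications pattern. A compact sketch of that pattern, assuming only the names imported in the hunks above (trigger_forgot_password is a hypothetical stand-in for the view call under test):

    from osf.models import NotificationType
    from tests.utils import capture_notifications

    def assert_forgot_password_emitted(trigger_forgot_password, user):
        # Capture everything emitted through the NotificationType registry
        # while the code under test runs, then assert on the notification
        # type rather than on a mocked send_grid call.
        with capture_notifications() as notifications:
            trigger_forgot_password(user)
        assert len(notifications) == 1
        assert notifications[0]['type'] == NotificationType.Type.USER_FORGOT_PASSWORD
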
From 9501714f2230fe23ddb19a89fea0aa9db7d33ee2 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Sun, 27 Jul 2025 16:22:21 -0400
Subject: [PATCH 123/336] remove old notification mocking and simplify celery
config
---
addons/boa/tasks.py | 239 ++++++++++++-----
addons/boa/tests/test_tasks.py | 75 +++---
.../views/test_crossref_email_response.py | 46 ++--
...est_draft_registration_contributor_list.py | 11 +-
.../views/test_draft_registration_list.py | 13 +-
.../test_institution_relationship_nodes.py | 1 -
api_tests/mailhog/test_mailhog.py | 8 +-
.../views/test_node_contributors_list.py | 28 +-
api_tests/nodes/views/test_node_forks_list.py | 23 +-
.../test_node_relationship_institutions.py | 17 +-
.../views/test_preprint_contributors_list.py | 55 ++--
...est_collections_provider_moderator_list.py | 41 +--
.../test_preprint_provider_moderator_list.py | 21 +-
api_tests/providers/tasks/test_bulk_upload.py | 1 -
.../views/test_registration_detail.py | 9 +-
.../test_node_request_institutional_access.py | 61 ++---
.../requests/views/test_node_request_list.py | 1 -
.../views/test_preprint_request_list.py | 7 -
.../views/test_request_actions_create.py | 36 +--
api_tests/users/views/test_user_confirm.py | 99 ++++---
api_tests/users/views/test_user_list.py | 129 ++++-----
.../test_user_message_institutional_access.py | 81 +++---
api_tests/users/views/test_user_settings.py | 22 +-
.../users/views/test_user_settings_detail.py | 9 +-
.../test_user_settings_reset_password.py | 22 +-
conftest.py | 33 ---
docker-compose.yml | 4 +-
framework/email/__init__.py | 0
framework/email/tasks.py | 227 ----------------
notifications.yaml | 10 +-
osf/email/__init__.py | 40 ++-
osf/models/notification.py | 19 +-
osf/models/notification_type.py | 12 +-
osf/models/user_message.py | 18 +-
.../test_check_crossref_dois.py | 13 +-
.../test_email_all_users.py | 34 ++-
osf_tests/test_archiver.py | 107 ++++----
osf_tests/test_collection.py | 21 +-
osf_tests/test_collection_submission.py | 2 -
osf_tests/test_institution.py | 21 +-
osf_tests/test_merging_users.py | 12 +-
osf_tests/test_queued_mail.py | 155 -----------
osf_tests/test_sanctions.py | 2 -
osf_tests/test_schema_responses.py | 123 +++++----
osf_tests/test_user.py | 32 ++-
.../test_deactivate_requested_accounts.py | 19 +-
scripts/tests/test_send_queued_mails.py | 84 ------
tests/base.py | 3 -
tests/framework_tests/test_email.py | 108 --------
tests/test_auth.py | 44 ++--
tests/test_auth_views.py | 4 -
tests/test_misc_views.py | 23 +-
tests/test_preprints.py | 8 +-
tests/test_registrations/test_embargoes.py | 11 +-
tests/test_registrations/test_retractions.py | 36 +--
tests/test_spam_mixin.py | 20 +-
tests/test_user_profile_view.py | 3 -
website/mails/mails.py | 21 +-
website/notifications/tasks.py | 227 ----------------
website/settings/defaults.py | 245 +++++++-----------
website/settings/local-ci.py | 1 -
website/settings/local-dist.py | 1 -
62 files changed, 1027 insertions(+), 1771 deletions(-)
delete mode 100644 framework/email/__init__.py
delete mode 100644 framework/email/tasks.py
delete mode 100644 osf_tests/test_queued_mail.py
delete mode 100644 scripts/tests/test_send_queued_mails.py
delete mode 100644 tests/framework_tests/test_email.py
delete mode 100644 website/notifications/tasks.py
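
The largest file below, addons/boa/tasks.py, drops the asyncio wrapper and turns the Boa submission into a plain synchronous Celery task. A minimal sketch of the resulting polling loop, assuming a boa_job object with the same interface the hunks exercise (is_running/refresh); the helper name is illustrative:

    import time

    from addons.boa import settings as boa_settings
    from addons.boa.boa_error_code import BoaErrorCode

    def wait_for_boa_job(boa_job):
        # time.sleep() replaces the old asyncio.sleep(), so no event loop is needed.
        start_time = time.time()
        while boa_job.is_running():
            if time.time() - start_time > boa_settings.MAX_JOB_WAITING_TIME:
                return BoaErrorCode.JOB_TIME_OUT_ERROR
            boa_job.refresh()
            time.sleep(boa_settings.REFRESH_JOB_INTERVAL)
        return BoaErrorCode.NO_ERROR
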
diff --git a/addons/boa/tasks.py b/addons/boa/tasks.py
index a64110e69b5..4b8753e5b39 100644
--- a/addons/boa/tasks.py
+++ b/addons/boa/tasks.py
@@ -1,9 +1,7 @@
-import asyncio
from http.client import HTTPException
import logging
import time
-from asgiref.sync import async_to_sync, sync_to_async
from boaapi.boa_client import BoaClient, BoaException
from boaapi.status import CompilerStatus, ExecutionStatus
from urllib import request
@@ -14,10 +12,9 @@
from addons.boa.boa_error_code import BoaErrorCode
from framework import sentry
from framework.celery_tasks import app as celery_app
-from osf.models import OSFUser
+from osf.models import OSFUser, NotificationType
from osf.utils.fields import ensure_str, ensure_bytes
from website import settings as osf_settings
-from website.mails import send_mail, ADDONS_BOA_JOB_COMPLETE, ADDONS_BOA_JOB_FAILURE
logger = logging.getLogger(__name__)
@@ -38,14 +35,34 @@ def submit_to_boa(host, username, password, user_guid, project_guid,
* Running asyncio in celery is tricky. Refer to the discussion below for details:
* https://stackoverflow.com/questions/39815771/how-to-combine-celery-with-asyncio
"""
- return async_to_sync(submit_to_boa_async)(host, username, password, user_guid, project_guid,
- query_dataset, query_file_name, file_size, file_full_path,
- query_download_url, output_upload_url)
+ return _submit_to_boa(
+ host,
+ username,
+ password,
+ user_guid,
+ project_guid,
+ query_dataset,
+ query_file_name,
+ file_size,
+ file_full_path,
+ query_download_url,
+ output_upload_url
+ )
-async def submit_to_boa_async(host, username, password, user_guid, project_guid,
- query_dataset, query_file_name, file_size, file_full_path,
- query_download_url, output_upload_url):
+def _submit_to_boa(
+ host,
+ username,
+ password,
+ user_guid,
+ project_guid,
+ query_dataset,
+ query_file_name,
+ file_size,
+ file_full_path,
+ query_download_url,
+ output_upload_url
+):
"""
Download Boa query file, submit it to Boa API, wait for Boa to finish the job
and upload result output to OSF. Send success / failure emails notifications.
@@ -55,19 +72,27 @@ async def submit_to_boa_async(host, username, password, user_guid, project_guid,
* See notes in ``submit_to_boa()`` for details.
"""
- logger.debug('>>>>>>>> Task begins')
- user = await sync_to_async(OSFUser.objects.get)(guids___id=user_guid)
- cookie_value = (await sync_to_async(user.get_or_create_cookie)()).decode()
+ user = OSFUser.objects.get(guids___id=user_guid)
+ cookie_value = user.get_or_create_cookie().decode()
project_url = f'{osf_settings.DOMAIN}{project_guid}/'
- output_file_name = query_file_name.replace('.boa', boa_settings.OUTPUT_FILE_SUFFIX)
+ output_file_name = query_file_name.replace(
+ '.boa',
+ boa_settings.OUTPUT_FILE_SUFFIX
+ )
if file_size > boa_settings.MAX_SUBMISSION_SIZE:
message = f'Boa query file too large to submit: user=[{user_guid}], project=[{project_guid}], ' \
f'file_name=[{query_file_name}], file_size=[{file_size}], ' \
f'full_path=[{file_full_path}], url=[{query_download_url}] ...'
- await sync_to_async(handle_boa_error)(message, BoaErrorCode.FILE_TOO_LARGE_ERROR,
- user.username, user.fullname, project_url, file_full_path,
- query_file_name=query_file_name, file_size=file_size)
+ handle_boa_error(
+ message,
+ BoaErrorCode.FILE_TOO_LARGE_ERROR,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name,
+ file_size=file_size
+ )
return BoaErrorCode.FILE_TOO_LARGE_ERROR
logger.debug(f'Downloading Boa query file: user=[{user_guid}], project=[{project_guid}], '
@@ -79,8 +104,14 @@ async def submit_to_boa_async(host, username, password, user_guid, project_guid,
except (ValueError, HTTPError, URLError, HTTPException):
message = f'Failed to download Boa query file: user=[{user_guid}], project=[{project_guid}], ' \
f'file_name=[{query_file_name}], full_path=[{file_full_path}], url=[{query_download_url}] ...'
- await sync_to_async(handle_boa_error)(message, BoaErrorCode.UNKNOWN, user.username, user.fullname,
- project_url, file_full_path, query_file_name=query_file_name)
+ handle_boa_error(
+ message,
+ BoaErrorCode.UNKNOWN,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name
+ )
return BoaErrorCode.UNKNOWN
logger.info('Boa query successfully downloaded.')
logger.debug(f'Boa query:\n########\n{boa_query}\n########')
@@ -93,8 +124,14 @@ async def submit_to_boa_async(host, username, password, user_guid, project_guid,
except BoaException:
# Don't call `client.close()`, since it will fail with `BoaException` if `client.login()` fails
message = f'Boa login failed: boa_username=[{username}], boa_host=[{host}]!'
- await sync_to_async(handle_boa_error)(message, BoaErrorCode.AUTHN_ERROR, user.username, user.fullname,
- project_url, file_full_path, query_file_name=query_file_name)
+ handle_boa_error(
+ message,
+ BoaErrorCode.AUTHN_ERROR,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name
+ )
return BoaErrorCode.AUTHN_ERROR
logger.info('Boa login completed.')
@@ -104,8 +141,14 @@ async def submit_to_boa_async(host, username, password, user_guid, project_guid,
except BoaException:
client.close()
message = f'Failed to retrieve or verify the target Boa dataset: dataset=[{query_dataset}]!'
- await sync_to_async(handle_boa_error)(message, BoaErrorCode.UNKNOWN, user.username, user.fullname,
- project_url, file_full_path, query_file_name=query_file_name)
+ handle_boa_error(
+ message,
+ BoaErrorCode.UNKNOWN,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name
+ )
return BoaErrorCode.UNKNOWN
logger.info('Boa dataset retrieved.')
@@ -116,8 +159,14 @@ async def submit_to_boa_async(host, username, password, user_guid, project_guid,
except BoaException:
client.close()
message = f'Failed to submit the query to Boa API: : boa_host=[{host}], dataset=[{query_dataset}]!'
- await sync_to_async(handle_boa_error)(message, BoaErrorCode.UNKNOWN, user.username, user.fullname,
- project_url, file_full_path, query_file_name=query_file_name)
+ handle_boa_error(
+ message,
+ BoaErrorCode.UNKNOWN,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name
+ )
return BoaErrorCode.UNKNOWN
logger.info('Query successfully submitted.')
logger.debug(f'Waiting for job to finish: job_id=[{str(boa_job.id)}] ...')
@@ -125,26 +174,44 @@ async def submit_to_boa_async(host, username, password, user_guid, project_guid,
if time.time() - start_time > boa_settings.MAX_JOB_WAITING_TIME:
client.close()
message = f'Boa job did not complete in time: job_id=[{str(boa_job.id)}]!'
- await sync_to_async(handle_boa_error)(message, BoaErrorCode.JOB_TIME_OUT_ERROR,
- user.username, user.fullname, project_url, file_full_path,
- query_file_name=query_file_name, job_id=boa_job.id)
+ handle_boa_error(
+ message,
+ BoaErrorCode.JOB_TIME_OUT_ERROR,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name,
+ job_id=boa_job.id
+ )
return BoaErrorCode.JOB_TIME_OUT_ERROR
logger.debug(f'Boa job still running, waiting 10s: job_id=[{str(boa_job.id)}] ...')
boa_job.refresh()
- await asyncio.sleep(boa_settings.REFRESH_JOB_INTERVAL)
+ time.sleep(boa_settings.REFRESH_JOB_INTERVAL)
if boa_job.compiler_status is CompilerStatus.ERROR:
client.close()
message = f'Boa job failed with compile error: job_id=[{str(boa_job.id)}]!'
- await sync_to_async(handle_boa_error)(message, BoaErrorCode.QUERY_ERROR, user.username,
- user.fullname, project_url, file_full_path,
- query_file_name=query_file_name, job_id=boa_job.id)
+ handle_boa_error(
+ message,
+ BoaErrorCode.QUERY_ERROR,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name,
+ job_id=boa_job.id
+ )
return BoaErrorCode.QUERY_ERROR
elif boa_job.exec_status is ExecutionStatus.ERROR:
client.close()
message = f'Boa job failed with execution error: job_id=[{str(boa_job.id)}]!'
- await sync_to_async(handle_boa_error)(message, BoaErrorCode.QUERY_ERROR, user.username,
- user.fullname, project_url, file_full_path,
- query_file_name=query_file_name, job_id=boa_job.id)
+ handle_boa_error(
+ message,
+ BoaErrorCode.QUERY_ERROR,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name,
+ job_id=boa_job.id
+ )
return BoaErrorCode.QUERY_ERROR
else:
try:
@@ -152,9 +219,15 @@ async def submit_to_boa_async(host, username, password, user_guid, project_guid,
except BoaException:
client.close()
message = f'Boa job output is not available: job_id=[{str(boa_job.id)}]!'
- await sync_to_async(handle_boa_error)(message, BoaErrorCode.OUTPUT_ERROR, user.username,
- user.fullname, project_url, file_full_path,
- query_file_name=query_file_name, job_id=boa_job.id)
+ handle_boa_error(
+ message,
+ BoaErrorCode.OUTPUT_ERROR,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name,
+ job_id=boa_job.id
+ )
return BoaErrorCode.OUTPUT_ERROR
logger.info('Boa job finished.')
logger.debug(f'Boa job output: job_id=[{str(boa_job.id)}]\n########\n{boa_job_output}\n########')
@@ -177,31 +250,50 @@ async def submit_to_boa_async(host, username, password, user_guid, project_guid,
message += f', http_error=[{e.code}: {e.reason}]'
if e.code == 409:
error_code = BoaErrorCode.UPLOAD_ERROR_CONFLICT
- await sync_to_async(handle_boa_error)(message, error_code, user.username, user.fullname, project_url,
- file_full_path, query_file_name=query_file_name,
- output_file_name=output_file_name, job_id=boa_job.id)
+ handle_boa_error(
+ message,
+ error_code,
+ user,
+ project_url,
+ file_full_path,
+ query_file_name=query_file_name,
+ output_file_name=output_file_name,
+ job_id=boa_job.id
+ )
return error_code
logger.info('Successfully uploaded query output to OSF.')
logger.debug('Task ends <<<<<<<<')
- await sync_to_async(send_mail)(
- to_addr=user.username,
- mail=ADDONS_BOA_JOB_COMPLETE,
- fullname=user.fullname,
- query_file_name=query_file_name,
- query_file_full_path=file_full_path,
- output_file_name=output_file_name,
- job_id=boa_job.id,
- project_url=project_url,
- boa_job_list_url=boa_settings.BOA_JOB_LIST_URL,
- boa_support_email=boa_settings.BOA_SUPPORT_EMAIL,
- osf_support_email=osf_settings.OSF_SUPPORT_EMAIL,
+ NotificationType.objects.get(
+ name=NotificationType.Type.ADDONS_BOA_JOB_COMPLETE
+ ).emit(
+ user=user,
+ event_context={
+ 'fullname': user.fullname,
+ 'query_file_name': query_file_name,
+ 'query_file_full_path': file_full_path,
+ 'output_file_name': output_file_name,
+ 'job_id': boa_job.id,
+ 'project_url': project_url,
+ 'boa_job_list_url': boa_settings.BOA_JOB_LIST_URL,
+ 'boa_support_email': boa_settings.BOA_SUPPORT_EMAIL,
+ 'osf_support_email': osf_settings.OSF_SUPPORT_EMAIL,
+ }
)
return BoaErrorCode.NO_ERROR
-def handle_boa_error(message, code, username, fullname, project_url, query_file_full_path,
- query_file_name=None, file_size=None, output_file_name=None, job_id=None):
+def handle_boa_error(
+ message,
+ code,
+ user,
+ project_url,
+ query_file_full_path,
+ query_file_name=None,
+ file_size=None,
+ output_file_name=None,
+ job_id=None
+):
"""Handle Boa and WB API errors and send emails.
"""
logger.error(message)
@@ -209,22 +301,25 @@ def handle_boa_error(message, code, username, fullname, project_url, query_file_
sentry.log_message(message, skip_session=True)
except Exception:
pass
- send_mail(
- to_addr=username,
- mail=ADDONS_BOA_JOB_FAILURE,
- fullname=fullname,
- code=code,
- message=message,
- query_file_name=query_file_name,
- file_size=file_size,
- max_file_size=boa_settings.MAX_SUBMISSION_SIZE,
- query_file_full_path=query_file_full_path,
- output_file_name=output_file_name,
- job_id=job_id,
- max_job_wait_hours=boa_settings.MAX_JOB_WAITING_TIME / 3600,
- project_url=project_url,
- boa_job_list_url=boa_settings.BOA_JOB_LIST_URL,
- boa_support_email=boa_settings.BOA_SUPPORT_EMAIL,
- osf_support_email=osf_settings.OSF_SUPPORT_EMAIL,
+ NotificationType.objects.get(
+ name=NotificationType.Type.ADDONS_BOA_JOB_FAILURE
+ ).emit(
+ user=user,
+ event_context={
+ 'fullname': user.fullname,
+ 'code': code,
+ 'query_file_name': query_file_name,
+ 'file_size': file_size,
+ 'max_file_size': boa_settings.MAX_SUBMISSION_SIZE,
+ 'query_file_full_path': query_file_full_path,
+ 'output_file_name': output_file_name,
+ 'job_id': job_id,
+ 'max_job_wait_hours': boa_settings.MAX_JOB_WAITING_TIME / 3600,
+ 'project_url': project_url,
+ 'boa_job_list_url': boa_settings.BOA_JOB_LIST_URL,
+ 'boa_support_email': boa_settings.BOA_SUPPORT_EMAIL,
+            'osf_support_email': osf_settings.OSF_SUPPORT_EMAIL,
+        }
)
return code
diff --git a/addons/boa/tests/test_tasks.py b/addons/boa/tests/test_tasks.py
index b2dcd6d86bc..f31185fa789 100644
--- a/addons/boa/tests/test_tasks.py
+++ b/addons/boa/tests/test_tasks.py
@@ -9,10 +9,11 @@
from addons.boa import settings as boa_settings
from addons.boa.boa_error_code import BoaErrorCode
from addons.boa.tasks import submit_to_boa, submit_to_boa_async, handle_boa_error
+from osf.models import NotificationType
from osf_tests.factories import AuthUserFactory, ProjectFactory
from tests.base import OsfTestCase
+from tests.utils import capture_notifications
from website import settings as osf_settings
-from website.mails import ADDONS_BOA_JOB_COMPLETE, ADDONS_BOA_JOB_FAILURE
DEFAULT_REFRESH_JOB_INTERVAL = boa_settings.REFRESH_JOB_INTERVAL
DEFAULT_MAX_JOB_WAITING_TIME = boa_settings.MAX_JOB_WAITING_TIME
@@ -38,12 +39,6 @@ def setUp(self):
self.output_file_name = 'fake_boa_script_results.txt'
self.job_id = '1a2b3c4d5e6f7g8'
- from conftest import start_mock_send_grid
- self.mock_send_grid = start_mock_send_grid(self)
-
- def tearDown(self):
- super().tearDown()
-
def test_boa_error_code(self):
assert BoaErrorCode.NO_ERROR == -1
assert BoaErrorCode.UNKNOWN == 0
@@ -55,24 +50,25 @@ def test_boa_error_code(self):
assert BoaErrorCode.FILE_TOO_LARGE_ERROR == 6
assert BoaErrorCode.JOB_TIME_OUT_ERROR == 7
- @mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
def test_handle_boa_error(self):
with mock.patch('addons.boa.tasks.sentry.log_message', return_value=None) as mock_sentry_log_message, \
mock.patch('addons.boa.tasks.logger.error', return_value=None) as mock_logger_error:
- return_value = handle_boa_error(
- self.error_message,
- BoaErrorCode.UNKNOWN,
- self.user_username,
- self.user_fullname,
- self.project_url,
- self.file_full_path,
- query_file_name=self.query_file_name,
- file_size=self.file_size,
- output_file_name=self.output_file_name,
- job_id=self.job_id
- )
- self.mock_send_grid.assert_called()
+ with capture_notifications() as notifications:
+ return_value = handle_boa_error(
+ self.error_message,
+ BoaErrorCode.UNKNOWN,
+ self.user_username,
+ self.user_fullname,
+ self.project_url,
+ self.file_full_path,
+ query_file_name=self.query_file_name,
+ file_size=self.file_size,
+ output_file_name=self.output_file_name,
+ job_id=self.job_id
+ )
+ assert len(notifications) == 1
+            assert notifications[0]['type'] == NotificationType.Type.ADDONS_BOA_JOB_FAILURE
mock_sentry_log_message.assert_called_with(self.error_message, skip_session=True)
mock_logger_error.assert_called_with(self.error_message)
assert return_value == BoaErrorCode.UNKNOWN
@@ -154,13 +150,6 @@ def setUp(self):
boa_settings.REFRESH_JOB_INTERVAL = DEFAULT_REFRESH_JOB_INTERVAL
boa_settings.MAX_JOB_WAITING_TIME = DEFAULT_MAX_JOB_WAITING_TIME
- from conftest import start_mock_send_grid
- self.mock_send_grid = start_mock_send_grid(self)
-
- def tearDown(self):
- super().tearDown()
-
- @mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
async def test_submit_success(self):
with mock.patch('osf.models.user.OSFUser.objects.get', return_value=self.user), \
@@ -172,25 +161,27 @@ async def test_submit_success(self):
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('asyncio.sleep', new_callable=AsyncMock, return_value=None) as mock_async_sleep, \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
- self.host,
- self.username,
- self.password,
- self.user_guid,
- self.project_guid,
- self.query_dataset,
- self.query_file_name,
- self.file_size,
- self.file_full_path,
- self.query_download_url,
- self.output_upload_url,
- )
+ with capture_notifications() as notifications:
+ return_value = await submit_to_boa_async(
+ self.host,
+ self.username,
+ self.password,
+ self.user_guid,
+ self.project_guid,
+ self.query_dataset,
+ self.query_file_name,
+ self.file_size,
+ self.file_full_path,
+ self.query_download_url,
+ self.output_upload_url,
+ )
+ assert len(notifications) == 1
+            assert notifications[0]['type'] == NotificationType.Type.ADDONS_BOA_JOB_COMPLETE
assert return_value == BoaErrorCode.NO_ERROR
assert self.mock_job.is_running.call_count == 5
assert self.mock_job.refresh.call_count == 4
assert mock_async_sleep.call_count == 4
mock_close.assert_called()
- self.mock_send_grid.assert_called()
mock_handle_boa_error.assert_not_called()
async def test_download_error(self):
diff --git a/api_tests/crossref/views/test_crossref_email_response.py b/api_tests/crossref/views/test_crossref_email_response.py
index 775a0045c06..e2a2b705362 100644
--- a/api_tests/crossref/views/test_crossref_email_response.py
+++ b/api_tests/crossref/views/test_crossref_email_response.py
@@ -5,12 +5,13 @@
from django.utils import timezone
+from osf.models import NotificationType
from osf_tests import factories
+from tests.utils import capture_notifications
from website import settings
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestCrossRefEmailResponse:
def make_mailgun_payload(self, crossref_response):
@@ -155,39 +156,40 @@ def test_wrong_request_context_raises_permission_error(self, app, url, error_xml
assert response.status_code == 400
- def test_error_response_sends_message_does_not_set_doi(self, app, url, preprint, error_xml, mock_send_grid):
+ def test_error_response_sends_message_does_not_set_doi(self, app, url, preprint, error_xml):
assert not preprint.get_identifier_value('doi')
context_data = self.make_mailgun_payload(crossref_response=error_xml)
- app.post(url, context_data)
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ app.post(url, context_data)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
assert not preprint.get_identifier_value('doi')
- def test_success_response_sets_doi(self, app, url, preprint, success_xml, mock_send_grid):
+ def test_success_response_sets_doi(self, app, url, preprint, success_xml):
assert not preprint.get_identifier_value('doi')
context_data = self.make_mailgun_payload(crossref_response=success_xml)
- mock_send_grid.reset_mock()
- app.post(url, context_data)
+ with capture_notifications() as notifications:
+ app.post(url, context_data)
+ assert not notifications
preprint.reload()
- assert not mock_send_grid.called
assert preprint.get_identifier_value('doi')
assert preprint.preprint_doi_created
- def test_update_success_response(self, app, preprint, url, mock_send_grid):
+ def test_update_success_response(self, app, preprint, url):
initial_value = 'TempDOIValue'
preprint.set_identifier_value(category='doi', value=initial_value)
update_xml = self.update_success_xml(preprint)
context_data = self.make_mailgun_payload(crossref_response=update_xml)
- mock_send_grid.reset_mock()
- app.post(url, context_data)
-
- assert not mock_send_grid.called
+ with capture_notifications() as notifications:
+ app.post(url, context_data)
+ assert not notifications
assert preprint.get_identifier_value(category='doi') != initial_value
- def test_update_success_does_not_set_preprint_doi_created(self, app, preprint, url, mock_send_grid):
+ def test_update_success_does_not_set_preprint_doi_created(self, app, preprint, url):
preprint.set_identifier_value(category='doi', value='test')
preprint.preprint_doi_created = timezone.now()
preprint.save()
@@ -212,14 +214,14 @@ def test_success_batch_response(self, app, url):
for preprint in preprint_list:
assert preprint.get_identifier_value('doi') == settings.DOI_FORMAT.format(prefix=provider.doi_prefix, guid=preprint._id)
- def test_confirmation_marks_legacy_doi_as_deleted(self, app, url, preprint, mock_send_grid):
- legacy_value = 'IAmALegacyDOI'
- preprint.set_identifier_value(category='legacy_doi', value=legacy_value)
- update_xml = self.update_success_xml(preprint)
+ def test_confirmation_marks_legacy_doi_as_deleted(self, app, url, preprint):
+ with capture_notifications() as notifications:
+ legacy_value = 'IAmALegacyDOI'
+ preprint.set_identifier_value(category='legacy_doi', value=legacy_value)
+ update_xml = self.update_success_xml(preprint)
- context_data = self.make_mailgun_payload(crossref_response=update_xml)
- mock_send_grid.reset_mock()
- app.post(url, context_data)
+ context_data = self.make_mailgun_payload(crossref_response=update_xml)
+ app.post(url, context_data)
- assert not mock_send_grid.called
+ assert not notifications
assert preprint.identifiers.get(category='legacy_doi').deleted
diff --git a/api_tests/draft_registrations/views/test_draft_registration_contributor_list.py b/api_tests/draft_registrations/views/test_draft_registration_contributor_list.py
index bf4d211a8d7..090993add28 100644
--- a/api_tests/draft_registrations/views/test_draft_registration_contributor_list.py
+++ b/api_tests/draft_registrations/views/test_draft_registration_contributor_list.py
@@ -265,8 +265,7 @@ def test_add_contributor_signal_if_default(
assert res.json['errors'][0]['detail'] == 'default is not a valid email preference.'
# Overrides TestNodeContributorCreateEmail
- def test_add_unregistered_contributor_sends_email(
- self, mock_send_grid, app, user, url_project_contribs):
+ def test_add_unregistered_contributor_sends_email(self, app, user, url_project_contribs):
with capture_notifications() as notifications:
res = app.post_json_api(
f'{url_project_contribs}?send_email=draft_registration',
@@ -305,8 +304,7 @@ def test_add_unregistered_contributor_signal_if_default(self, app, user, url_pro
assert notifications[0]['type'] == NotificationType.Type.USER_CONTRIBUTOR_ADDED_DRAFT_REGISTRATION
# Overrides TestNodeContributorCreateEmail
- def test_add_unregistered_contributor_without_email_no_email(
- self, mock_send_grid, app, user, url_project_contribs):
+ def test_add_unregistered_contributor_without_email_no_email(self, app, user, url_project_contribs):
url = f'{url_project_contribs}?send_email=draft_registration'
payload = {
'data': {
@@ -318,10 +316,11 @@ def test_add_unregistered_contributor_without_email_no_email(
}
with capture_signals() as mock_signal:
- res = app.post_json_api(url, payload, auth=user.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=user.auth)
+ assert not notifications
assert contributor_added in mock_signal.signals_sent()
assert res.status_code == 201
- assert mock_send_grid.call_count == 0
class TestDraftContributorBulkCreate(DraftRegistrationCRUDTestCase, TestNodeContributorBulkCreate):
diff --git a/api_tests/draft_registrations/views/test_draft_registration_list.py b/api_tests/draft_registrations/views/test_draft_registration_list.py
index 85842f1e0a6..b90493825ee 100644
--- a/api_tests/draft_registrations/views/test_draft_registration_list.py
+++ b/api_tests/draft_registrations/views/test_draft_registration_list.py
@@ -158,7 +158,6 @@ def test_draft_with_deleted_registered_node_shows_up_in_draft_list(
assert data[0]['attributes']['registration_metadata'] == {}
-@pytest.mark.usefixtures('mock_send_grid')
class TestDraftRegistrationCreateWithNode(AbstractDraftRegistrationTestCase):
@pytest.fixture()
@@ -337,11 +336,10 @@ def test_logged_in_non_contributor_cannot_create_draft(
)
assert res.status_code == 403
- def test_create_project_based_draft_does_not_email_initiator(self, app, user, url_draft_registrations, payload, mock_send_grid):
- mock_send_grid.reset_mock()
- app.post_json_api(f'{url_draft_registrations}?embed=branched_from&embed=initiator', payload, auth=user.auth)
-
- assert not mock_send_grid.called
+ def test_create_project_based_draft_does_not_email_initiator(self, app, user, url_draft_registrations, payload):
+ with capture_notifications() as notifications:
+ app.post_json_api(f'{url_draft_registrations}?embed=branched_from&embed=initiator', payload, auth=user.auth)
+ assert not notifications
def test_affiliated_institutions_are_copied_from_node_no_institutions(self, app, user, url_draft_registrations, payload):
"""
@@ -403,7 +401,6 @@ def test_affiliated_institutions_are_copied_from_user(self, app, user, url_draft
assert list(draft_registration.affiliated_institutions.all()) == list(user.get_affiliated_institutions())
-@pytest.mark.usefixtures('mock_send_grid')
class TestDraftRegistrationCreateWithoutNode(AbstractDraftRegistrationTestCase):
@pytest.fixture()
def url_draft_registrations(self):
@@ -430,7 +427,7 @@ def test_admin_can_create_draft(
assert draft.creator == user
assert draft.has_permission(user, ADMIN) is True
- def test_create_no_project_draft_emails_initiator(self, app, user, url_draft_registrations, payload, mock_send_grid):
+ def test_create_no_project_draft_emails_initiator(self, app, user, url_draft_registrations, payload):
# Intercepting the send_mail call from website.project.views.contributor.notify_added_contributor
with capture_notifications() as notifications:
app.post_json_api(
diff --git a/api_tests/institutions/views/test_institution_relationship_nodes.py b/api_tests/institutions/views/test_institution_relationship_nodes.py
index c025407ab78..5acf8a39fd5 100644
--- a/api_tests/institutions/views/test_institution_relationship_nodes.py
+++ b/api_tests/institutions/views/test_institution_relationship_nodes.py
@@ -27,7 +27,6 @@ def make_registration_payload(*node_ids):
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestInstitutionRelationshipNodes:
@pytest.fixture()
diff --git a/api_tests/mailhog/test_mailhog.py b/api_tests/mailhog/test_mailhog.py
index b911eea9b5c..997947f9588 100644
--- a/api_tests/mailhog/test_mailhog.py
+++ b/api_tests/mailhog/test_mailhog.py
@@ -1,6 +1,7 @@
import requests
import pytest
-from website.mails import send_mail, TEST
+from django.core.mail import send_mail
+from website.mails import TEST
from waffle.testutils import override_switch
from osf import features
from website import settings
@@ -22,10 +23,9 @@
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestMailHog:
- def test_mailhog_received_mail(self, mock_send_grid):
+ def test_mailhog_received_mail(self):
with override_switch(features.ENABLE_MAILHOG, active=True):
mailhog_v1 = f'{settings.MAILHOG_API_HOST}/api/v1/messages'
mailhog_v2 = f'{settings.MAILHOG_API_HOST}/api/v2/messages'
@@ -36,12 +36,10 @@ def test_mailhog_received_mail(self, mock_send_grid):
assert res['count'] == 1
assert res['items'][0]['Content']['Headers']['To'][0] == 'to_addr@mail.com'
assert res['items'][0]['Content']['Headers']['Subject'][0] == 'A test email to Mailhog'
- mock_send_grid.assert_called()
requests.delete(mailhog_v1)
@pytest.mark.django_db
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.ENABLE_TEST_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class TestAuthMailhog(OsfTestCase):
diff --git a/api_tests/nodes/views/test_node_contributors_list.py b/api_tests/nodes/views/test_node_contributors_list.py
index c4c7d63c7f5..4d29857676d 100644
--- a/api_tests/nodes/views/test_node_contributors_list.py
+++ b/api_tests/nodes/views/test_node_contributors_list.py
@@ -1209,7 +1209,6 @@ def test_add_contributor_validation(
@pytest.mark.django_db
@pytest.mark.enable_bookmark_creation
@pytest.mark.enable_enqueue_task
-@pytest.mark.usefixtures('mock_send_grid')
class TestNodeContributorCreateEmail(NodeCRUDTestCase):
@pytest.fixture()
@@ -1217,7 +1216,7 @@ def url_project_contribs(self, project_public):
return f'/{API_BASE}nodes/{project_public._id}/contributors/'
def test_add_contributor_no_email_if_false(
- self, mock_send_grid, app, user, url_project_contribs
+ self, app, user, url_project_contribs
):
url = f'{url_project_contribs}?send_email=false'
payload = {
@@ -1226,12 +1225,13 @@ def test_add_contributor_no_email_if_false(
'attributes': {'full_name': 'Kanye West', 'email': 'kanye@west.com'},
}
}
- res = app.post_json_api(url, payload, auth=user.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=user.auth)
+ assert not notifications
assert res.status_code == 201
- assert mock_send_grid.call_count == 0
def test_add_contributor_sends_email(
- self, mock_send_grid, app, user, user_two, url_project_contribs
+ self, app, user, user_two, url_project_contribs
):
with capture_notifications() as notifications:
res = app.post_json_api(
@@ -1290,7 +1290,7 @@ def test_add_contributor_signal_preprint_email_disallowed(
)
def test_add_unregistered_contributor_sends_email(
- self, mock_send_grid, app, user, url_project_contribs
+ self, app, user, url_project_contribs
):
with capture_notifications() as notifications:
res = app.post_json_api(
@@ -1347,7 +1347,7 @@ def test_add_unregistered_contributor_signal_preprint_email_disallowed(
)
def test_add_contributor_invalid_send_email_param(
- self, mock_send_grid, app, user, url_project_contribs
+ self, app, user, url_project_contribs
):
url = f'{url_project_contribs}?send_email=true'
payload = {
@@ -1356,16 +1356,15 @@ def test_add_contributor_invalid_send_email_param(
'attributes': {'full_name': 'Kanye West', 'email': 'kanye@west.com'},
}
}
- res = app.post_json_api(url, payload, auth=user.auth, expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=user.auth, expect_errors=True)
+ assert not notifications
assert res.status_code == 400
assert (
res.json['errors'][0]['detail'] == 'true is not a valid email preference.'
)
- assert mock_send_grid.call_count == 0
- def test_add_unregistered_contributor_without_email_no_email(
- self, mock_send_grid, app, user, url_project_contribs
- ):
+ def test_add_unregistered_contributor_without_email_no_email(self, app, user, url_project_contribs):
url = f'{url_project_contribs}?send_email=default'
payload = {
'data': {
@@ -1377,10 +1376,11 @@ def test_add_unregistered_contributor_without_email_no_email(
}
with capture_signals() as mock_signal:
- res = app.post_json_api(url, payload, auth=user.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=user.auth)
+ assert not notifications
assert contributor_added in mock_signal.signals_sent()
assert res.status_code == 201
- assert mock_send_grid.call_count == 0
@pytest.mark.django_db
diff --git a/api_tests/nodes/views/test_node_forks_list.py b/api_tests/nodes/views/test_node_forks_list.py
index 632c178bb2e..a9031b105e8 100644
--- a/api_tests/nodes/views/test_node_forks_list.py
+++ b/api_tests/nodes/views/test_node_forks_list.py
@@ -205,7 +205,6 @@ def test_forks_list_does_not_show_registrations_of_forks(
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestNodeForkCreate:
@pytest.fixture()
@@ -419,9 +418,7 @@ def test_read_only_contributor_can_fork_private_registration(
assert res.json['data']['attributes']['title'] == 'Fork of ' + \
private_project.title
- def test_send_email_success(
- self, app, user, public_project_url,
- fork_data_with_title, public_project, mock_send_grid):
+ def test_send_email_success(self, app, user, public_project_url, fork_data_with_title, public_project):
with capture_notifications() as notifications:
res = app.post_json_api(
@@ -437,13 +434,15 @@ def test_send_email_success(
assert notifications[0]['type'] == NotificationType.Type.NODE_FORK_COMPLETED
def test_send_email_failed(
- self, app, user, public_project_url,
- fork_data_with_title, public_project, mock_send_grid):
+ self, app, user, public_project_url, fork_data_with_title, public_project):
with mock.patch.object(NodeForksSerializer, 'save', side_effect=Exception()):
- with pytest.raises(Exception):
- app.post_json_api(
- public_project_url,
- fork_data_with_title,
- auth=user.auth)
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ with pytest.raises(Exception):
+ app.post_json_api(
+ public_project_url,
+ fork_data_with_title,
+ auth=user.auth
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_FORK_FAILED
diff --git a/api_tests/nodes/views/test_node_relationship_institutions.py b/api_tests/nodes/views/test_node_relationship_institutions.py
index c19c4e79d4b..179f357b987 100644
--- a/api_tests/nodes/views/test_node_relationship_institutions.py
+++ b/api_tests/nodes/views/test_node_relationship_institutions.py
@@ -115,7 +115,6 @@ def create_payload(self, institutions):
]
}
-@pytest.mark.usefixtures('mock_send_grid')
class TestNodeRelationshipInstitutions(RelationshipInstitutionsTestMixin):
def test_node_with_no_permissions(self, app, unauthorized_user_with_affiliation, institution_one, node_institutions_url):
@@ -254,18 +253,18 @@ def test_remove_institutions_with_affiliated_user(
assert res.status_code == 200
assert node.affiliated_institutions.count() == 0
- def test_using_post_making_no_changes_returns_201(self, app, user, institution_one, node, node_institutions_url, mock_send_grid):
+ def test_using_post_making_no_changes_returns_201(self, app, user, institution_one, node, node_institutions_url):
node.affiliated_institutions.add(institution_one)
node.save()
assert institution_one in node.affiliated_institutions.all()
- mock_send_grid.reset_mock()
- res = app.post_json_api(
- node_institutions_url,
- self.create_payload([institution_one]),
- auth=user.auth
- )
- mock_send_grid.assert_not_called()
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ node_institutions_url,
+ self.create_payload([institution_one]),
+ auth=user.auth
+ )
+ assert not notifications
assert res.status_code == 201
assert institution_one in node.affiliated_institutions.all()
diff --git a/api_tests/preprints/views/test_preprint_contributors_list.py b/api_tests/preprints/views/test_preprint_contributors_list.py
index ce96d8d308c..a719589563c 100644
--- a/api_tests/preprints/views/test_preprint_contributors_list.py
+++ b/api_tests/preprints/views/test_preprint_contributors_list.py
@@ -78,7 +78,6 @@ def contrib_id(preprint_id, user_id):
return f'{preprint_id}-{user_id}'
return contrib_id
-
@pytest.mark.django_db
@pytest.mark.enable_implicit_clean
class TestPreprintContributorList(NodeCRUDTestCase):
@@ -1352,7 +1351,6 @@ def test_add_contributor_validation(self, preprint_published, validate_data):
@pytest.mark.django_db
@pytest.mark.enable_enqueue_task
-@pytest.mark.usefixtures('mock_send_grid')
class TestPreprintContributorCreateEmail(NodeCRUDTestCase):
@pytest.fixture()
@@ -1360,7 +1358,7 @@ def url_preprint_contribs(self, preprint_published):
return f'/{API_BASE}preprints/{preprint_published._id}/contributors/'
def test_add_contributor_no_email_if_false(
- self, mock_send_grid, app, user, url_preprint_contribs):
+ self, app, user, url_preprint_contribs):
url = f'{url_preprint_contribs}?send_email=false'
payload = {
'data': {
@@ -1371,14 +1369,12 @@ def test_add_contributor_no_email_if_false(
}
}
}
- mock_send_grid.reset_mock()
- res = app.post_json_api(url, payload, auth=user.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=user.auth)
+ assert not notifications
assert res.status_code == 201
- assert mock_send_grid.call_count == 0
- def test_add_contributor_needs_preprint_filter_to_send_email(
- self, mock_send_grid, app, user, user_two,
- url_preprint_contribs):
+ def test_add_contributor_needs_preprint_filter_to_send_email(self, app, user, user_two, url_preprint_contribs):
url = f'{url_preprint_contribs}?send_email=default'
payload = {
'data': {
@@ -1395,12 +1391,11 @@ def test_add_contributor_needs_preprint_filter_to_send_email(
}
}
}
-
- mock_send_grid.reset_mock()
- res = app.post_json_api(url, payload, auth=user.auth, expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=user.auth, expect_errors=True)
+ assert not notifications
assert res.status_code == 400
assert res.json['errors'][0]['detail'] == 'default is not a valid email preference.'
- assert mock_send_grid.call_count == 0
def test_add_contributor_signal_if_preprint(
self, app, user, user_two, url_preprint_contribs):
@@ -1467,8 +1462,7 @@ def test_add_unregistered_contributor_signal_if_preprint(self, app, user, url_pr
assert len(notifications) == 1
assert notifications[0]['type'] == NotificationType.Type.USER_CONTRIBUTOR_ADDED_OSF_PREPRINT
- def test_add_contributor_invalid_send_email_param(
- self, mock_send_grid, app, user, url_preprint_contribs):
+ def test_add_contributor_invalid_send_email_param(self, app, user, url_preprint_contribs):
url = f'{url_preprint_contribs}?send_email=true'
payload = {
'data': {
@@ -1479,16 +1473,19 @@ def test_add_contributor_invalid_send_email_param(
}
}
}
- mock_send_grid.reset_mock()
- res = app.post_json_api(
- url, payload, auth=user.auth,
- expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url,
+ payload,
+ auth=user.auth,
+ expect_errors=True
+ )
+ assert not notifications
assert res.status_code == 400
assert res.json['errors'][0]['detail'] == 'true is not a valid email preference.'
- assert mock_send_grid.call_count == 0
def test_add_unregistered_contributor_without_email_no_email(
- self, mock_send_grid, app, user, url_preprint_contribs):
+ self, app, user, url_preprint_contribs):
url = f'{url_preprint_contribs}?send_email=preprint'
payload = {
'data': {
@@ -1499,16 +1496,16 @@ def test_add_unregistered_contributor_without_email_no_email(
}
}
- mock_send_grid.reset_mock()
with capture_signals() as mock_signal:
- res = app.post_json_api(url, payload, auth=user.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=user.auth)
+ assert not notifications
assert contributor_added in mock_signal.signals_sent()
assert res.status_code == 201
- assert mock_send_grid.call_count == 0
@mock.patch('osf.models.preprint.update_or_enqueue_on_preprint_updated')
def test_publishing_preprint_sends_emails_to_contributors(
- self, mock_update, mock_send_grid, app, user, url_preprint_contribs, preprint_unpublished):
+ self, mock_update, app, user, url_preprint_contribs, preprint_unpublished):
url = f'/{API_BASE}preprints/{preprint_unpublished._id}/'
user_two = AuthUserFactory()
preprint_unpublished.add_contributor(user_two, permissions=permissions.WRITE, save=True)
@@ -1547,7 +1544,7 @@ def test_contributor_added_signal_not_specified(self, app, user, url_preprint_co
assert notifications[0]['type'] == NotificationType.Type.USER_CONTRIBUTOR_ADDED_OSF_PREPRINT
def test_contributor_added_not_sent_if_unpublished(
- self, mock_send_grid, app, user, preprint_unpublished):
+ self, app, user, preprint_unpublished):
url = f'/{API_BASE}preprints/{preprint_unpublished._id}/contributors/?send_email=preprint'
payload = {
'data': {
@@ -1558,10 +1555,10 @@ def test_contributor_added_not_sent_if_unpublished(
}
}
}
- mock_send_grid.reset_mock()
- res = app.post_json_api(url, payload, auth=user.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=user.auth)
+ assert not notifications
assert res.status_code == 201
- assert mock_send_grid.call_count == 0
@pytest.mark.django_db
diff --git a/api_tests/providers/collections/views/test_collections_provider_moderator_list.py b/api_tests/providers/collections/views/test_collections_provider_moderator_list.py
index 5a7275158f2..bf1efa42e2b 100644
--- a/api_tests/providers/collections/views/test_collections_provider_moderator_list.py
+++ b/api_tests/providers/collections/views/test_collections_provider_moderator_list.py
@@ -91,27 +91,27 @@ def test_GET_admin_with_filter(self, app, url, nonmoderator, moderator, admin, p
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestPOSTCollectionsModeratorList:
- def test_POST_unauthorized(self, mock_send_grid, app, url, nonmoderator, moderator, provider):
+ def test_POST_unauthorized(self, app, url, nonmoderator, moderator, provider):
payload = make_payload(user_id=nonmoderator._id, permission_group='moderator')
- res = app.post(url, payload, expect_errors=True)
+ with capture_notifications() as notification:
+ res = app.post(url, payload, expect_errors=True)
+ assert not notification
assert res.status_code == 401
- assert mock_send_grid.call_count == 0
- def test_POST_forbidden(self, mock_send_grid, app, url, nonmoderator, moderator, provider):
+ def test_POST_forbidden(self, app, url, nonmoderator, moderator, provider):
payload = make_payload(user_id=nonmoderator._id, permission_group='moderator')
- res = app.post(url, payload, auth=nonmoderator.auth, expect_errors=True)
- assert res.status_code == 403
-
- res = app.post(url, payload, auth=moderator.auth, expect_errors=True)
- assert res.status_code == 403
+ with capture_notifications() as notification:
+ res = app.post(url, payload, auth=nonmoderator.auth, expect_errors=True)
+ assert res.status_code == 403
- assert mock_send_grid.call_count == 0
+ res = app.post(url, payload, auth=moderator.auth, expect_errors=True)
+ assert res.status_code == 403
+ assert not notification
- def test_POST_admin_success_existing_user(self, mock_send_grid, app, url, nonmoderator, moderator, admin, provider):
+ def test_POST_admin_success_existing_user(self, app, url, nonmoderator, moderator, admin, provider):
payload = make_payload(user_id=nonmoderator._id, permission_group='moderator')
with capture_notifications() as notifications:
@@ -122,11 +122,13 @@ def test_POST_admin_success_existing_user(self, mock_send_grid, app, url, nonmod
assert res.json['data']['id'] == nonmoderator._id
assert res.json['data']['attributes']['permission_group'] == 'moderator'
- def test_POST_admin_failure_existing_moderator(self, mock_send_grid, app, url, moderator, admin, provider):
+ def test_POST_admin_failure_existing_moderator(self, app, url, moderator, admin, provider):
payload = make_payload(user_id=moderator._id, permission_group='moderator')
- res = app.post_json_api(url, payload, auth=admin.auth, expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth, expect_errors=True)
+ assert not notifications
+
assert res.status_code == 400
- assert mock_send_grid.call_count == 0
def test_POST_admin_failure_unreg_moderator(self, app, url, moderator, nonmoderator, admin, provider):
unreg_user = {'full_name': 'Jalen Hurts', 'email': '1eagles@allbatman.org'}
@@ -147,13 +149,14 @@ def test_POST_admin_failure_unreg_moderator(self, app, url, moderator, nonmodera
assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONFIRM_EMAIL_MODERATION
assert notifications[0]['kwargs']['user'].username == unreg_user['email']
- def test_POST_admin_failure_invalid_group(self, mock_send_grid, app, url, nonmoderator, moderator, admin, provider):
+ def test_POST_admin_failure_invalid_group(self, app, url, nonmoderator, moderator, admin, provider):
payload = make_payload(user_id=nonmoderator._id, permission_group='citizen')
- res = app.post_json_api(url, payload, auth=admin.auth, expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=admin.auth, expect_errors=True)
+ assert not notifications
assert res.status_code == 400
- assert mock_send_grid.call_count == 0
- def test_POST_admin_success_email(self, mock_send_grid, app, url, nonmoderator, moderator, admin, provider):
+ def test_POST_admin_success_email(self, app, url, nonmoderator, moderator, admin, provider):
payload = make_payload(email='somenewuser@gmail.com', full_name='Some User', permission_group='moderator')
with capture_notifications() as notifications:
res = app.post_json_api(url, payload, auth=admin.auth)
diff --git a/api_tests/providers/preprints/views/test_preprint_provider_moderator_list.py b/api_tests/providers/preprints/views/test_preprint_provider_moderator_list.py
index ac075faddeb..50713497203 100644
--- a/api_tests/providers/preprints/views/test_preprint_provider_moderator_list.py
+++ b/api_tests/providers/preprints/views/test_preprint_provider_moderator_list.py
@@ -10,7 +10,6 @@
from tests.utils import capture_notifications
-@pytest.mark.usefixtures('mock_send_grid')
class ProviderModeratorListTestClass:
@pytest.fixture()
@@ -70,18 +69,18 @@ def test_list_get_admin_with_filter(self, app, url, nonmoderator, moderator, adm
assert res.json['data'][0]['id'] == admin._id
assert res.json['data'][0]['attributes']['permission_group'] == permissions.ADMIN
- def test_list_post_unauthorized(self, mock_send_grid, app, url, nonmoderator, moderator, provider):
+ def test_list_post_unauthorized(self, app, url, nonmoderator, moderator, provider):
payload = self.create_payload(user_id=nonmoderator._id, permission_group='moderator')
- res = app.post(url, payload, expect_errors=True)
- assert res.status_code == 401
-
- res = app.post(url, payload, auth=nonmoderator.auth, expect_errors=True)
- assert res.status_code == 403
+ with capture_notifications() as notification:
+ res = app.post(url, payload, expect_errors=True)
+ assert res.status_code == 401
- res = app.post(url, payload, auth=moderator.auth, expect_errors=True)
- assert res.status_code == 403
+ res = app.post(url, payload, auth=nonmoderator.auth, expect_errors=True)
+ assert res.status_code == 403
- assert mock_send_grid.call_count == 0
+ res = app.post(url, payload, auth=moderator.auth, expect_errors=True)
+ assert res.status_code == 403
+ assert not notification
def test_list_post_admin_success_existing_user(self, app, url, nonmoderator, moderator, admin):
payload = self.create_payload(user_id=nonmoderator._id, permission_group='moderator')
@@ -94,7 +93,7 @@ def test_list_post_admin_success_existing_user(self, app, url, nonmoderator, mod
assert len(notifications) == 1
assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
- def test_list_post_admin_failure_existing_moderator(self, mock_send_grid, app, url, moderator, admin):
+ def test_list_post_admin_failure_existing_moderator(self, app, url, moderator, admin):
payload = self.create_payload(user_id=moderator._id, permission_group='moderator')
with capture_notifications() as notifications:
res = app.post_json_api(url, payload, auth=admin.auth, expect_errors=True)
diff --git a/api_tests/providers/tasks/test_bulk_upload.py b/api_tests/providers/tasks/test_bulk_upload.py
index 8caf27d89bf..a2863436bbd 100644
--- a/api_tests/providers/tasks/test_bulk_upload.py
+++ b/api_tests/providers/tasks/test_bulk_upload.py
@@ -65,7 +65,6 @@ def test_error_message_default(self):
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestBulkUploadTasks:
@pytest.fixture()
diff --git a/api_tests/registrations/views/test_registration_detail.py b/api_tests/registrations/views/test_registration_detail.py
index 68222090042..1be2d14c3be 100644
--- a/api_tests/registrations/views/test_registration_detail.py
+++ b/api_tests/registrations/views/test_registration_detail.py
@@ -693,7 +693,6 @@ def test_read_write_contributor_can_edit_writeable_fields(
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_notification_send')
class TestRegistrationWithdrawal(TestRegistrationUpdateTestCase):
@pytest.fixture
@@ -752,14 +751,16 @@ def test_initiate_withdraw_registration_fails(
res = app.put_json_api(public_url, public_payload, auth=user.auth, expect_errors=True)
assert res.status_code == 400
- def test_initiate_withdrawal_success(self, mock_notification_send, app, user, public_registration, public_url, public_payload):
- res = app.put_json_api(public_url, public_payload, auth=user.auth)
+ def test_initiate_withdrawal_success(self, app, user, public_registration, public_url, public_payload):
+ with capture_notifications() as notifications:
+ res = app.put_json_api(public_url, public_payload, auth=user.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_REVIEWS
assert res.status_code == 200
assert res.json['data']['attributes']['pending_withdrawal'] is True
public_registration.refresh_from_db()
assert public_registration.is_pending_retraction
assert public_registration.registered_from.logs.first().action == 'retraction_initiated'
- assert mock_notification_send.called
@pytest.mark.usefixtures('mock_gravy_valet_get_verified_links')
def test_initiate_withdrawal_with_embargo_ends_embargo(
diff --git a/api_tests/requests/views/test_node_request_institutional_access.py b/api_tests/requests/views/test_node_request_institutional_access.py
index 35e18042117..d41b7639f05 100644
--- a/api_tests/requests/views/test_node_request_institutional_access.py
+++ b/api_tests/requests/views/test_node_request_institutional_access.py
@@ -2,14 +2,15 @@
from api.base.settings.defaults import API_BASE
from api_tests.requests.mixins import NodeRequestTestMixin
+from osf.models import NotificationType
from osf_tests.factories import NodeFactory, InstitutionFactory, AuthUserFactory
from osf.utils.workflows import DefaultStates, NodeRequestTypes
from framework.auth import Auth
+from tests.utils import capture_notifications
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_notification_send')
class TestNodeRequestListInstitutionalAccess(NodeRequestTestMixin):
@pytest.fixture()
@@ -206,37 +207,34 @@ def test_institutional_admin_unauth_institution(self, app, project, institution_
assert res.status_code == 403
assert 'Institutional request access is not enabled.' in res.json['errors'][0]['detail']
- def test_email_not_sent_without_recipient(self, mock_notification_send, app, project, institutional_admin, url,
+ def test_email_not_sent_without_recipient(self, app, project, institutional_admin, url,
create_payload, institution):
"""
         Test that no email is sent when an institutional access request is made without a recipient listed,
         but that the request itself is still created.
"""
del create_payload['data']['relationships']['message_recipient']
- mock_notification_send.reset_mock()
- res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ # Check that an email is not sent
+ assert not notifications
assert res.status_code == 201
- # Check that an email is sent
- assert not mock_notification_send.called
-
- def test_email_not_sent_outside_institution(self, mock_notification_send, app, project, institutional_admin, url,
+ def test_email_not_sent_outside_institution(self, app, project, institutional_admin, url,
create_payload, user_without_affiliation, institution):
"""
         Test that you are prevented from requesting a user who lacks the correct institutional affiliation.
"""
create_payload['data']['relationships']['message_recipient']['data']['id'] = user_without_affiliation._id
- mock_notification_send.reset_mock()
- res = app.post_json_api(url, create_payload, auth=institutional_admin.auth, expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, create_payload, auth=institutional_admin.auth, expect_errors=True)
+ # Check that an email is not sent
+ assert not notifications
assert res.status_code == 403
assert f'User {user_without_affiliation._id} is not affiliated with the institution.' in res.json['errors'][0]['detail']
- # Check that an email is sent
- assert not mock_notification_send.called
-
def test_email_sent_on_creation(
self,
- mock_notification_send,
app,
project,
institutional_admin,
@@ -248,15 +246,14 @@ def test_email_sent_on_creation(
"""
Test that an email is sent to the appropriate recipients when an institutional access request is made.
"""
- mock_notification_send.reset_mock()
- res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
assert res.status_code == 201
- assert mock_notification_send.call_count == 1
-
def test_bcc_institutional_admin(
self,
- mock_notification_send,
app,
project,
institutional_admin,
@@ -269,15 +266,14 @@ def test_bcc_institutional_admin(
         Ensure the BCC option works as expected, sending the sender a copy of the message.
"""
create_payload['data']['attributes']['bcc_sender'] = True
- mock_notification_send.reset_mock()
- res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
assert res.status_code == 201
- assert mock_notification_send.call_count == 1
-
def test_reply_to_institutional_admin(
self,
- mock_notification_send,
app,
project,
institutional_admin,
@@ -290,12 +286,12 @@ def test_reply_to_institutional_admin(
         Ensure the reply-to option works as expected, allowing a Reply-To header to be added to the email.
"""
create_payload['data']['attributes']['reply_to'] = True
- mock_notification_send.reset_mock()
- res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
assert res.status_code == 201
- assert mock_notification_send.call_count == 1
-
def test_access_requests_disabled_raises_permission_denied(
self, app, node_with_disabled_access_requests, user_with_affiliation, institutional_admin, create_payload
):
@@ -313,7 +309,6 @@ def test_access_requests_disabled_raises_permission_denied(
def test_placeholder_text_when_comment_is_empty(
self,
- mock_notification_send,
app,
project,
institutional_admin,
@@ -327,12 +322,12 @@ def test_placeholder_text_when_comment_is_empty(
"""
# Test with empty comment
create_payload['data']['attributes']['comment'] = ''
- mock_notification_send.reset_mock()
- res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, create_payload, auth=institutional_admin.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
assert res.status_code == 201
- mock_notification_send.assert_called()
-
def test_requester_can_resubmit(self, app, project, institutional_admin, url, create_payload):
"""
Test that a requester can submit another access request for the same node.
diff --git a/api_tests/requests/views/test_node_request_list.py b/api_tests/requests/views/test_node_request_list.py
index 4e16d5ce1c2..1356727d2f7 100644
--- a/api_tests/requests/views/test_node_request_list.py
+++ b/api_tests/requests/views/test_node_request_list.py
@@ -10,7 +10,6 @@
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestNodeRequestListCreate(NodeRequestTestMixin):
@pytest.fixture()
def url(self, project):
diff --git a/api_tests/requests/views/test_preprint_request_list.py b/api_tests/requests/views/test_preprint_request_list.py
index 72e16862f7a..2a859e33ef8 100644
--- a/api_tests/requests/views/test_preprint_request_list.py
+++ b/api_tests/requests/views/test_preprint_request_list.py
@@ -5,7 +5,6 @@
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestPreprintRequestListCreate(PreprintRequestTestMixin):
def url(self, preprint):
return f'/{API_BASE}preprints/{preprint._id}/requests/'
@@ -63,9 +62,3 @@ def test_requester_cannot_submit_again(self, app, admin, create_payload, pre_mod
res = app.post_json_api(self.url(pre_mod_preprint), create_payload, auth=admin.auth, expect_errors=True)
assert res.status_code == 409
assert res.json['errors'][0]['detail'] == 'Users may not have more than one withdrawal request per preprint.'
-
- @pytest.mark.skip('TODO: IN-284 -- add emails')
- def test_email_sent_to_moderators_on_submit(self, mock_send_grid, app, admin, create_payload, moderator, post_mod_preprint):
- res = app.post_json_api(self.url(post_mod_preprint), create_payload, auth=admin.auth)
- assert res.status_code == 201
- assert mock_send_grid.call_count == 1
diff --git a/api_tests/requests/views/test_request_actions_create.py b/api_tests/requests/views/test_request_actions_create.py
index 7396e1ec739..ff277ac0233 100644
--- a/api_tests/requests/views/test_request_actions_create.py
+++ b/api_tests/requests/views/test_request_actions_create.py
@@ -10,7 +10,6 @@
@pytest.mark.django_db
@pytest.mark.enable_enqueue_task
-@pytest.mark.usefixtures('mock_send_grid')
class TestCreateNodeRequestAction(NodeRequestTestMixin):
@pytest.fixture()
def url(self, node_request):
@@ -220,17 +219,17 @@ def test_email_sent_on_reject(self, app, admin, url, node_request):
assert initial_state != node_request.machine_state
assert node_request.creator not in node_request.target.contributors
- def test_email_not_sent_on_reject(self, mock_send_grid, app, requester, url, node_request):
- mock_send_grid.reset_mock()
+ def test_email_not_sent_on_reject(self, app, requester, url, node_request):
initial_state = node_request.machine_state
initial_comment = node_request.comment
payload = self.create_payload(node_request._id, trigger='edit_comment', comment='ASDFG')
- res = app.post_json_api(url, payload, auth=requester.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url, payload, auth=requester.auth)
+ assert not notifications
assert res.status_code == 201
node_request.reload()
assert initial_state == node_request.machine_state
assert initial_comment != node_request.comment
- assert mock_send_grid.call_count == 0
def test_set_permissions_on_approve(self, app, admin, url, node_request):
assert node_request.creator not in node_request.target.contributors
@@ -261,7 +260,6 @@ def test_accept_request_defaults_to_read_and_visible(self, app, admin, url, node
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestCreatePreprintRequestAction(PreprintRequestTestMixin):
@pytest.fixture()
def url(self, pre_request, post_request, none_request):
@@ -407,31 +405,5 @@ def test_email_sent_on_approve(self, app, moderator, url, pre_request, post_requ
assert initial_state != request.machine_state
assert request.target.is_retracted
- @pytest.mark.skip('TODO: IN-331 -- add emails')
- def test_email_sent_on_reject(self, mock_send_grid, app, moderator, url, pre_request, post_request):
- for request in [pre_request, post_request]:
- initial_state = request.machine_state
- assert not request.target.is_retracted
- payload = self.create_payload(request._id, trigger='reject')
- res = app.post_json_api(url, payload, auth=moderator.auth)
- assert res.status_code == 201
- request.reload()
- assert initial_state != request.machine_state
- assert not request.target.is_retracted
- assert mock_send_grid.call_count == 2
-
- @pytest.mark.skip('TODO: IN-284/331 -- add emails')
- def test_email_not_sent_on_edit_comment(self, mock_send_grid, app, moderator, url, pre_request, post_request):
- for request in [pre_request, post_request]:
- initial_state = request.machine_state
- assert not request.target.is_retracted
- payload = self.create_payload(request._id, trigger='edit_comment', comment='ASDFG')
- res = app.post_json_api(url, payload, auth=moderator.auth)
- assert res.status_code == 201
- request.reload()
- assert initial_state != request.machine_state
- assert not request.target.is_retracted
- assert mock_send_grid.call_count == 0
-
def test_auto_approve(self, app, auto_withdrawable_pre_mod_preprint, auto_approved_pre_request):
assert auto_withdrawable_pre_mod_preprint.is_retracted
diff --git a/api_tests/users/views/test_user_confirm.py b/api_tests/users/views/test_user_confirm.py
index d304fc456b5..bb2acee47c9 100644
--- a/api_tests/users/views/test_user_confirm.py
+++ b/api_tests/users/views/test_user_confirm.py
@@ -1,12 +1,12 @@
import pytest
-from unittest import mock
from api.base.settings.defaults import API_BASE
+from osf.models import NotificationType
from osf_tests.factories import AuthUserFactory
+from tests.utils import capture_notifications
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_notification_send')
class TestConfirmEmail:
@pytest.fixture()
@@ -114,26 +114,26 @@ def test_post_provider_mismatch(self, app, confirm_url, user_with_email_verifica
assert res.status_code == 400
assert 'provider mismatch' in res.json['errors'][0]['detail'].lower()
- @mock.patch('website.mails.send_mail')
- def test_post_success_create(self, mock_send_mail, app, confirm_url, user_with_email_verification):
+ def test_post_success_create(self, app, confirm_url, user_with_email_verification):
user, token, email = user_with_email_verification
user.is_registered = False
user.save()
- res = app.post_json_api(
- confirm_url,
- {
- 'data': {
- 'attributes': {
- 'uid': user._id,
- 'token': token,
- 'destination': 'doesnotmatter',
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ confirm_url,
+ {
+ 'data': {
+ 'attributes': {
+ 'uid': user._id,
+ 'token': token,
+ 'destination': 'doesnotmatter',
+ }
}
- }
- },
- expect_errors=True
- )
+ },
+ expect_errors=True
+ )
assert res.status_code == 201
- assert not mock_send_mail.called
+ assert not notifications
assert res.json == {
'redirect_url': f'http://localhost:80/v2/users/{user._id}/confirm/&new=true',
'meta': {
@@ -148,62 +148,61 @@ def test_post_success_create(self, mock_send_mail, app, confirm_url, user_with_e
assert user.external_identity == {'ORCID': {'0002-0001-0001-0001': 'VERIFIED'}}
assert user.emails.filter(address=email.lower()).exists()
- def test_post_success_link(self, mock_notification_send, app, confirm_url, user_with_email_verification):
+ def test_post_success_link(self, app, confirm_url, user_with_email_verification):
user, token, email = user_with_email_verification
user.external_identity['ORCID']['0000-0000-0000-0000'] = 'LINK'
user.save()
- res = app.post_json_api(
- confirm_url,
- {
- 'data': {
- 'attributes': {
- 'uid': user._id,
- 'token': token,
- 'destination': 'doesnotmatter'
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ confirm_url,
+ {
+ 'data': {
+ 'attributes': {
+ 'uid': user._id,
+ 'token': token,
+ 'destination': 'doesnotmatter'
+ }
}
- }
- },
- expect_errors=True
- )
- assert res.status_code == 201
+ },
+ expect_errors=True
+ )
+ assert res.status_code == 201
- assert mock_notification_send.called
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
user.reload()
assert user.external_identity['ORCID']['0000-0000-0000-0000'] == 'VERIFIED'
- @mock.patch('website.mails.send_mail')
def test_post_success_link_with_email_verification_none(
- self, mock_send_mail, app, confirm_url, user_with_none_identity
+ self, app, confirm_url, user_with_none_identity
):
user, token, email = user_with_none_identity
user.save()
- res = app.post_json_api(
- confirm_url,
- {
- 'data': {
- 'attributes': {
- 'uid': user._id,
- 'token': token,
- 'destination': 'doesnotmatter'
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ confirm_url,
+ {
+ 'data': {
+ 'attributes': {
+ 'uid': user._id,
+ 'token': token,
+ 'destination': 'doesnotmatter'
+ }
}
- }
- },
- expect_errors=True
- )
+ },
+ expect_errors=True
+ )
+ assert not notifications # no orcid sso message
assert res.status_code == 201
- assert not mock_send_mail.called # no orcid sso message
-
user.reload()
assert not user.external_identity
- @mock.patch('website.mails.send_mail')
def test_post_success_link_with_email_already_exists(
self,
- mock_send_mail,
app,
confirm_url,
user_with_email_verification
diff --git a/api_tests/users/views/test_user_list.py b/api_tests/users/views/test_user_list.py
index 32cc69758d4..28a913df2a7 100644
--- a/api_tests/users/views/test_user_list.py
+++ b/api_tests/users/views/test_user_list.py
@@ -10,7 +10,7 @@
from api.base.settings.defaults import API_BASE
from framework.auth.cas import CasResponse
-from osf.models import OSFUser, ApiOAuth2PersonalToken
+from osf.models import OSFUser, ApiOAuth2PersonalToken, NotificationType
from osf_tests.factories import (
AuthUserFactory,
UserFactory,
@@ -19,6 +19,7 @@
Auth,
)
from osf.utils.permissions import CREATOR_PERMISSIONS
+from tests.utils import capture_notifications
from website import settings
@@ -246,7 +247,6 @@ def test_users_list_filter_multiple_fields_with_bad_filter(
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestUsersCreate:
@pytest.fixture()
@@ -279,35 +279,37 @@ def tearDown(self, app):
OSFUser.remove()
def test_logged_in_user_with_basic_auth_cannot_create_other_user_or_send_mail(
- self, mock_send_grid, app, user, email_unconfirmed, data, url_base):
+ self, app, user, email_unconfirmed, data, url_base):
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- res = app.post_json_api(
- f'{url_base}?send_email=true',
- data,
- auth=user.auth,
- expect_errors=True
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ f'{url_base}?send_email=true',
+ data,
+ auth=user.auth,
+ expect_errors=True
+ )
+ assert not notifications
assert res.status_code == 403
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- assert mock_send_grid.call_count == 0
def test_logged_out_user_cannot_create_other_user_or_send_mail(
- self, mock_send_grid, app, email_unconfirmed, data, url_base):
+ self, app, email_unconfirmed, data, url_base):
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- res = app.post_json_api(
- f'{url_base}?send_email=true',
- data,
- expect_errors=True
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ f'{url_base}?send_email=true',
+ data,
+ expect_errors=True
+ )
+ assert not notifications
assert res.status_code == 401
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- assert mock_send_grid.call_count == 0
    @pytest.mark.skip  # failing locally post conversion
def test_cookied_requests_can_create_and_email(
- self, mock_send_grid, app, user, email_unconfirmed, data, url_base):
+ self, app, user, email_unconfirmed, data, url_base):
         # NOTE: skipped tests were not exercised during the session refactor; they were only updated to fix imports
session = SessionStore()
session['auth_user_id'] = user._id
@@ -316,13 +318,15 @@ def test_cookied_requests_can_create_and_email(
app.set_cookie(settings.COOKIE_NAME, str(cookie))
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- res = app.post_json_api(
- f'{url_base}?send_email=true',
- data
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ f'{url_base}?send_email=true',
+ data
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
assert res.status_code == 201
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 1
- assert mock_send_grid.call_count == 1
    @pytest.mark.skip  # failing locally post conversion
@mock.patch('api.base.authentication.drf.OSFCASAuthentication.authenticate')
@@ -331,7 +335,7 @@ def test_cookied_requests_can_create_and_email(
not settings.DEV_MODE,
'DEV_MODE disabled, osf.users.create unavailable')
def test_properly_scoped_token_can_create_and_send_email(
- self, mock_auth, mock_send_grid, app, user, email_unconfirmed, data, url_base):
+ self, mock_auth, app, user, email_unconfirmed, data, url_base):
token = ApiOAuth2PersonalToken(
owner=user,
name='Authorized Token',
@@ -352,16 +356,18 @@ def test_properly_scoped_token_can_create_and_send_email(
mock_auth.return_value = user, mock_cas_resp
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- res = app.post_json_api(
- f'{url_base}?send_email=true',
- data,
- headers={'Authorization': f'Bearer {token.token_id}'}
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ f'{url_base}?send_email=true',
+ data,
+ headers={'Authorization': f'Bearer {token.token_id}'}
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
assert res.status_code == 201
assert res.json['data']['attributes']['username'] == email_unconfirmed
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 1
- assert mock_send_grid.call_count == 1
    @pytest.mark.skip  # failing locally post conversion
@mock.patch('api.base.authentication.drf.OSFCASAuthentication.authenticate')
@@ -370,7 +376,7 @@ def test_properly_scoped_token_can_create_and_send_email(
not settings.DEV_MODE,
'DEV_MODE disabled, osf.users.create unavailable')
def test_properly_scoped_token_does_not_send_email_without_kwarg(
- self, mock_auth, mock_send_grid, app, user, email_unconfirmed, data, url_base):
+ self, mock_auth, app, user, email_unconfirmed, data, url_base):
token = ApiOAuth2PersonalToken(
owner=user,
name='Authorized Token',
@@ -393,16 +399,17 @@ def test_properly_scoped_token_does_not_send_email_without_kwarg(
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- res = app.post_json_api(
- url_base,
- data,
- headers={'Authorization': f'Bearer {token.token_id}'}
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url_base,
+ data,
+ headers={'Authorization': f'Bearer {token.token_id}'}
+ )
+ assert not notifications
assert res.status_code == 201
assert res.json['data']['attributes']['username'] == email_unconfirmed
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 1
- assert mock_send_grid.call_count == 0
    @pytest.mark.skip  # failing locally post conversion
@mock.patch('api.base.authentication.drf.OSFCASAuthentication.authenticate')
@@ -411,7 +418,7 @@ def test_properly_scoped_token_does_not_send_email_without_kwarg(
not settings.DEV_MODE,
'DEV_MODE disabled, osf.users.create unavailable')
def test_properly_scoped_token_can_create_without_username_but_not_send_email(
- self, mock_auth, mock_send_grid, app, user, data, url_base):
+ self, mock_auth, app, user, data, url_base):
token = ApiOAuth2PersonalToken(
owner=user,
name='Authorized Token',
@@ -434,11 +441,13 @@ def test_properly_scoped_token_can_create_without_username_but_not_send_email(
data['data']['attributes'] = {'full_name': 'No Email'}
assert OSFUser.objects.filter(fullname='No Email').count() == 0
- res = app.post_json_api(
- f'{url_base}?send_email=true',
- data,
- headers={'Authorization': f'Bearer {token.token_id}'}
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ f'{url_base}?send_email=true',
+ data,
+ headers={'Authorization': f'Bearer {token.token_id}'}
+ )
+ assert not notifications
assert res.status_code == 201
username = res.json['data']['attributes']['username']
@@ -447,11 +456,10 @@ def test_properly_scoped_token_can_create_without_username_but_not_send_email(
except ValueError:
raise AssertionError('Username is not a valid UUID')
assert OSFUser.objects.filter(fullname='No Email').count() == 1
- assert mock_send_grid.call_count == 0
@mock.patch('api.base.authentication.drf.OSFCASAuthentication.authenticate')
def test_improperly_scoped_token_can_not_create_or_email(
- self, mock_auth, mock_send_grid, app, user, email_unconfirmed, data, url_base):
+ self, mock_auth, app, user, email_unconfirmed, data, url_base):
token = ApiOAuth2PersonalToken(
owner=user,
name='Unauthorized Token',
@@ -474,16 +482,17 @@ def test_improperly_scoped_token_can_not_create_or_email(
mock_auth.return_value = user, mock_cas_resp
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- res = app.post_json_api(
- f'{url_base}?send_email=true',
- data,
- headers={'Authorization': f'Bearer {token.token_id}'},
- expect_errors=True
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ f'{url_base}?send_email=true',
+ data,
+ headers={'Authorization': f'Bearer {token.token_id}'},
+ expect_errors=True
+ )
+ assert not notifications
assert res.status_code == 403
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- assert mock_send_grid.call_count == 0
    @pytest.mark.skip  # failing locally post conversion
@mock.patch('api.base.authentication.drf.OSFCASAuthentication.authenticate')
@@ -492,7 +501,7 @@ def test_improperly_scoped_token_can_not_create_or_email(
not settings.DEV_MODE,
'DEV_MODE disabled, osf.admin unavailable')
def test_admin_scoped_token_can_create_and_send_email(
- self, mock_auth, mock_send_grid, app, user, email_unconfirmed, data, url_base):
+ self, mock_auth, app, user, email_unconfirmed, data, url_base):
token = ApiOAuth2PersonalToken(
owner=user,
name='Admin Token',
@@ -513,13 +522,15 @@ def test_admin_scoped_token_can_create_and_send_email(
mock_auth.return_value = user, mock_cas_resp
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 0
- res = app.post_json_api(
- f'{url_base}?send_email=true',
- data,
- headers={'Authorization': f'Bearer {token.token_id}'}
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ f'{url_base}?send_email=true',
+ data,
+ headers={'Authorization': f'Bearer {token.token_id}'}
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
assert res.status_code == 201
assert res.json['data']['attributes']['username'] == email_unconfirmed
assert OSFUser.objects.filter(username=email_unconfirmed).count() == 1
- assert mock_send_grid.call_count == 1
diff --git a/api_tests/users/views/test_user_message_institutional_access.py b/api_tests/users/views/test_user_message_institutional_access.py
index 2f60c4ae726..aac978abeb0 100644
--- a/api_tests/users/views/test_user_message_institutional_access.py
+++ b/api_tests/users/views/test_user_message_institutional_access.py
@@ -1,4 +1,6 @@
import pytest
+
+from osf.models.notification_type import NotificationType
from osf.models.user_message import MessageTypes, UserMessage
from api.base.settings.defaults import API_BASE
from osf_tests.factories import (
@@ -7,9 +9,9 @@
)
from webtest import AppError
+from tests.utils import capture_notifications
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestUserMessageInstitutionalAccess:
"""
Tests for `UserMessage`.
@@ -84,31 +86,35 @@ def payload(self, institution, user):
}
}
- def test_institutional_admin_can_create_message(self, mock_send_grid, app, institutional_admin, institution, url_with_affiliation, payload):
+ def test_institutional_admin_can_create_message(self, app, institutional_admin, institution, url_with_affiliation, payload):
"""
Ensure an institutional admin can create a `UserMessage` with a `message` and `institution`.
"""
-
- res = app.post_json_api(
- url_with_affiliation,
- payload,
- auth=institutional_admin.auth
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url_with_affiliation,
+ payload,
+ auth=institutional_admin.auth
+ )
+ assert len(notifications) == 1
+ user_message = UserMessage.objects.get(sender=institutional_admin)
+ assert notifications[0]['kwargs']['user'].username == user_message.recipient.username
assert res.status_code == 201
data = res.json['data']
- user_message = UserMessage.objects.get(sender=institutional_admin)
-
assert user_message.message_text == payload['data']['attributes']['message_text']
assert user_message.institution == institution
- mock_send_grid.assert_called_once()
- assert mock_send_grid.call_args[1]['to_addr'] == user_message.recipient.username
assert user_message._id == data['id']
- def test_institutional_admin_can_not_create_message(self, mock_send_grid, app, institutional_admin_on_institution_without_access,
- institution_without_access, url_with_affiliation_on_institution_without_access,
- payload):
+ def test_institutional_admin_can_not_create_message(
+ self,
+ app,
+ institutional_admin_on_institution_without_access,
+ institution_without_access,
+ url_with_affiliation_on_institution_without_access,
+ payload
+ ):
"""
         Ensure an institutional admin cannot create a `UserMessage` with a `message` and `institution` which has 'institutional_request_access_enabled' set to False.
"""
@@ -193,7 +199,6 @@ def test_admin_cannot_message_user_outside_institution(
def test_cc_institutional_admin(
self,
- mock_send_grid,
app,
institutional_admin,
institution,
@@ -208,42 +213,46 @@ def test_cc_institutional_admin(
# Enable CC in the payload
payload['data']['attributes']['bcc_sender'] = True
- # Perform the API request
- res = app.post_json_api(
- url_with_affiliation,
- payload,
- auth=institutional_admin.auth,
- )
+ with capture_notifications() as notifications:
+ # Perform the API request
+ res = app.post_json_api(
+ url_with_affiliation,
+ payload,
+ auth=institutional_admin.auth,
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[0]['kwargs']['user'].username == user_with_affiliation.username
assert res.status_code == 201
user_message = UserMessage.objects.get()
-
assert user_message.is_sender_BCCed
-        # Two emails are sent during the CC but this is how the mock works `send_email` is called once.
-        assert mock_send_grid.call_args[1]['to_addr'] == user_with_affiliation.username
+        # Two emails go out when the BCC copy is enabled, but only one notification event is recorded above.
- def test_cc_field_defaults_to_false(self, mock_send_grid, app, institutional_admin, url_with_affiliation, user_with_affiliation, institution, payload):
+ def test_cc_field_defaults_to_false(self, app, institutional_admin, url_with_affiliation, user_with_affiliation, institution, payload):
"""
Ensure the `cc` field defaults to `false` when not provided in the payload.
"""
- res = app.post_json_api(url_with_affiliation, payload, auth=institutional_admin.auth)
+ with capture_notifications() as notifications:
+ res = app.post_json_api(url_with_affiliation, payload, auth=institutional_admin.auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[0]['kwargs']['user'].username == user_with_affiliation.username
assert res.status_code == 201
user_message = UserMessage.objects.get(sender=institutional_admin)
assert user_message.message_text == payload['data']['attributes']['message_text']
- assert mock_send_grid.call_args[1]['to_addr'] == user_with_affiliation.username
-
- def test_reply_to_header_set(self, mock_send_grid, app, institutional_admin, user_with_affiliation, institution, url_with_affiliation, payload):
+ def test_reply_to_header_set(self, app, institutional_admin, user_with_affiliation, institution, url_with_affiliation, payload):
"""
Ensure that the 'Reply-To' header is correctly set to the sender's email address.
"""
payload['data']['attributes']['reply_to'] = True
- res = app.post_json_api(
- url_with_affiliation,
- payload,
- auth=institutional_admin.auth,
- )
+ with capture_notifications() as notifications:
+ res = app.post_json_api(
+ url_with_affiliation,
+ payload,
+ auth=institutional_admin.auth,
+ )
assert res.status_code == 201
-
- assert mock_send_grid.call_args[1]['to_addr'] == user_with_affiliation.username
+        assert notifications[0]['kwargs']['user'].username == user_with_affiliation.username
diff --git a/api_tests/users/views/test_user_settings.py b/api_tests/users/views/test_user_settings.py
index eac3bc9fc0c..847576d9913 100644
--- a/api_tests/users/views/test_user_settings.py
+++ b/api_tests/users/views/test_user_settings.py
@@ -7,8 +7,10 @@
AuthUserFactory,
UserFactory,
)
-from osf.models import Email, NotableDomain
+from osf.models import Email, NotableDomain, NotificationType
from framework.auth.views import auth_email_logout
+from tests.utils import capture_notifications
+
@pytest.fixture()
def user_one():
@@ -25,7 +27,6 @@ def unconfirmed_address():
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestUserRequestExport:
@pytest.fixture()
@@ -41,11 +42,11 @@ def payload(self):
}
}
- def test_get(self, app, user_one, url, mock_notification_send):
+ def test_get(self, app, user_one, url):
res = app.get(url, auth=user_one.auth, expect_errors=True)
assert res.status_code == 405
- def test_post(self, mock_send_grid, app, user_one, user_two, url, payload):
+ def test_post(self, app, user_one, user_two, url, payload):
# Logged out
res = app.post_json_api(url, payload, expect_errors=True)
assert res.status_code == 401
@@ -56,20 +57,23 @@ def test_post(self, mock_send_grid, app, user_one, user_two, url, payload):
# Logged in
assert user_one.email_last_sent is None
- res = app.post_json_api(url, payload, auth=user_one.auth)
+ with capture_notifications() as notification:
+ res = app.post_json_api(url, payload, auth=user_one.auth)
+ assert len(notification) == 1
+ assert notification[0]['type'] == NotificationType.Type.USER_ACCOUNT_EXPORT_FORM
assert res.status_code == 204
user_one.reload()
assert user_one.email_last_sent is not None
- assert mock_send_grid.call_count == 1
- def test_post_invalid_type(self, mock_send_grid, app, user_one, url, payload):
+ def test_post_invalid_type(self, app, user_one, url, payload):
assert user_one.email_last_sent is None
payload['data']['type'] = 'Invalid Type'
- res = app.post_json_api(url, payload, auth=user_one.auth, expect_errors=True)
+ with capture_notifications() as notification:
+ res = app.post_json_api(url, payload, auth=user_one.auth, expect_errors=True)
+ assert not notification
assert res.status_code == 409
user_one.reload()
assert user_one.email_last_sent is None
- assert mock_send_grid.call_count == 0
def test_exceed_throttle(self, app, user_one, url, payload):
assert user_one.email_last_sent is None
diff --git a/api_tests/users/views/test_user_settings_detail.py b/api_tests/users/views/test_user_settings_detail.py
index cc02e6ae145..02fa4c2e646 100644
--- a/api_tests/users/views/test_user_settings_detail.py
+++ b/api_tests/users/views/test_user_settings_detail.py
@@ -4,6 +4,7 @@
from osf_tests.factories import (
AuthUserFactory,
)
+from tests.utils import capture_notifications
from website.settings import MAILCHIMP_GENERAL_LIST, OSF_HELP_LIST
@@ -227,7 +228,6 @@ def test_unauthorized_patch_403(self, app, url, payload, user_two):
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestUpdateRequestedDeactivation:
@pytest.fixture()
@@ -271,14 +271,15 @@ def test_patch_requested_deactivation(self, app, user_one, user_two, url, payloa
user_one.reload()
assert user_one.requested_deactivation is False
- def test_patch_invalid_type(self, mock_send_grid, app, user_one, url, payload):
+ def test_patch_invalid_type(self, app, user_one, url, payload):
assert user_one.email_last_sent is None
payload['data']['type'] = 'Invalid Type'
- res = app.patch_json_api(url, payload, auth=user_one.auth, expect_errors=True)
+ with capture_notifications() as notifications:
+ res = app.patch_json_api(url, payload, auth=user_one.auth, expect_errors=True)
+ assert not notifications
assert res.status_code == 409
user_one.reload()
assert user_one.email_last_sent is None
- assert mock_send_grid.call_count == 0
def test_exceed_throttle(self, app, user_one, url, payload):
assert user_one.email_last_sent is None
diff --git a/api_tests/users/views/test_user_settings_reset_password.py b/api_tests/users/views/test_user_settings_reset_password.py
index 94730ec4fa9..0dbdbaec996 100644
--- a/api_tests/users/views/test_user_settings_reset_password.py
+++ b/api_tests/users/views/test_user_settings_reset_password.py
@@ -3,13 +3,15 @@
from api.base.settings.defaults import API_BASE
from api.base.settings import CSRF_COOKIE_NAME
+from osf.models import NotificationType
from osf_tests.factories import (
UserFactory,
)
from django.middleware import csrf
-@pytest.mark.usefixtures('mock_send_grid')
-@pytest.mark.usefixtures('mock_notification_send')
+from tests.utils import capture_notifications
+
+
class TestResetPassword:
@pytest.fixture()
@@ -28,20 +30,22 @@ def url(self):
def csrf_token(self):
return csrf._mask_cipher_secret(csrf._get_new_csrf_string())
- def test_get(self, mock_notification_send, app, url, user_one):
+ def test_get(self, app, url, user_one):
encoded_email = urllib.parse.quote(user_one.email)
url = f'{url}?email={encoded_email}'
- res = app.get(url)
+ with capture_notifications() as notification:
+ res = app.get(url)
+ assert len(notification) == 1
+ assert notification[0]['type'] == NotificationType.Type.RESET_PASSWORD_CONFIRMATION
assert res.status_code == 200
-
user_one.reload()
- assert mock_notification_send.called
- def test_get_invalid_email(self, mock_send_grid, app, url):
+ def test_get_invalid_email(self, app, url):
         url = f'{url}?email=invalid_email'
- res = app.get(url)
+ with capture_notifications() as notification:
+ res = app.get(url)
+ assert not notification
assert res.status_code == 200
- assert not mock_send_grid.called
def test_post(self, app, url, user_one, csrf_token):
app.set_cookie(CSRF_COOKIE_NAME, csrf_token)
diff --git a/conftest.py b/conftest.py
index f7b7bf72b07..b30cb6271a1 100644
--- a/conftest.py
+++ b/conftest.py
@@ -363,22 +363,6 @@ def helpful_thing(self):
yield from rolledback_transaction('function_transaction')
-@pytest.fixture()
-def mock_send_grid():
- with mock.patch.object(website_settings, 'USE_EMAIL', True):
- with mock.patch.object(website_settings, 'USE_CELERY', False):
- with mock.patch('framework.email.tasks.send_email') as mock_sendgrid:
- mock_sendgrid.return_value = True
- yield mock_sendgrid
-
-
-def start_mock_send_grid(test_case):
- patcher = mock.patch('framework.email.tasks.send_email')
- mocked_send = patcher.start()
- test_case.addCleanup(patcher.stop)
- mocked_send.return_value = True
- return mocked_send
-
@pytest.fixture
def mock_gravy_valet_get_verified_links():
"""This fixture is used to mock a GV request which is made during node's identifier update. More specifically, when
@@ -394,23 +378,6 @@ def mock_gravy_valet_get_verified_links():
yield mock_get_verified_links
-@pytest.fixture()
-def mock_notification_send():
- with mock.patch.object(website_settings, 'USE_EMAIL', True):
- with mock.patch.object(website_settings, 'USE_CELERY', False):
- with mock.patch('osf.models.notification.Notification.send') as mock_emit:
- mock_emit.return_value = None # Or True, if needed
- yield mock_emit
-
-
-def start_mock_notification_send(test_case):
- patcher = mock.patch('osf.models.notification.Notification.send')
- mocked_emit = patcher.start()
- test_case.addCleanup(patcher.stop)
- mocked_emit.return_value = None
- return mocked_emit
-
-
@pytest.fixture(autouse=True)
def load_notification_types(db, *args, **kwargs):
populate_notification_types(*args, **kwargs)
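
Note on the migration above: the `mock_send_grid` and `mock_notification_send` fixtures removed from conftest.py are replaced throughout these tests by the `capture_notifications()` helper imported from `tests.utils`. That helper is not included in this diff, so the sketch below is only an illustration inferred from how the tests use it: a context manager yielding a list of dicts with 'type' and 'kwargs' keys. The patch target and the value stored under 'type' are assumptions, not the actual implementation.

    # Hypothetical sketch of tests.utils.capture_notifications (not part of this diff).
    # Only the record shape ({'type': ..., 'kwargs': {...}}) is taken from the tests;
    # the patch target and the attribute used for 'type' are assumptions.
    from contextlib import contextmanager
    from unittest import mock


    @contextmanager
    def capture_notifications():
        captured = []

        def _record(self, *args, **kwargs):
            # Record the would-be notification instead of sending any email.
            captured.append({'type': self.name, 'args': args, 'kwargs': kwargs})

        with mock.patch('osf.models.notification_type.NotificationType.emit', _record):
            yield captured

Tests then assert on the captured list (length, 'type', and 'kwargs') rather than on SendGrid call counts, as the hunks above show.
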
diff --git a/docker-compose.yml b/docker-compose.yml
index e9ba66bc37e..7c0f08992d1 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -67,9 +67,9 @@ services:
- elasticsearch_data_vol:/usr/share/elasticsearch/data
stdin_open: true
- # Temporary: Remove when we've upgraded to ES6
elasticsearch6:
- image: docker.elastic.co/elasticsearch/elasticsearch:6.3.1
+ image: quay.io/centerforopenscience/elasticsearch:es6-arm-6.3.1
+ platform: linux/arm64
ports:
- 9201:9200
volumes:
diff --git a/framework/email/__init__.py b/framework/email/__init__.py
deleted file mode 100644
index e69de29bb2d..00000000000
diff --git a/framework/email/tasks.py b/framework/email/tasks.py
deleted file mode 100644
index cf43395222e..00000000000
--- a/framework/email/tasks.py
+++ /dev/null
@@ -1,227 +0,0 @@
-import logging
-import smtplib
-from base64 import b64encode
-from email.mime.text import MIMEText
-from io import BytesIO
-
-from sendgrid import SendGridAPIClient
-from sendgrid.helpers.mail import (
- Mail,
- Bcc,
- ReplyTo,
- Category,
- Attachment,
- FileContent,
- Email,
- To,
- Personalization,
- Cc,
- FileName,
- Disposition,
-)
-
-
-from framework import sentry
-from framework.celery_tasks import app
-from website import settings
-
-logger = logging.getLogger(__name__)
-
-
-@app.task
-def send_email(
- from_addr: str,
- to_addr: str,
- subject: str,
- message: str,
- reply_to: bool = False,
- ttls: bool = True,
- login: bool = True,
- bcc_addr: [] = None,
- username: str = None,
- password: str = None,
- categories=None,
- attachment_name: str = None,
- attachment_content: str | bytes | BytesIO = None,
-):
- """Send email to specified destination.
- Email is sent from the email specified in FROM_EMAIL settings in the
- settings module.
-
- Uses the Sendgrid API if ``settings.SENDGRID_API_KEY`` is set.
-
- :param from_addr: A string, the sender email
- :param to_addr: A string, the recipient
- :param subject: subject of email
- :param message: body of message
- :param categories: Categories to add to the email using SendGrid's
- SMTPAPI. Used for email analytics.
- See https://sendgrid.com/docs/User_Guide/Statistics/categories.html
- This parameter is only respected if using the Sendgrid API.
- ``settings.SENDGRID_API_KEY`` must be set.
-
- :return: True if successful
- """
- if not settings.USE_EMAIL:
- return
- if settings.SENDGRID_API_KEY:
- return _send_with_sendgrid(
- from_addr=from_addr,
- to_addr=to_addr,
- subject=subject,
- message=message,
- categories=categories,
- attachment_name=attachment_name,
- attachment_content=attachment_content,
- reply_to=reply_to,
- bcc_addr=bcc_addr,
- )
- else:
- return _send_with_smtp(
- from_addr=from_addr,
- to_addr=to_addr,
- subject=subject,
- message=message,
- ttls=ttls,
- login=login,
- username=username,
- password=password,
- reply_to=reply_to,
- bcc_addr=bcc_addr,
- )
-
-
-def _send_with_smtp(
- from_addr,
- to_addr,
- subject,
- message,
- ttls=True,
- login=True,
- username=None,
- password=None,
- bcc_addr=None,
- reply_to=None,
-):
- username = username or settings.MAIL_USERNAME
- password = password or settings.MAIL_PASSWORD
-
- if login and (username is None or password is None):
- logger.error('Mail username and password not set; skipping send.')
- return False
-
- msg = MIMEText(
- message,
- 'html',
- _charset='utf-8',
- )
- msg['Subject'] = subject
- msg['From'] = from_addr
- msg['To'] = to_addr
-
- if reply_to:
- msg['Reply-To'] = reply_to
-
- # Combine recipients for SMTP
- recipients = [to_addr] + (bcc_addr or [])
-
- # Establish SMTP connection and send the email
- with smtplib.SMTP(settings.MAIL_SERVER) as server:
- server.ehlo()
- if ttls:
- server.starttls()
- server.ehlo()
- if login:
- server.login(username, password)
- server.sendmail(
- from_addr=from_addr,
- to_addrs=recipients,
- msg=msg.as_string()
- )
- return True
-
-
-def _send_with_sendgrid(
- from_addr: str,
- to_addr: str,
- subject: str,
- message: str,
- categories=None,
- attachment_name: str = None,
- attachment_content=None,
- cc_addr=None,
- bcc_addr=None,
- reply_to=None,
- client=None,
-):
- in_allowed_list = to_addr in settings.SENDGRID_EMAIL_WHITELIST
- if settings.SENDGRID_WHITELIST_MODE and not in_allowed_list:
- sentry.log_message(
- f'SENDGRID_WHITELIST_MODE is True. Failed to send emails to non-whitelisted recipient {to_addr}.'
- )
- return False
-
- client = client or SendGridAPIClient(settings.SENDGRID_API_KEY)
- mail = Mail(
- from_email=Email(from_addr),
- html_content=message,
- subject=subject,
- )
-
- # Personalization to handle To, CC, and BCC sendgrid client concept
- personalization = Personalization()
-
- personalization.add_to(To(to_addr))
-
- if cc_addr:
- if isinstance(cc_addr, str):
- cc_addr = [cc_addr]
- for email in cc_addr:
- personalization.add_cc(Cc(email))
-
- if bcc_addr:
- if isinstance(bcc_addr, str):
- bcc_addr = [bcc_addr]
- for email in bcc_addr:
- personalization.add_bcc(Bcc(email))
-
- if reply_to:
- mail.reply_to = ReplyTo(reply_to)
-
- mail.add_personalization(personalization)
-
- if categories:
- mail.add_category([Category(x) for x in categories])
-
- if attachment_name and attachment_content:
- attachment = Attachment(
- file_content=FileContent(b64encode(attachment_content).decode()),
- file_name=FileName(attachment_name),
- disposition=Disposition('attachment')
- )
- mail.add_attachment(attachment)
-
- response = client.send(mail)
- if response.status_code not in (200, 201, 202):
- sentry.log_message(
- f'{response.status_code} error response from sendgrid.'
- f'from_addr: {from_addr}\n'
- f'to_addr: {to_addr}\n'
- f'subject: {subject}\n'
- 'mimetype: html\n'
- f'message: {response.body[:30]}\n'
- f'categories: {categories}\n'
- f'attachment_name: {attachment_name}\n'
- )
- else:
- return True
-
-def _content_to_bytes(attachment_content: BytesIO | str | bytes) -> bytes:
- if isinstance(attachment_content, bytes):
- return attachment_content
- elif isinstance(attachment_content, BytesIO):
- return attachment_content.getvalue()
- elif isinstance(attachment_content, str):
- return attachment_content.encode()
- else:
- return str(attachment_content).encode()
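With `framework/email/tasks.py` removed, direct `send_email` calls are replaced by the notification pipeline. A minimal sketch of the intended call path, using the type names added in notifications.yaml below; the helper function and its arguments are illustrative, not part of this diff:

# Illustrative sketch: emit a Boa-failure notification instead of calling
# framework.email.tasks.send_email directly.
from osf.models import NotificationType, OSFUser

def notify_boa_job_failure(user: OSFUser) -> None:
    NotificationType.objects.get(
        name=NotificationType.Type.ADDONS_BOA_JOB_FAILURE
    ).emit(
        user=user,
        event_context={'fullname': user.fullname},
        email_context={'reply_to': None},  # optional delivery headers
    )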
diff --git a/notifications.yaml b/notifications.yaml
index 8b3e1fc7ea3..2e8b08ee6f6 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -326,10 +326,14 @@ notification_types:
object_content_type_model_name: collectionsubmission
template: 'website/templates/emails/new_pending_submissions.html.mako'
#### DESK
- - name: desk_archive_job_exceeded
- __docs__: Archive job failed due to size exceeded. Sent to support desk.
+ - name: addon_boa_job_failure
+ __docs__: Sent when a Boa addon job fails.
object_content_type_model_name: desk
- template: 'website/templates/emails/new_pending_submissions.html.mako'
+ template: 'website/templates/emails/addon_boa_job_failure.html.mako'
+ - name: addon_boa_job_complete
+ __docs__: Sent when a Boa addon job completes.
+ object_content_type_model_name: desk
+ template: 'website/templates/emails/addon_boa_job_complete.html.mako'
- name: desk_archive_job_copy_error
__docs__: Archive job failed due to copy error. Sent to support desk.
object_content_type_model_name: desk
diff --git a/osf/email/__init__.py b/osf/email/__init__.py
index 2d35db074c1..9ac0a16e0b4 100644
--- a/osf/email/__init__.py
+++ b/osf/email/__init__.py
@@ -3,7 +3,7 @@
from email.mime.text import MIMEText
import waffle
-from sendgrid import SendGridAPIClient
+from sendgrid import SendGridAPIClient, Personalization, To, Cc, Category, ReplyTo, Bcc
from sendgrid.helpers.mail import Mail
from osf import features
@@ -11,7 +11,7 @@
from django.core.mail import EmailMessage, get_connection
-def send_email_over_smtp(to_addr, notification_type, context):
+def send_email_over_smtp(to_addr, notification_type, context, email_context):
"""Send an email notification using SMTP. This is typically not used in productions as other 3rd party mail services
are preferred. This is to be used for tests and on staging environments and special situations.
@@ -19,6 +19,7 @@ def send_email_over_smtp(to_addr, notification_type, context):
to_addr (str): The recipient's email address.
notification_type (str): The subject of the notification.
context (dict): The email content context.
+ email_context (dict): Extra delivery options for the message, such as BCC addresses or a reply-to header.
"""
if not settings.MAIL_SERVER:
raise NotImplementedError('MAIL_SERVER is not set')
@@ -53,7 +54,7 @@ def send_email_over_smtp(to_addr, notification_type, context):
)
-def send_email_with_send_grid(to_addr, notification_type, context):
+def send_email_with_send_grid(to_addr, notification_type, context, email_context):
"""Send an email notification using SendGrid.
Args:
@@ -70,6 +71,39 @@ def send_email_with_send_grid(to_addr, notification_type, context):
subject=notification_type,
html_content=context.get('message', '')
)
+ in_allowed_list = to_addr in settings.SENDGRID_EMAIL_WHITELIST
+ if settings.SENDGRID_WHITELIST_MODE and not in_allowed_list:
+ from framework import sentry
+
+ sentry.log_message(
+ f'SENDGRID_WHITELIST_MODE is True. Failed to send emails to non-whitelisted recipient {to_addr}.'
+ )
+ return False
+
+ # Personalization to handle To, CC, and BCC sendgrid client concept
+ personalization = Personalization()
+
+ personalization.add_to(To(to_addr))
+
+ if cc_addr := email_context.get('cc_addr'):
+ if isinstance(cc_addr, str):
+ cc_addr = [cc_addr]
+ for email in cc_addr:
+ personalization.add_cc(Cc(email))
+
+ if bcc_addr := email_context.get('bcc_addr'):
+ if isinstance(bcc_addr, str):
+ bcc_addr = [bcc_addr]
+ for email in bcc_addr:
+ personalization.add_bcc(Bcc(email))
+
+ if reply_to := email_context.get('reply_to'):
+ message.reply_to = ReplyTo(reply_to)
+
+ message.add_personalization(personalization)
+
+ if email_categories := email_context.get('email_categories'):
+ message.add_category([Category(x) for x in email_categories])
try:
sg = SendGridAPIClient(settings.SENDGRID_API_KEY)
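For reference, a minimal sketch of how the new `email_context` argument is consumed here; the addresses and message content are illustrative, not taken from the codebase:

# Illustrative call; in production this is invoked from Notification.send().
from osf import email

email.send_email_with_send_grid(
    to_addr='author@example.com',
    notification_type='addon_boa_job_complete',
    context={'message': '<p>Your Boa job finished.</p>'},
    email_context={
        'cc_addr': 'collaborator@example.com',   # str or list accepted
        'bcc_addr': ['submitter@example.com'],
        'reply_to': 'submitter@example.com',
        'email_categories': ['boa'],
    },
)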
diff --git a/osf/models/notification.py b/osf/models/notification.py
index 4294eb797eb..1b749af2b9b 100644
--- a/osf/models/notification.py
+++ b/osf/models/notification.py
@@ -18,9 +18,15 @@ class Notification(models.Model):
seen = models.DateTimeField(null=True, blank=True)
created = models.DateTimeField(auto_now_add=True)
- def send(self, protocol_type='email', destination_address=None):
- if not settings.USE_EMAIL:
- return
+ def send(
+ self,
+ protocol_type='email',
+ destination_address=None,
+ email_context=None,
+ ):
+ """
+
+ """
if not protocol_type == 'email':
raise NotImplementedError(f'Protocol type {protocol_type}. Email notifications are only implemented.')
@@ -30,7 +36,8 @@ def send(self, protocol_type='email', destination_address=None):
email.send_email_over_smtp(
recipient_address,
self.subscription.notification_type,
- self.event_context
+ self.event_context,
+ email_context
)
elif protocol_type == 'email' and settings.DEV_MODE:
if not api_settings.CI_ENV:
@@ -39,12 +46,14 @@ def send(self, protocol_type='email', destination_address=None):
f"\nto={recipient_address}"
f"\ntype={self.subscription.notification_type.name}"
f"\ncontext={self.event_context}"
+ f"\nemail_context={self.email_context}"
)
elif protocol_type == 'email':
email.send_email_with_send_grid(
self.subscription.user,
self.subscription.notification_type,
- self.event_context
+ self.event_context,
+ email_context
)
else:
raise NotImplementedError(f'protocol `{protocol_type}` is not supported.')
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 7c651c511b5..66e58281db4 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -26,6 +26,9 @@ class NotificationType(models.Model):
class Type(str, Enum):
# Desk notifications
+ ADDONS_BOA_JOB_FAILURE = 'addon_boa_job_failure'
+ ADDONS_BOA_JOB_COMPLETE = 'addon_boa_job_complete'
+
DESK_REQUEST_EXPORT = 'desk_request_export'
DESK_REQUEST_DEACTIVATION = 'desk_request_deactivation'
DESK_OSF_SUPPORT_EMAIL = 'desk_osf_support_email'
@@ -202,7 +205,8 @@ def emit(
destination_address=None,
subscribed_object=None,
message_frequency='instantly',
- event_context=None
+ event_context=None,
+ email_context=None,
):
"""Emit a notification to a user by creating Notification and NotificationSubscription objects.
@@ -212,6 +216,7 @@ def emit(
subscribed_object (optional): The object the subscription is related to.
message_frequency (optional): Initializing message frequency.
event_context (dict, optional): Context for rendering the notification template.
+ email_context (dict, optional): Additional email delivery options, such as BCC addresses and reply-to.
"""
from osf.models.notification_subscription import NotificationSubscription
subscription, created = NotificationSubscription.objects.get_or_create(
@@ -225,7 +230,10 @@ def emit(
Notification.objects.create(
subscription=subscription,
event_context=event_context
- ).send(destination_address=destination_address)
+ ).send(
+ destination_address=destination_address,
+ email_context=email_context
+ )
def add_user_to_subscription(self, user, *args, **kwargs):
"""
diff --git a/osf/models/user_message.py b/osf/models/user_message.py
index ac77cefe629..126e4be5bd6 100644
--- a/osf/models/user_message.py
+++ b/osf/models/user_message.py
@@ -3,8 +3,8 @@
from django.db.models.signals import post_save
from django.dispatch import receiver
+from . import NotificationType
from .base import BaseModel, ObjectIDMixin
-from website.mails import send_mail, USER_MESSAGE_INSTITUTIONAL_ACCESS_REQUEST
class MessageTypes(models.TextChoices):
@@ -31,7 +31,7 @@ def get_template(cls: Type['MessageTypes'], message_type: str) -> str:
str: The email template string for the specified message type.
"""
return {
- cls.INSTITUTIONAL_REQUEST: USER_MESSAGE_INSTITUTIONAL_ACCESS_REQUEST
+ cls.INSTITUTIONAL_REQUEST: NotificationType.Type.NODE_INSTITUTIONAL_ACCESS_REQUEST
}[message_type]
@@ -84,18 +84,20 @@ def send_institution_request(self) -> None:
"""
Sends an institutional access request email to the recipient of the message.
"""
- send_mail(
- mail=MessageTypes.get_template(self.message_type),
- to_addr=self.recipient.username,
- bcc_addr=[self.sender.username] if self.is_sender_BCCed else None,
- reply_to=self.sender.username if self.reply_to else None,
+ NotificationType.objects.get(
+ name=MessageTypes.get_template(self.message_type)
+ ).emit(
user=self.recipient,
- **{
+ event_context={
'sender': self.sender,
'recipient': self.recipient,
'message_text': self.message_text,
'institution': self.institution,
},
+ email_context={
+ 'bcc_addr': [self.sender.username] if self.is_sender_BCCed else None,
+ 'reply_to': self.sender.username if self.reply_to else None,
+ }
)
diff --git a/osf_tests/management_commands/test_check_crossref_dois.py b/osf_tests/management_commands/test_check_crossref_dois.py
index 993c7e6731e..802ce4fde0b 100644
--- a/osf_tests/management_commands/test_check_crossref_dois.py
+++ b/osf_tests/management_commands/test_check_crossref_dois.py
@@ -4,6 +4,10 @@
import json
from datetime import timedelta
import responses
+
+from osf.models import NotificationType
+from tests.utils import capture_notifications
+
HERE = os.path.dirname(os.path.abspath(__file__))
@@ -14,7 +18,6 @@
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestCheckCrossrefDOIs:
@pytest.fixture()
@@ -61,7 +64,9 @@ def test_check_crossref_dois(self, crossref_response, stuck_preprint, preprint):
assert stuck_preprint.identifiers.count() == 1
assert stuck_preprint.identifiers.first().value == doi
- def test_report_stuck_dois(self, mock_send_grid, stuck_preprint):
- report_stuck_dois(dry_run=False)
+ def test_report_stuck_dois(self, stuck_preprint):
+ with capture_notifications() as notifications:
+ report_stuck_dois(dry_run=False)
- mock_send_grid.assert_called()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_REQUEST_ACCESS_DENIED
diff --git a/osf_tests/management_commands/test_email_all_users.py b/osf_tests/management_commands/test_email_all_users.py
index 14df656ee52..9141e6b50d4 100644
--- a/osf_tests/management_commands/test_email_all_users.py
+++ b/osf_tests/management_commands/test_email_all_users.py
@@ -2,11 +2,13 @@
from django.utils import timezone
+from osf.models import NotificationType
from osf_tests.factories import UserFactory
from osf.management.commands.email_all_users import email_all_users
+from tests.utils import capture_notifications
+
-@pytest.mark.usefixtures('mock_send_grid')
class TestEmailAllUsers:
@pytest.fixture()
@@ -41,25 +43,29 @@ def unregistered_user(self):
return UserFactory(is_registered=False)
@pytest.mark.django_db
- def test_email_all_users_dry(self, mock_send_grid, superuser):
- email_all_users('TOU_NOTIF', dry_run=True)
-
- mock_send_grid.assert_called()
+ def test_email_all_users_dry(self, superuser):
+ with capture_notifications() as notifications:
+ email_all_users('TOU_NOTIF', dry_run=True)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
@pytest.mark.django_db
def test_dont_email_inactive_users(
- self, mock_send_grid, deleted_user, inactive_user, unconfirmed_user, unregistered_user):
-
- email_all_users('TOU_NOTIF')
+ self, deleted_user, inactive_user, unconfirmed_user, unregistered_user):
- mock_send_grid.assert_not_called()
+ with capture_notifications() as notifications:
+ email_all_users('TOU_NOTIF')
+ assert not notifications
@pytest.mark.django_db
- def test_email_all_users_offset(self, mock_send_grid, user, user2):
- email_all_users('TOU_NOTIF', offset=1, start_id=0)
+ def test_email_all_users_offset(self, user, user2):
+ with capture_notifications() as notifications:
+ email_all_users('TOU_NOTIF', offset=1, start_id=0)
- email_all_users('TOU_NOTIF', offset=1, start_id=1)
+ email_all_users('TOU_NOTIF', offset=1, start_id=1)
- email_all_users('TOU_NOTIF', offset=1, start_id=2)
+ email_all_users('TOU_NOTIF', offset=1, start_id=2)
- assert mock_send_grid.call_count == 2
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[1]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
diff --git a/osf_tests/test_archiver.py b/osf_tests/test_archiver.py
index 65ebc719789..f653b20ea25 100644
--- a/osf_tests/test_archiver.py
+++ b/osf_tests/test_archiver.py
@@ -22,7 +22,7 @@
from website.archiver import listeners
from website.archiver.tasks import * # noqa: F403
-from osf.models import Guid, RegistrationSchema, Registration
+from osf.models import Guid, RegistrationSchema, Registration, NotificationType
from osf.models.archive import ArchiveTarget, ArchiveJob
from osf.models.base import generate_object_id
from osf.utils.migrations import map_schema_to_schemablocks
@@ -32,8 +32,7 @@
from osf_tests import factories
from tests.base import OsfTestCase, fake
from tests import utils as test_utils
-from tests.utils import unique as _unique
-from conftest import start_mock_send_grid
+from tests.utils import unique as _unique, capture_notifications
pytestmark = pytest.mark.django_db
@@ -721,45 +720,49 @@ def test_archive_success_same_file_in_component(self):
assert child_reg._id in question['extra'][0]['viewUrl']
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class TestArchiverUtils(ArchiverTestCase):
- def setUp(self):
- super().setUp()
- self.mock_send_grid = start_mock_send_grid(self)
-
def test_handle_archive_fail(self):
- archiver_utils.handle_archive_fail(
- ARCHIVER_NETWORK_ERROR,
- self.src,
- self.dst,
- self.user,
- {}
- )
- assert self.mock_send_grid.call_count == 2
+ with capture_notifications() as notifications:
+ archiver_utils.handle_archive_fail(
+ ARCHIVER_NETWORK_ERROR,
+ self.src,
+ self.dst,
+ self.user,
+ {}
+ )
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[1]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
self.dst.reload()
assert self.dst.is_deleted
def test_handle_archive_fail_copy(self):
- archiver_utils.handle_archive_fail(
- ARCHIVER_NETWORK_ERROR,
- self.src,
- self.dst,
- self.user,
- {}
- )
- assert self.mock_send_grid.call_count == 2
+ with capture_notifications() as notifications:
+ archiver_utils.handle_archive_fail(
+ ARCHIVER_NETWORK_ERROR,
+ self.src,
+ self.dst,
+ self.user,
+ {}
+ )
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[1]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
def test_handle_archive_fail_size(self):
- archiver_utils.handle_archive_fail(
- ARCHIVER_SIZE_EXCEEDED,
- self.src,
- self.dst,
- self.user,
- {}
- )
- assert self.mock_send_grid.call_count == 2
+ with capture_notifications() as notifications:
+ archiver_utils.handle_archive_fail(
+ ARCHIVER_SIZE_EXCEEDED,
+ self.src,
+ self.dst,
+ self.user,
+ {}
+ )
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[1]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
def test_aggregate_file_tree_metadata(self):
a_stat_result = archiver_utils.aggregate_file_tree_metadata('dropbox', FILE_TREE, self.user)
@@ -846,14 +849,9 @@ def test_get_file_map_memoization(self):
archiver_utils.get_file_map(node)
assert mock_get_file_tree.call_count == call_count
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class TestArchiverListeners(ArchiverTestCase):
- def setUp(self):
- super().setUp()
- self.mock_send_grid = start_mock_send_grid(self)
-
@mock.patch('website.archiver.tasks.archive')
@mock.patch('website.archiver.utils.before_archive')
def test_after_register(self, mock_before_archive, mock_archive):
@@ -905,8 +903,9 @@ def test_archive_callback_pending(self, mock_delay):
)
self.dst.archive_job.save()
with mock.patch('website.archiver.utils.handle_archive_fail') as mock_fail:
- listeners.archive_callback(self.dst)
- assert not self.mock_send_grid.called
+ with capture_notifications() as notifications:
+ listeners.archive_callback(self.dst)
+ assert not notifications
assert not mock_fail.called
assert mock_delay.called
@@ -914,8 +913,9 @@ def test_archive_callback_pending(self, mock_delay):
def test_archive_callback_done_success(self, mock_archive_success):
self.dst.archive_job.update_target('osfstorage', ARCHIVER_SUCCESS)
self.dst.archive_job.save()
- listeners.archive_callback(self.dst)
- assert self.mock_send_grid.call_count == 0
+ with capture_notifications() as notifications:
+ listeners.archive_callback(self.dst)
+ assert not notifications
@mock.patch('website.archiver.tasks.archive_success.delay')
def test_archive_callback_done_embargoed(self, mock_archive_success):
@@ -929,8 +929,9 @@ def test_archive_callback_done_embargoed(self, mock_archive_success):
self.dst.embargo_registration(self.user, end_date)
self.dst.archive_job.update_target('osfstorage', ARCHIVER_SUCCESS)
self.dst.save()
- listeners.archive_callback(self.dst)
- assert self.mock_send_grid.call_count == 0
+ with capture_notifications() as notifications:
+ listeners.archive_callback(self.dst)
+ assert not notifications
def test_archive_callback_done_errors(self):
self.dst.archive_job.update_target('osfstorage', ARCHIVER_FAILURE)
@@ -1021,16 +1022,19 @@ def test_archive_callback_on_tree_sends_only_one_email(self, mock_arhive_success
node.archive_job.update_target('osfstorage', ARCHIVER_INITIATED)
rchild.archive_job.update_target('osfstorage', ARCHIVER_SUCCESS)
rchild.save()
- listeners.archive_callback(rchild)
- assert not self.mock_send_grid.called
+ with capture_notifications() as notifications:
+ listeners.archive_callback(rchild)
+ assert not notifications
reg.archive_job.update_target('osfstorage', ARCHIVER_SUCCESS)
reg.save()
- listeners.archive_callback(reg)
- assert not self.mock_send_grid.called
+ with capture_notifications() as notifications:
+ listeners.archive_callback(reg)
+ assert not notifications
rchild2.archive_job.update_target('osfstorage', ARCHIVER_SUCCESS)
rchild2.save()
- listeners.archive_callback(rchild2)
- assert not self.mock_send_grid.called
+ with capture_notifications() as notifications:
+ listeners.archive_callback(rchild2)
+ assert not notifications
class TestArchiverScripts(ArchiverTestCase):
@@ -1078,14 +1082,9 @@ def test_find_failed_registrations(self):
assert pk not in failed
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class TestArchiverBehavior(OsfTestCase):
- def setUp(self):
- super().setUp()
- self.mock_send_grid = start_mock_send_grid(self)
-
@mock.patch('osf.models.AbstractNode.update_search')
def test_archiving_registrations_not_added_to_search_before_archival(self, mock_update_search):
proj = factories.ProjectFactory()
diff --git a/osf_tests/test_collection.py b/osf_tests/test_collection.py
index c28dea3eb99..0e39c011f65 100644
--- a/osf_tests/test_collection.py
+++ b/osf_tests/test_collection.py
@@ -5,8 +5,9 @@
from framework.auth import Auth
-from osf.models import Collection
+from osf.models import Collection, NotificationType
from osf.exceptions import NodeStateError
+from tests.utils import capture_notifications
from website.views import find_bookmark_collection
from .factories import (
UserFactory,
@@ -71,7 +72,6 @@ def test_can_remove_root_folder_structure_without_cascading(self, user, auth):
@pytest.mark.enable_bookmark_creation
-@pytest.mark.usefixtures('mock_send_grid')
class TestImplicitRemoval:
@pytest.fixture
@@ -126,22 +126,23 @@ def test_node_removed_from_collection_on_privacy_change(self, auth, collected_no
assert associated_collections.filter(collection=bookmark_collection).exists()
@mock.patch('osf.models.node.Node.check_privacy_change_viability', mock.Mock()) # mocks the storage usage limits
- def test_node_removed_from_collection_on_privacy_change_notify(self, auth, provider_collected_node, bookmark_collection, mock_send_grid):
+ def test_node_removed_from_collection_on_privacy_change_notify(self, auth, provider_collected_node, bookmark_collection):
associated_collections = provider_collected_node.guids.first().collectionsubmission_set
assert associated_collections.count() == 3
- mock_send_grid.reset_mock()
- provider_collected_node.set_privacy('private', auth=auth)
- assert mock_send_grid.called
- assert len(mock_send_grid.call_args_list) == 1
+ with capture_notifications() as notifications:
+ provider_collected_node.set_privacy('private', auth=auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
@mock.patch('osf.models.node.Node.check_privacy_change_viability', mock.Mock()) # mocks the storage usage limits
- def test_node_removed_from_collection_on_privacy_change_no_provider(self, auth, collected_node, bookmark_collection, mock_send_grid):
+ def test_node_removed_from_collection_on_privacy_change_no_provider(self, auth, collected_node, bookmark_collection):
associated_collections = collected_node.guids.first().collectionsubmission_set
assert associated_collections.count() == 3
- collected_node.set_privacy('private', auth=auth)
- assert not mock_send_grid.called
+ with capture_notifications() as notifications:
+ collected_node.set_privacy('private', auth=auth)
+ assert not notifications
def test_node_removed_from_collection_on_delete(self, collected_node, bookmark_collection, auth):
associated_collections = collected_node.guids.first().collectionsubmission_set
diff --git a/osf_tests/test_collection_submission.py b/osf_tests/test_collection_submission.py
index 76baa2de752..d2dd906b692 100644
--- a/osf_tests/test_collection_submission.py
+++ b/osf_tests/test_collection_submission.py
@@ -303,7 +303,6 @@ def test_cancel_succeeds(self, node, moderated_collection_submission):
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestUnmoderatedCollectionSubmission:
def test_moderated_submit(self, unmoderated_collection_submission):
@@ -386,7 +385,6 @@ def test_cancel_succeeds(self, node, unmoderated_collection_submission):
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestHybridModeratedCollectionSubmission:
@pytest.mark.parametrize('user_role', UserRoles.excluding(UserRoles.MODERATOR))
diff --git a/osf_tests/test_institution.py b/osf_tests/test_institution.py
index eca6737b6e5..d4442ad8590 100644
--- a/osf_tests/test_institution.py
+++ b/osf_tests/test_institution.py
@@ -4,7 +4,7 @@
import pytest
from addons.osfstorage.models import Region
-from osf.models import Institution, InstitutionStorageRegion
+from osf.models import Institution, InstitutionStorageRegion, NotificationType
from osf_tests.factories import (
AuthUserFactory,
InstitutionFactory,
@@ -12,6 +12,7 @@
RegionFactory,
UserFactory,
)
+from tests.utils import capture_notifications
@pytest.mark.django_db
@@ -109,7 +110,6 @@ def test_non_group_member_doesnt_have_perms(self, institution, user):
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestInstitutionManager:
def test_deactivated_institution_not_in_default_queryset(self):
@@ -146,7 +146,7 @@ def test_reactivate_institution(self):
institution.reactivate()
assert institution.deactivated is None
- def test_send_deactivation_email_call_count(self, mock_send_grid):
+ def test_send_deactivation_email_call_count(self):
institution = InstitutionFactory()
user_1 = UserFactory()
user_1.add_or_update_affiliated_institution(institution)
@@ -154,16 +154,21 @@ def test_send_deactivation_email_call_count(self, mock_send_grid):
user_2 = UserFactory()
user_2.add_or_update_affiliated_institution(institution)
user_2.save()
- institution._send_deactivation_email()
- assert mock_send_grid.call_count == 2
+ with capture_notifications() as notifications:
+ institution._send_deactivation_email()
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.NODE_REQUEST_ACCESS_DENIED
+ assert notifications[1]['type'] == NotificationType.Type.NODE_REQUEST_ACCESS_DENIED
- def test_send_deactivation_email_call_args(self, mock_send_grid):
+ def test_send_deactivation_email_call_args(self):
institution = InstitutionFactory()
user = UserFactory()
user.add_or_update_affiliated_institution(institution)
user.save()
- institution._send_deactivation_email()
- mock_send_grid.assert_called()
+ with capture_notifications() as notifications:
+ institution._send_deactivation_email()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_REQUEST_ACCESS_DENIED
def test_deactivate_inactive_institution_noop(self):
institution = InstitutionFactory()
diff --git a/osf_tests/test_merging_users.py b/osf_tests/test_merging_users.py
index ee13c7bc107..2a7400bd40d 100644
--- a/osf_tests/test_merging_users.py
+++ b/osf_tests/test_merging_users.py
@@ -21,17 +21,15 @@
from importlib import import_module
from django.conf import settings as django_conf_settings
from osf.models import UserSessionMap
-from tests.utils import run_celery_tasks
+from tests.utils import run_celery_tasks, capture_notifications
from waffle.testutils import override_flag
from osf.features import ENABLE_GV
-from conftest import start_mock_send_grid
SessionStore = import_module(django_conf_settings.SESSION_ENGINE).SessionStore
@pytest.mark.enable_implicit_clean
@pytest.mark.enable_bookmark_creation
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class TestUserMerging(OsfTestCase):
def setUp(self):
@@ -39,7 +37,6 @@ def setUp(self):
self.user = UserFactory()
with self.context:
handlers.celery_before_request()
- self.mock_send_grid = start_mock_send_grid(self)
def _add_unconfirmed_user(self):
self.unconfirmed = UnconfirmedUserFactory()
@@ -294,7 +291,8 @@ def test_merge_doesnt_send_signal(self):
#Explictly reconnect signal as it is disconnected by default for test
contributor_added.connect(notify_added_contributor)
other_user = UserFactory()
- with override_flag(ENABLE_GV, active=True):
- self.user.merge_user(other_user)
+ with capture_notifications() as notifications:
+ with override_flag(ENABLE_GV, active=True):
+ self.user.merge_user(other_user)
+ assert not notifications
assert other_user.merged_by._id == self.user._id
- assert self.mock_send_grid.called is False
diff --git a/osf_tests/test_queued_mail.py b/osf_tests/test_queued_mail.py
deleted file mode 100644
index 395b770a61d..00000000000
--- a/osf_tests/test_queued_mail.py
+++ /dev/null
@@ -1,155 +0,0 @@
-# Ported from tests.test_mails
-import datetime as dt
-
-
-import pytest
-from django.utils import timezone
-from waffle.testutils import override_switch
-
-from .factories import UserFactory, NodeFactory
-
-from osf.features import DISABLE_ENGAGEMENT_EMAILS
-from osf.models.queued_mail import (
- queue_mail, WELCOME_OSF4M,
- NO_LOGIN, NO_ADDON, NEW_PUBLIC_PROJECT
-)
-from website.mails import mails
-from website.settings import DOMAIN
-
-@pytest.fixture()
-def user():
- return UserFactory(is_registered=True)
-
-@pytest.mark.django_db
-class TestQueuedMail:
-
- def queue_mail(self, mail, user, send_at=None, **kwargs):
- mail = queue_mail(
- to_addr=user.username if user else user.username,
- send_at=send_at or timezone.now(),
- user=user,
- mail=mail,
- fullname=user.fullname if user else user.username,
- **kwargs
- )
- return mail
-
- def test_no_login_presend_for_active_user(self, user):
- mail = self.queue_mail(mail=NO_LOGIN, user=user)
- user.date_last_login = timezone.now() + dt.timedelta(seconds=10)
- user.save()
- assert mail.send_mail() is False
-
- def test_no_login_presend_for_inactive_user(self, user):
- mail = self.queue_mail(mail=NO_LOGIN, user=user)
- user.date_last_login = timezone.now() - dt.timedelta(weeks=10)
- user.save()
- assert timezone.now() - dt.timedelta(days=1) > user.date_last_login
- assert bool(mail.send_mail()) is True
-
- def test_no_addon_presend(self, user):
- mail = self.queue_mail(mail=NO_ADDON, user=user)
- assert mail.send_mail() is True
-
- def test_new_public_project_presend_for_no_project(self, user):
- mail = self.queue_mail(
- mail=NEW_PUBLIC_PROJECT,
- user=user,
- project_title='Oh noes',
- nid='',
- )
- assert bool(mail.send_mail()) is False
-
- def test_new_public_project_presend_success(self, user):
- node = NodeFactory(is_public=True)
- mail = self.queue_mail(
- mail=NEW_PUBLIC_PROJECT,
- user=user,
- project_title='Oh yass',
- nid=node._id
- )
- assert bool(mail.send_mail()) is True
-
- def test_welcome_osf4m_presend(self, user):
- user.date_last_login = timezone.now() - dt.timedelta(days=13)
- user.save()
- mail = self.queue_mail(
- mail=WELCOME_OSF4M,
- user=user,
- conference='Buttjamz conference',
- fid='',
- domain=DOMAIN
- )
- assert bool(mail.send_mail()) is True
- assert mail.data['downloads'] == 0
-
- def test_finding_other_emails_sent_to_user(self, user):
- mail = self.queue_mail(
- user=user,
- mail=NO_ADDON,
- )
- assert len(mail.find_sent_of_same_type_and_user()) == 0
- mail.send_mail()
- assert len(mail.find_sent_of_same_type_and_user()) == 1
-
- def test_user_is_active(self, user):
- mail = self.queue_mail(
- user=user,
- mail=NO_ADDON,
- )
- assert bool(mail.send_mail()) is True
-
- def test_user_is_not_active_no_password(self):
- user = UserFactory.build()
- user.set_unusable_password()
- user.save()
- mail = self.queue_mail(
- user=user,
- mail=NO_ADDON,
- )
- assert mail.send_mail() is False
-
- def test_user_is_not_active_not_registered(self):
- user = UserFactory(is_registered=False)
- mail = self.queue_mail(
- user=user,
- mail=NO_ADDON,
- )
- assert mail.send_mail() is False
-
- def test_user_is_not_active_is_merged(self):
- other_user = UserFactory()
- user = UserFactory(merged_by=other_user)
- mail = self.queue_mail(
- user=user,
- mail=NO_ADDON,
- )
- assert mail.send_mail() is False
-
- def test_user_is_not_active_is_disabled(self):
- user = UserFactory(date_disabled=timezone.now())
- mail = self.queue_mail(
- user=user,
- mail=NO_ADDON,
- )
- assert mail.send_mail() is False
-
- def test_user_is_not_active_is_not_confirmed(self):
- user = UserFactory(date_confirmed=None)
- mail = self.queue_mail(
- user=user,
- mail=NO_ADDON,
- )
- assert mail.send_mail() is False
-
- def test_disabled_queued_emails_not_sent_if_switch_active(self, user):
- with override_switch(DISABLE_ENGAGEMENT_EMAILS, active=True):
- assert self.queue_mail(mail=NO_ADDON, user=user) is False
- assert self.queue_mail(mail=NO_LOGIN, user=user) is False
- assert self.queue_mail(mail=WELCOME_OSF4M, user=user) is False
- assert self.queue_mail(mail=NEW_PUBLIC_PROJECT, user=user) is False
-
- def test_disabled_triggered_emails_not_sent_if_switch_active(self):
- with override_switch(DISABLE_ENGAGEMENT_EMAILS, active=True):
- assert mails.send_mail(to_addr='', mail=mails.WELCOME) is False
- assert mails.send_mail(to_addr='', mail=mails.WELCOME_OSF4I) is False
diff --git a/osf_tests/test_sanctions.py b/osf_tests/test_sanctions.py
index de4161ced4a..54530dc1324 100644
--- a/osf_tests/test_sanctions.py
+++ b/osf_tests/test_sanctions.py
@@ -135,7 +135,6 @@ def registration(self, request, contributor):
registration.save()
return registration
- @mock.patch('website.mails.settings.USE_EMAIL', False)
@pytest.mark.parametrize('reviews_workflow', [None, 'pre-moderation'])
@pytest.mark.parametrize('branched_from_node', [True, False])
def test_render_admin_emails(self, registration, reviews_workflow, branched_from_node):
@@ -149,7 +148,6 @@ def test_render_admin_emails(self, registration, reviews_workflow, branched_from
registration.sanction.ask([(registration.creator, registration)])
assert True # mail rendered successfully
- @mock.patch('website.mails.settings.USE_EMAIL', False)
@pytest.mark.parametrize('reviews_workflow', [None, 'pre-moderation'])
@pytest.mark.parametrize('branched_from_node', [True, False])
def test_render_non_admin_emails(
diff --git a/osf_tests/test_schema_responses.py b/osf_tests/test_schema_responses.py
index 40965c7cf31..3b3af1458cf 100644
--- a/osf_tests/test_schema_responses.py
+++ b/osf_tests/test_schema_responses.py
@@ -4,11 +4,12 @@
from api.providers.workflows import Workflows
from framework.exceptions import PermissionsError
from osf.exceptions import PreviousSchemaResponseError, SchemaResponseStateError, SchemaResponseUpdateError
-from osf.models import RegistrationSchema, RegistrationSchemaBlock, SchemaResponseBlock
+from osf.models import RegistrationSchema, RegistrationSchemaBlock, SchemaResponseBlock, NotificationType
from osf.models import schema_response # import module for mocking purposes
from osf.utils.workflows import ApprovalStates, SchemaResponseTriggers
from osf_tests.factories import AuthUserFactory, ProjectFactory, RegistrationFactory, RegistrationProviderFactory
from osf_tests.utils import get_default_test_schema, _ensure_subscriptions
+from tests.utils import capture_notifications
from website.notifications import emails
@@ -95,7 +96,6 @@ def revised_response(initial_response):
@pytest.mark.enable_bookmark_creation
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestCreateSchemaResponse():
def test_create_initial_response_sets_attributes(self, registration, schema):
@@ -142,11 +142,12 @@ def test_create_initial_response_assigns_default_values(self, registration):
for block in response.response_blocks.all():
assert block.response == DEFAULT_SCHEMA_RESPONSE_VALUES[block.schema_key]
- def test_create_initial_response_does_not_notify(self, registration, admin_user, mock_send_grid):
- schema_response.SchemaResponse.create_initial_response(
- parent=registration, initiator=admin_user
- )
- assert not mock_send_grid.called
+ def test_create_initial_response_does_not_notify(self, registration, admin_user):
+ with capture_notifications() as notifications:
+ schema_response.SchemaResponse.create_initial_response(
+ parent=registration, initiator=admin_user
+ )
+ assert not notifications
def test_create_initial_response_fails_if_no_schema_and_no_parent_schema(self, registration):
registration.registered_schema.clear()
@@ -252,13 +253,14 @@ def test_create_from_previous_response(self, registration, initial_response):
assert set(revised_response.response_blocks.all()) == set(initial_response.response_blocks.all())
def test_create_from_previous_response_notification(
- self, initial_response, admin_user, notification_recipients, mock_send_grid):
-
- schema_response.SchemaResponse.create_from_previous_response(
- previous_response=initial_response, initiator=admin_user
- )
+ self, initial_response, admin_user, notification_recipients):
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ schema_response.SchemaResponse.create_from_previous_response(
+ previous_response=initial_response, initiator=admin_user
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
@pytest.mark.parametrize(
'invalid_response_state',
@@ -542,7 +544,6 @@ def test_delete_fails_if_state_is_invalid(self, invalid_response_state, initial_
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestUnmoderatedSchemaResponseApprovalFlows():
def test_submit_response_adds_pending_approvers(
@@ -574,23 +575,25 @@ def test_submit_response_writes_schema_response_action(self, initial_response, a
assert new_action.trigger == SchemaResponseTriggers.SUBMIT.db_name
def test_submit_response_notification(
- self, revised_response, admin_user, notification_recipients, mock_send_grid):
+ self, revised_response, admin_user, notification_recipients):
revised_response.approvals_state_machine.set_state(ApprovalStates.IN_PROGRESS)
revised_response.update_responses({'q1': 'must change one response or can\'t submit'})
revised_response.revision_justification = 'has for valid revision_justification for submission'
revised_response.save()
- revised_response.submit(user=admin_user, required_approvers=[admin_user])
-
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ revised_response.submit(user=admin_user, required_approvers=[admin_user])
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
- def test_no_submit_notification_on_initial_response(self, initial_response, admin_user, mock_send_grid):
+ def test_no_submit_notification_on_initial_response(self, initial_response, admin_user):
initial_response.approvals_state_machine.set_state(ApprovalStates.IN_PROGRESS)
initial_response.update_responses({'q1': 'must change one response or can\'t submit'})
initial_response.revision_justification = 'has for valid revision_justification for submission'
initial_response.save()
- initial_response.submit(user=admin_user, required_approvers=[admin_user])
- assert not mock_send_grid.called
+ with capture_notifications() as notifications:
+ initial_response.submit(user=admin_user, required_approvers=[admin_user])
+ assert not notifications
def test_submit_response_requires_user(self, initial_response, admin_user):
initial_response.approvals_state_machine.set_state(ApprovalStates.IN_PROGRESS)
@@ -672,23 +675,26 @@ def test_approve_response_writes_schema_response_action(
).count() == 2
def test_approve_response_notification(
- self, revised_response, admin_user, alternate_user, notification_recipients, mock_send_grid):
+ self, revised_response, admin_user, alternate_user, notification_recipients):
revised_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
revised_response.save()
revised_response.pending_approvers.add(admin_user, alternate_user)
- mock_send_grid.reset_mock()
- revised_response.approve(user=admin_user)
- assert not mock_send_grid.called # Should only send email on final approval
- revised_response.approve(user=alternate_user)
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ revised_response.approve(user=admin_user)
+ assert not notifications # Should only send email on final approval
+ with capture_notifications() as notifications:
+ revised_response.approve(user=alternate_user)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
- def test_no_approve_notification_on_initial_response(self, initial_response, admin_user, mock_send_grid):
+ def test_no_approve_notification_on_initial_response(self, initial_response, admin_user):
initial_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
initial_response.save()
initial_response.pending_approvers.add(admin_user)
- initial_response.approve(user=admin_user)
- assert not mock_send_grid.called
+ with capture_notifications() as notifications:
+ initial_response.approve(user=admin_user)
+ assert not notifications
def test_approve_response_requires_user(self, initial_response, admin_user):
initial_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
@@ -739,22 +745,24 @@ def test_reject_response_writes_schema_response_action(self, initial_response, a
assert new_action.trigger == SchemaResponseTriggers.ADMIN_REJECT.db_name
def test_reject_response_notification(
- self, revised_response, admin_user, notification_recipients, mock_send_grid):
+ self, revised_response, admin_user, notification_recipients):
revised_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
revised_response.save()
revised_response.pending_approvers.add(admin_user)
- revised_response.reject(user=admin_user)
-
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ revised_response.reject(user=admin_user)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
- def test_no_reject_notification_on_initial_response(self, initial_response, admin_user, mock_send_grid):
+ def test_no_reject_notification_on_initial_response(self, initial_response, admin_user):
initial_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
initial_response.save()
initial_response.pending_approvers.add(admin_user)
- initial_response.reject(user=admin_user)
- assert not mock_send_grid.called
+ with capture_notifications() as notifications:
+ initial_response.reject(user=admin_user)
+ assert not notifications
def test_reject_response_requires_user(self, initial_response, admin_user):
initial_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
@@ -801,7 +809,6 @@ def test_internal_accept_clears_pending_approvers(self, initial_response, admin_
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestModeratedSchemaResponseApprovalFlows():
@pytest.fixture
@@ -848,13 +855,15 @@ def test_schema_response_action_to_state_following_moderated_approve_is_pending_
assert new_action.to_state == ApprovalStates.PENDING_MODERATION.db_name
assert new_action.trigger == SchemaResponseTriggers.APPROVE.db_name
- def test_accept_notification_sent_on_admin_approval(self, revised_response, admin_user, mock_send_grid):
+ def test_accept_notification_sent_on_admin_approval(self, revised_response, admin_user):
revised_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
revised_response.save()
revised_response.pending_approvers.add(admin_user)
- revised_response.approve(user=admin_user)
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ revised_response.approve(user=admin_user)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
def test_moderators_notified_on_admin_approval(self, revised_response, admin_user, moderator):
revised_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
@@ -900,21 +909,23 @@ def test_moderator_accept_writes_schema_response_action(self, initial_response,
assert new_action.trigger == SchemaResponseTriggers.ACCEPT.db_name
def test_moderator_accept_notification(
- self, revised_response, moderator, notification_recipients, mock_send_grid):
+ self, revised_response, moderator, notification_recipients):
revised_response.approvals_state_machine.set_state(ApprovalStates.PENDING_MODERATION)
revised_response.save()
- revised_response.accept(user=moderator)
-
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ revised_response.accept(user=moderator)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
def test_no_moderator_accept_notification_on_initial_response(
- self, initial_response, moderator, mock_send_grid):
+ self, initial_response, moderator):
initial_response.approvals_state_machine.set_state(ApprovalStates.PENDING_MODERATION)
initial_response.save()
- initial_response.accept(user=moderator)
- assert not mock_send_grid.called
+ with capture_notifications() as notifications:
+ initial_response.accept(user=moderator)
+ assert not notifications
def test_moderator_reject(self, initial_response, admin_user, moderator):
initial_response.approvals_state_machine.set_state(ApprovalStates.PENDING_MODERATION)
@@ -938,21 +949,23 @@ def test_moderator_reject_writes_schema_response_action(
assert new_action.trigger == SchemaResponseTriggers.MODERATOR_REJECT.db_name
def test_moderator_reject_notification(
- self, revised_response, moderator, notification_recipients, mock_send_grid):
+ self, revised_response, moderator, notification_recipients):
revised_response.approvals_state_machine.set_state(ApprovalStates.PENDING_MODERATION)
revised_response.save()
- revised_response.reject(user=moderator)
-
- assert mock_send_grid.called
+ with capture_notifications() as notifications:
+ revised_response.reject(user=moderator)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
def test_no_moderator_reject_notification_on_initial_response(
- self, initial_response, moderator, mock_send_grid):
+ self, initial_response, moderator):
initial_response.approvals_state_machine.set_state(ApprovalStates.PENDING_MODERATION)
initial_response.save()
- initial_response.reject(user=moderator)
- assert not mock_send_grid.called
+ with capture_notifications() as notifications:
+ initial_response.reject(user=moderator)
+ assert not notifications
def test_moderator_cannot_submit(self, initial_response, moderator):
initial_response.approvals_state_machine.set_state(ApprovalStates.IN_PROGRESS)
diff --git a/osf_tests/test_user.py b/osf_tests/test_user.py
index 70d3a7ceb17..8a8a6f29d72 100644
--- a/osf_tests/test_user.py
+++ b/osf_tests/test_user.py
@@ -18,6 +18,7 @@
from framework.auth.signals import user_account_merged
from framework.analytics import get_total_activity_count
from framework.exceptions import PermissionsError
+from tests.utils import capture_notifications
from website import settings
from website import filters
from website.views import find_bookmark_collection
@@ -32,7 +33,7 @@
DraftRegistrationContributor,
DraftRegistration,
DraftNode,
- UserSessionMap,
+ UserSessionMap, NotificationType,
)
from osf.models.institution_affiliation import get_user_by_institution_identity
from addons.github.tests.factories import GitHubAccountFactory
@@ -885,8 +886,6 @@ def test_get_user_by_cookie_no_session(self):
assert OSFUser.from_cookie(cookie) is None
-@pytest.mark.usefixtures('mock_send_grid')
-@pytest.mark.usefixtures('mock_notification_send')
class TestChangePassword:
def test_change_password(self, user):
@@ -898,19 +897,23 @@ def test_change_password(self, user):
user.change_password(old_password, new_password, confirm_password)
assert bool(user.check_password(new_password)) is True
- def test_set_password_notify_default(self, mock_notification_send, user):
+ def test_set_password_notify_default(self, user):
old_password = 'password'
- user.set_password(old_password)
- user.save()
- assert mock_notification_send.called is True
+ with capture_notifications() as notifications:
+ user.set_password(old_password)
+ user.save()
- def test_set_password_no_notify(self, mock_notification_send, user):
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PASSWORD_CHANGED
+
+ def test_set_password_no_notify(self, user):
old_password = 'password'
- user.set_password(old_password, notify=False)
- user.save()
- assert mock_notification_send.called is False
+ with capture_notifications() as notifications:
+ user.set_password(old_password, notify=False)
+ user.save()
+ assert not notifications
- def test_check_password_upgrade_hasher_no_notify(self, mock_notification_send, user, settings):
+ def test_check_password_upgrade_hasher_no_notify(self, user, settings):
# NOTE: settings fixture comes from pytest-django.
# changes get reverted after tests run
settings.PASSWORD_HASHERS = (
@@ -919,9 +922,10 @@ def test_check_password_upgrade_hasher_no_notify(self, mock_notification_send, u
)
raw_password = 'password'
user.password = 'sha1$lNb72DKWDv6P$e6ae16dada9303ae0084e14fc96659da4332bb05'
- user.check_password(raw_password)
+ with capture_notifications() as notifications:
+ user.check_password(raw_password)
+ assert not notifications
assert user.password.startswith('md5$')
- assert mock_notification_send.called is False
def test_change_password_invalid(self, old_password=None, new_password=None, confirm_password=None,
error_message='Old password is invalid'):
diff --git a/scripts/tests/test_deactivate_requested_accounts.py b/scripts/tests/test_deactivate_requested_accounts.py
index 07e43f74bf2..1b0da9a89b5 100644
--- a/scripts/tests/test_deactivate_requested_accounts.py
+++ b/scripts/tests/test_deactivate_requested_accounts.py
@@ -1,12 +1,13 @@
import pytest
+from osf.models import NotificationType
from osf_tests.factories import ProjectFactory, AuthUserFactory
from osf.management.commands.deactivate_requested_accounts import deactivate_requested_accounts
+from tests.utils import capture_notifications
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
class TestDeactivateRequestedAccount:
@pytest.fixture()
@@ -24,21 +25,25 @@ def user_requested_deactivation_with_node(self):
user.save()
return user
- def test_deactivate_user_with_no_content(self, mock_send_grid, user_requested_deactivation):
+ def test_deactivate_user_with_no_content(self, user_requested_deactivation):
- deactivate_requested_accounts(dry_run=False)
+ with capture_notifications() as notifications:
+ deactivate_requested_accounts(dry_run=False)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.DESK_REQUEST_DEACTIVATION
user_requested_deactivation.reload()
assert user_requested_deactivation.requested_deactivation
assert user_requested_deactivation.contacted_deactivation
assert user_requested_deactivation.is_disabled
- mock_send_grid.assert_called()
- def test_deactivate_user_with_content(self, mock_send_grid, user_requested_deactivation_with_node):
+ def test_deactivate_user_with_content(self, user_requested_deactivation_with_node):
- deactivate_requested_accounts(dry_run=False)
+ with capture_notifications() as notifications:
+ deactivate_requested_accounts(dry_run=False)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.DESK_REQUEST_DEACTIVATION
user_requested_deactivation_with_node.reload()
assert user_requested_deactivation_with_node.requested_deactivation
assert not user_requested_deactivation_with_node.is_disabled
- mock_send_grid.assert_called()
diff --git a/scripts/tests/test_send_queued_mails.py b/scripts/tests/test_send_queued_mails.py
deleted file mode 100644
index 2815b85f5d9..00000000000
--- a/scripts/tests/test_send_queued_mails.py
+++ /dev/null
@@ -1,84 +0,0 @@
-from unittest import mock
-from datetime import timedelta
-
-from django.utils import timezone
-
-from tests.base import OsfTestCase
-from osf_tests.factories import UserFactory
-from osf.models.queued_mail import QueuedMail, queue_mail, NO_ADDON, NO_LOGIN_TYPE
-
-from scripts.send_queued_mails import main, pop_and_verify_mails_for_each_user, find_queued_mails_ready_to_be_sent
-from website import settings
-
-@mock.patch('website.mails.settings.USE_EMAIL', True)
-@mock.patch('website.mails.settings.USE_CELERY', False)
-class TestSendQueuedMails(OsfTestCase):
-
- def setUp(self):
- super().setUp()
- self.user = UserFactory()
- self.user.date_last_login = timezone.now()
- self.user.osf_mailing_lists[settings.OSF_HELP_LIST] = True
- self.user.save()
-
- from conftest import start_mock_send_grid
- self.mock_send_grid = start_mock_send_grid(self)
-
-
- def queue_mail(self, mail_type=NO_ADDON, user=None, send_at=None):
- return queue_mail(
- to_addr=user.username if user else self.user.username,
- mail=mail_type,
- send_at=send_at or timezone.now(),
- user=user if user else self.user,
- fullname=user.fullname if user else self.user.fullname,
- )
-
- def test_queue_addon_mail(self):
- self.queue_mail()
- main(dry_run=False)
- assert self.mock_send_grid.called
-
- def test_no_two_emails_to_same_person(self):
- user = UserFactory()
- user.osf_mailing_lists[settings.OSF_HELP_LIST] = True
- user.save()
- self.queue_mail(user=user)
- self.queue_mail(user=user)
- main(dry_run=False)
- assert self.mock_send_grid.call_count == 1
-
- def test_pop_and_verify_mails_for_each_user(self):
- user_with_email_sent = UserFactory()
- user_with_multiple_emails = UserFactory()
- user_with_no_emails_sent = UserFactory()
- time = timezone.now() - timedelta(days=1)
- mail_sent = QueuedMail(
- user=user_with_email_sent,
- send_at=time,
- to_addr=user_with_email_sent.username,
- email_type=NO_LOGIN_TYPE
- )
- mail_sent.save()
- mail1 = self.queue_mail(user=user_with_email_sent)
- mail2 = self.queue_mail(user=user_with_multiple_emails)
- mail3 = self.queue_mail(user=user_with_multiple_emails)
- mail4 = self.queue_mail(user=user_with_no_emails_sent)
- user_queue = {
- user_with_email_sent._id: [mail1],
- user_with_multiple_emails._id: [mail2, mail3],
- user_with_no_emails_sent._id: [mail4]
- }
- mails_ = list(pop_and_verify_mails_for_each_user(user_queue))
- assert len(mails_) == 2
- user_mails = [mail.user for mail in mails_]
- assert not (user_with_email_sent in user_mails)
- assert user_with_multiple_emails in user_mails
- assert user_with_no_emails_sent in user_mails
-
- def test_find_queued_mails_ready_to_be_sent(self):
- mail1 = self.queue_mail()
- mail2 = self.queue_mail(send_at=timezone.now()+timedelta(days=1))
- mail3 = self.queue_mail(send_at=timezone.now())
- mails = find_queued_mails_ready_to_be_sent()
- assert mails.count() == 2
diff --git a/tests/base.py b/tests/base.py
index e1024f8e266..b308b9dca17 100644
--- a/tests/base.py
+++ b/tests/base.py
@@ -175,9 +175,6 @@ class ApiTestCase(DbTestCase, ApiAppTestCase, SearchTestCase):
API application. Note: superclasses must call `super` in order for all setup and
teardown methods to be called correctly.
"""
- def setUp(self):
- super().setUp()
- settings.USE_EMAIL = False
class ApiAddonTestCase(ApiTestCase):
"""Base `TestCase` for tests that require interaction with addons.
diff --git a/tests/framework_tests/test_email.py b/tests/framework_tests/test_email.py
deleted file mode 100644
index c19596b7ed8..00000000000
--- a/tests/framework_tests/test_email.py
+++ /dev/null
@@ -1,108 +0,0 @@
-import unittest
-import smtplib
-
-from unittest import mock
-from unittest.mock import MagicMock
-
-import sendgrid
-from sendgrid import SendGridAPIClient
-from sendgrid.helpers.mail import Mail, Email, To, Category
-
-from framework.email.tasks import send_email, _send_with_sendgrid
-from website import settings
-from tests.base import fake
-from osf_tests.factories import fake_email
-
-# Check if local mail server is running
-SERVER_RUNNING = True
-try:
- s = smtplib.SMTP(settings.MAIL_SERVER)
- s.quit()
-except Exception as err:
- SERVER_RUNNING = False
-
-
-class TestEmail(unittest.TestCase):
-
- @unittest.skipIf(not SERVER_RUNNING,
- "Mailserver isn't running. Run \"invoke mailserver\".")
- @unittest.skipIf(not settings.USE_EMAIL,
- 'settings.USE_EMAIL is False')
- def test_sending_email(self):
- assert send_email('foo@bar.com', 'baz@quux.com', subject='no subject',
-                          message='<h1>Greetings!</h1>', ttls=False, login=False)
-
- def setUp(self):
- settings.SENDGRID_WHITELIST_MODE = False
-
- def tearDown(self):
- settings.SENDGRID_WHITELIST_MODE = True
-
- @mock.patch(f'{_send_with_sendgrid.__module__}.Mail', autospec=True)
- def test_send_with_sendgrid_success(self, mock_mail: MagicMock):
- mock_client = mock.MagicMock(autospec=SendGridAPIClient)
- mock_client.send.return_value = mock.Mock(status_code=200, body='success')
- from_addr, to_addr = fake_email(), fake_email()
- category1, category2 = fake.word(), fake.word()
- subject = fake.bs()
- message = fake.text()
- ret = _send_with_sendgrid(
- from_addr=from_addr,
- to_addr=to_addr,
- subject=subject,
- message=message,
- client=mock_client,
- categories=(category1, category2)
- )
- assert ret
-
- # Check Mail object arguments
- mock_mail.assert_called_once()
- kwargs = mock_mail.call_args.kwargs
- assert kwargs['from_email'].email == from_addr
- assert kwargs['subject'] == subject
- assert kwargs['html_content'] == message
-
- mock_mail.return_value.add_personalization.assert_called_once()
-
- # Capture the categories added via add_category
- mock_mail.return_value.add_category.assert_called_once()
- added_categories = mock_mail.return_value.add_category.call_args.args[0]
- assert len(added_categories) == 2
- assert isinstance(added_categories[0], Category)
- assert isinstance(added_categories[1], Category)
- assert added_categories[0].get() == category1
- assert added_categories[1].get() == category2
-
- mock_client.send.assert_called_once_with(mock_mail.return_value)
-
- @mock.patch(f'{_send_with_sendgrid.__module__}.sentry.log_message', autospec=True)
- @mock.patch(f'{_send_with_sendgrid.__module__}.Mail', autospec=True)
- def test_send_with_sendgrid_failure_returns_false(self, mock_mail, sentry_mock):
- mock_client = mock.MagicMock()
- mock_client.send.return_value = mock.Mock(status_code=400, body='failed')
- from_addr, to_addr = fake_email(), fake_email()
- subject = fake.bs()
- message = fake.text()
- ret = _send_with_sendgrid(
- from_addr=from_addr,
- to_addr=to_addr,
- subject=subject,
- message=message,
- client=mock_client
- )
- assert not ret
- sentry_mock.assert_called_once()
-
- # Check Mail object arguments
- mock_mail.assert_called_once()
- kwargs = mock_mail.call_args.kwargs
- assert kwargs['from_email'].email == from_addr
- assert kwargs['subject'] == subject
- assert kwargs['html_content'] == message
-
- mock_client.send.assert_called_once_with(mock_mail.return_value)
-
-
-if __name__ == '__main__':
- unittest.main()
diff --git a/tests/test_auth.py b/tests/test_auth.py
index 52156529d92..4e6ebf2265c 100644
--- a/tests/test_auth.py
+++ b/tests/test_auth.py
@@ -24,8 +24,9 @@
from framework.auth import Auth
from framework.auth.decorators import must_be_logged_in
from framework.sessions import get_session
-from osf.models import OSFUser
+from osf.models import OSFUser, NotificationType
from osf.utils import permissions
+from tests.utils import capture_notifications
from website import mails
from website import settings
from website.project.decorators import (
@@ -36,21 +37,17 @@
must_have_addon, must_be_addon_authorizer,
)
from website.util import api_url_for
-from conftest import start_mock_send_grid, start_mock_notification_send
from tests.test_cas_authentication import generate_external_user_with_resp
logger = logging.getLogger(__name__)
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class TestAuthUtils(OsfTestCase):
def setUp(self):
super().setUp()
- self.mock_send_grid = start_mock_send_grid(self)
- self.start_mock_notification_send = start_mock_notification_send(self)
def test_citation_with_only_fullname(self):
user = UserFactory()
@@ -91,24 +88,25 @@ def test_confirm_email(self):
user.reload()
token = user.get_confirmation_token(user.username)
- res = self.app.get(f'/confirm/{user._id}/{token}')
- res = self.app.resolve_redirect(res)
+ with capture_notifications() as notifications:
+ res = self.app.get(f'/confirm/{user._id}/{token}')
+ res = self.app.resolve_redirect(res)
+ assert not notifications
assert res.status_code == 302
assert 'login?service=' in res.location
user.reload()
- self.mock_send_grid.assert_not_called()
+ with capture_notifications() as notifications:
+ self.app.set_cookie(settings.COOKIE_NAME, user.get_or_create_cookie().decode())
+ res = self.app.get(f'/confirm/{user._id}/{token}')
- self.app.set_cookie(settings.COOKIE_NAME, user.get_or_create_cookie().decode())
- res = self.app.get(f'/confirm/{user._id}/{token}')
-
- res = self.app.resolve_redirect(res)
+ res = self.app.resolve_redirect(res)
assert res.status_code == 302
assert '/' == urlparse(res.location).path
- assert len(self.mock_send_grid.call_args_list) == 0
+ assert not notifications
assert len(get_session()['status']) == 1
def test_get_user_by_id(self):
@@ -172,9 +170,11 @@ def test_successful_external_first_login_without_attributes(self, mock_service_v
def test_password_change_sends_email(self):
user = UserFactory()
- user.set_password('killerqueen')
- user.save()
- assert len(self.start_mock_notification_send.call_args_list) == 1
+ with capture_notifications() as notifications:
+ user.set_password('killerqueen')
+ user.save()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_FORGOT_PASSWORD
@mock.patch('framework.auth.utils.requests.post')
def test_validate_recaptcha_success(self, req_post):
@@ -216,11 +216,15 @@ def test_sign_up_twice_sends_two_confirmation_emails_only(self):
'password': 'brutusisajerk'
}
- self.app.post(url, json=sign_up_data)
- assert len(self.mock_send_grid.call_args_list) == 1
+ with capture_notifications() as notifications:
+ self.app.post(url, json=sign_up_data)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_FORGOT_PASSWORD
- self.app.post(url, json=sign_up_data)
- assert len(self.mock_send_grid.call_args_list) == 2
+ with capture_notifications() as notifications:
+ self.app.post(url, json=sign_up_data)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_FORGOT_PASSWORD
class TestAuthObject(OsfTestCase):
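The hunks above replace the SendGrid call-count assertions with the capture_notifications helper imported from tests.utils. Its actual implementation is not shown in this patch; as a rough, hypothetical sketch of the pattern the assertions rely on (patching NotificationType.emit and recording each call as a dict with 'type' and 'kwargs' keys), a helper along these lines would behave the way the tests expect:

    # Hypothetical sketch only -- the real helper lives in tests/utils.py and may
    # differ. The patched target and the captured dict shape are assumptions
    # inferred from the assertions in the tests above.
    from contextlib import contextmanager
    from unittest import mock

    from osf.models import NotificationType


    @contextmanager
    def capture_notifications_sketch():
        captured = []

        def record_emit(notification_type, *args, **kwargs):
            # Keep the type name and the emit() kwargs so tests can assert on
            # notifications[i]['type'] and notifications[i]['kwargs'].
            captured.append({'type': notification_type.name, 'kwargs': kwargs})

        with mock.patch.object(NotificationType, 'emit', autospec=True,
                               side_effect=record_emit):
            yield captured

Used as "with capture_notifications_sketch() as notifications:", the assertions above on notifications[0]['type'] and each['kwargs']['user'] read straight off the captured dicts; comparing the stored name against NotificationType.Type members works because Type is a str enum.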
diff --git a/tests/test_auth_views.py b/tests/test_auth_views.py
index 4d385b68dd6..7f2b4c4136a 100644
--- a/tests/test_auth_views.py
+++ b/tests/test_auth_views.py
@@ -40,11 +40,9 @@
)
from website import mails, settings
from website.util import api_url_for, web_url_for
-from conftest import start_mock_send_grid
pytestmark = pytest.mark.django_db
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class TestAuthViews(OsfTestCase):
@@ -53,8 +51,6 @@ def setUp(self):
self.user = AuthUserFactory()
self.auth = self.user.auth
- self.mock_send_grid = start_mock_send_grid(self)
-
def test_register_ok(self):
url = api_url_for('register_user')
name, email, password = fake.name(), fake_email(), 'underpressure'
diff --git a/tests/test_misc_views.py b/tests/test_misc_views.py
index 27c2a3e383c..d9c735b97dd 100644
--- a/tests/test_misc_views.py
+++ b/tests/test_misc_views.py
@@ -49,7 +49,6 @@
from website.project.views.node import _should_show_wiki_widget
from website.util import web_url_for
from website.util import rubeus
-from conftest import start_mock_send_grid
from tests.utils import capture_notifications
pytestmark = pytest.mark.django_db
@@ -362,7 +361,6 @@ def test_explore(self):
assert res.status_code == 200
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class TestExternalAuthViews(OsfTestCase):
@@ -385,8 +383,6 @@ def setUp(self):
self.user.save()
self.auth = (self.user.username, password)
- self.mock_send_grid = start_mock_send_grid(self)
-
def test_external_login_email_get_with_invalid_session(self):
url = web_url_for('external_login_email_get')
resp = self.app.get(url)
@@ -410,13 +406,13 @@ def test_external_login_confirm_email_get_create(self):
# TODO: check in qa url encoding
assert not self.user.is_registered
url = self.user.get_confirmation_url(self.user.username, external_id_provider='orcid', destination='dashboard')
- res = self.app.get(url)
+ with capture_notifications() as notifications:
+ res = self.app.get(url)
+ assert not notifications
assert res.status_code == 302, 'redirects to cas login'
assert '/login?service=' in res.location
assert quote_plus('new=true') in res.location
- assert self.mock_send_grid.call_count == 0
-
self.user.reload()
assert self.user.external_identity['orcid'][self.provider_id] == 'VERIFIED'
assert self.user.is_registered
@@ -436,7 +432,6 @@ def test_external_login_confirm_email_get_link(self):
assert '/login?service=' in res.location
assert 'new=true' not in parse.unquote(res.location)
-
self.user.reload()
assert self.user.external_identity['orcid'][self.provider_id] == 'VERIFIED'
assert self.user.is_registered
@@ -446,13 +441,13 @@ def test_external_login_confirm_email_get_duped_id(self):
dupe_user = UserFactory(external_identity={'orcid': {self.provider_id: 'CREATE'}})
assert dupe_user.external_identity == self.user.external_identity
url = self.user.get_confirmation_url(self.user.username, external_id_provider='orcid', destination='dashboard')
- res = self.app.get(url)
+ with capture_notifications() as notifications:
+ res = self.app.get(url)
+ assert not notifications
assert res.status_code == 302, 'redirects to cas login'
assert 'You should be redirected automatically' in str(res.html)
assert '/login?service=' in res.location
- assert self.mock_send_grid.call_count == 0
-
self.user.reload()
dupe_user.reload()
@@ -462,11 +457,11 @@ def test_external_login_confirm_email_get_duped_id(self):
def test_external_login_confirm_email_get_duping_id(self):
dupe_user = UserFactory(external_identity={'orcid': {self.provider_id: 'VERIFIED'}})
url = self.user.get_confirmation_url(self.user.username, external_id_provider='orcid', destination='dashboard')
- res = self.app.get(url)
+ with capture_notifications() as notifications:
+ res = self.app.get(url)
+ assert not notifications
assert res.status_code == 403, 'only allows one user to link an id'
- assert self.mock_send_grid.call_count == 0
-
self.user.reload()
dupe_user.reload()
diff --git a/tests/test_preprints.py b/tests/test_preprints.py
index df1be915bab..9f16edc1e58 100644
--- a/tests/test_preprints.py
+++ b/tests/test_preprints.py
@@ -53,8 +53,6 @@
update_or_enqueue_on_preprint_updated,
should_update_preprint_identifiers
)
-from conftest import start_mock_send_grid
-
SessionStore = import_module(django_conf_settings.SESSION_ENGINE).SessionStore
@@ -971,7 +969,7 @@ def test_confirm_ham_on_public_preprint_stays_public(self, preprint, user):
@mock.patch.object(settings, 'SPAM_SERVICES_ENABLED', True)
@mock.patch.object(settings, 'SPAM_ACCOUNT_SUSPENSION_ENABLED', True)
@pytest.mark.skip('Technically still true, but skipping because mocking is outdated')
- def test_check_spam_on_private_preprint_bans_new_spam_user(self, mock_send_mail, preprint, user):
+ def test_check_spam_on_private_preprint_bans_new_spam_user(self, preprint, user):
preprint.is_public = False
preprint.save()
with mock.patch('osf.models.Preprint._get_spam_content', mock.Mock(return_value='some content!')):
@@ -1001,7 +999,7 @@ def test_check_spam_on_private_preprint_bans_new_spam_user(self, mock_send_mail,
@mock.patch('website.mailchimp_utils.unsubscribe_mailchimp')
@mock.patch.object(settings, 'SPAM_SERVICES_ENABLED', True)
@mock.patch.object(settings, 'SPAM_ACCOUNT_SUSPENSION_ENABLED', True)
- def test_check_spam_on_private_preprint_does_not_ban_existing_user(self, mock_send_mail, preprint, user):
+ def test_check_spam_on_private_preprint_does_not_ban_existing_user(self, preprint, user):
preprint.is_public = False
preprint.save()
with mock.patch('osf.models.Preprint._get_spam_content', mock.Mock(return_value='some content!')):
@@ -1985,7 +1983,6 @@ def test_update_or_enqueue_on_preprint_doi_created(self):
assert should_update_preprint_identifiers(self.private_preprint, {})
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class TestPreprintConfirmationEmails(OsfTestCase):
def setUp(self):
@@ -1996,7 +1993,6 @@ def setUp(self):
self.preprint = PreprintFactory(creator=self.user, project=self.project, provider=PreprintProviderFactory(_id='osf'), is_published=False)
self.preprint.add_contributor(self.write_contrib, permissions=WRITE)
self.preprint_branded = PreprintFactory(creator=self.user, is_published=False)
- self.mock_send_grid = start_mock_send_grid(self)
def test_creator_gets_email(self):
with capture_notifications() as notifications:
diff --git a/tests/test_registrations/test_embargoes.py b/tests/test_registrations/test_embargoes.py
index 992a968f224..7b06887c86b 100644
--- a/tests/test_registrations/test_embargoes.py
+++ b/tests/test_registrations/test_embargoes.py
@@ -29,7 +29,7 @@
from osf.models.sanctions import SanctionCallbackMixin, Embargo
from osf.utils import permissions
from osf.models import Registration, Contributor, OSFUser, SpamStatus
-from conftest import start_mock_notification_send
+from tests.utils import capture_notifications
DUMMY_TOKEN = tokens.encode({
'dummy': 'token'
@@ -1060,7 +1060,6 @@ def test_GET_from_authorized_user_with_registration_rej_token_deleted_node(self)
@pytest.mark.enable_bookmark_creation
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class RegistrationEmbargoViewsTestCase(OsfTestCase):
def setUp(self):
@@ -1101,9 +1100,6 @@ def setUp(self):
}
})
- self.start_mock_notification_send = start_mock_notification_send(self)
-
-
@mock.patch('osf.models.sanctions.EmailApprovableSanction.ask')
def test_embargoed_registration_set_privacy_requests_embargo_termination(self, mock_ask):
# Initiate and approve embargo
@@ -1154,13 +1150,14 @@ def test_embargoed_registration_set_privacy_sends_mail(self):
self.registration.embargo.approve_embargo(OSFUser.load(user_id), approval_token)
self.registration.refresh_from_db()
- self.registration.set_privacy('public', Auth(self.registration.creator))
+ with capture_notifications() as notifications:
+ self.registration.set_privacy('public', Auth(self.registration.creator))
admin_contributors = []
for contributor in self.registration.contributors:
if Contributor.objects.get(user_id=contributor.id, node_id=self.registration.id).permission == permissions.ADMIN:
admin_contributors.append(contributor)
for admin in admin_contributors:
- assert any([each[1]['to_addr'] == admin.username for each in self.start_mock_notification_send.call_args_list])
+ assert any([each['kwargs']['user'] == admin for each in notifications])
@mock.patch('osf.models.sanctions.EmailApprovableSanction.ask')
def test_make_child_embargoed_registration_public_asks_all_admins_in_tree(self, mock_ask):
diff --git a/tests/test_registrations/test_retractions.py b/tests/test_registrations/test_retractions.py
index d3f8cb72abf..5874fad6fa6 100644
--- a/tests/test_registrations/test_retractions.py
+++ b/tests/test_registrations/test_retractions.py
@@ -22,10 +22,9 @@
InvalidSanctionApprovalToken, InvalidSanctionRejectionToken,
NodeStateError,
)
-from osf.models import Contributor, Retraction
+from osf.models import Contributor, Retraction, NotificationType
from osf.utils import permissions
-from conftest import start_mock_notification_send
-
+from tests.utils import capture_notifications
@pytest.mark.enable_bookmark_creation
@@ -753,7 +752,6 @@ def test_POST_retraction_to_subproject_component_returns_HTTPError_BAD_REQUEST(s
@pytest.mark.enable_bookmark_creation
@pytest.mark.usefixtures('mock_gravy_valet_get_verified_links')
-@mock.patch('website.mails.settings.USE_EMAIL', True)
@mock.patch('website.mails.settings.USE_CELERY', False)
class RegistrationRetractionViewsTestCase(OsfTestCase):
def setUp(self):
@@ -767,8 +765,6 @@ def setUp(self):
self.retraction_get_url = self.registration.web_url_for('node_registration_retraction_get')
self.justification = fake.sentence()
- self.start_mock_notification_send = start_mock_notification_send(self)
-
def test_GET_retraction_page_when_pending_retraction_returns_HTTPError_BAD_REQUEST(self):
self.registration.retract_registration(self.user)
self.registration.save()
@@ -802,12 +798,14 @@ def test_POST_retraction_does_not_send_email_to_unregistered_admins(self):
existing_user=unreg
)
self.registration.save()
- self.app.post(
- self.retraction_post_url,
- json={'justification': ''},
- auth=self.user.auth,
- )
- assert self.start_mock_notification_send.call_count == 1
+ with capture_notifications() as notifications:
+ self.app.post(
+ self.retraction_post_url,
+ json={'justification': ''},
+ auth=self.user.auth,
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
def test_POST_pending_embargo_returns_HTTPError_HTTPOK(self):
self.registration.embargo_registration(
@@ -892,12 +890,14 @@ def test_valid_POST_retraction_when_pending_retraction_raises_400(self):
assert res.status_code == 400
def test_valid_POST_calls_send_mail_with_username(self):
- self.app.post(
- self.retraction_post_url,
- json={'justification': ''},
- auth=self.user.auth,
- )
- assert self.start_mock_notification_send.called
+ with capture_notifications() as notifications:
+ self.app.post(
+ self.retraction_post_url,
+ json={'justification': ''},
+ auth=self.user.auth,
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
def test_non_contributor_GET_approval_returns_HTTPError_FORBIDDEN(self):
non_contributor = AuthUserFactory()
diff --git a/tests/test_spam_mixin.py b/tests/test_spam_mixin.py
index 0713d0b4c54..af509272425 100644
--- a/tests/test_spam_mixin.py
+++ b/tests/test_spam_mixin.py
@@ -10,22 +10,24 @@
from tests.base import DbTestCase
from osf_tests.factories import UserFactory, CommentFactory, ProjectFactory, PreprintFactory, RegistrationFactory, AuthUserFactory
-from osf.models import NotableDomain, SpamStatus
+from osf.models import NotableDomain, SpamStatus, NotificationType
+from tests.utils import capture_notifications
from website import settings, mails
@pytest.mark.django_db
-@pytest.mark.usefixtures('mock_send_grid')
-def test_throttled_autoban(mock_send_grid):
+def test_throttled_autoban():
settings.SPAM_THROTTLE_AUTOBAN = True
user = AuthUserFactory()
projects = []
- for _ in range(7):
- proj = ProjectFactory(creator=user)
- proj.flag_spam()
- proj.save()
- projects.append(proj)
- mock_send_grid.assert_called()
+ with capture_notifications() as notifications:
+ for _ in range(7):
+ proj = ProjectFactory(creator=user)
+ proj.flag_spam()
+ proj.save()
+ projects.append(proj)
+ assert len(notifications) == 7
+ assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
user.reload()
assert user.is_disabled
for project in projects:
diff --git a/tests/test_user_profile_view.py b/tests/test_user_profile_view.py
index 20095abfba1..49ff6076d34 100644
--- a/tests/test_user_profile_view.py
+++ b/tests/test_user_profile_view.py
@@ -27,7 +27,6 @@
from website import mailchimp_utils
from website.settings import MAILCHIMP_GENERAL_LIST
from website.util import api_url_for, web_url_for
-from conftest import start_mock_send_grid
@pytest.mark.enable_enqueue_task
@@ -515,8 +514,6 @@ def setUp(self):
self.user.auth = (self.user.username, 'password')
self.user.save()
- self.mock_send_grid = start_mock_send_grid(self)
-
def test_password_change_valid(self,
old_password='password',
new_password='Pa$$w0rd',
diff --git a/website/mails/mails.py b/website/mails/mails.py
index b98b7c37b87..84e82d4632a 100644
--- a/website/mails/mails.py
+++ b/website/mails/mails.py
@@ -176,17 +176,16 @@ def send_mail(
)
logger.debug('Preparing to send...')
- if settings.USE_EMAIL:
- if settings.USE_CELERY and celery:
- logger.debug('Sending via celery...')
- return mailer.apply_async(kwargs=kwargs, link=callback)
- else:
- logger.debug('Sending without celery')
- ret = mailer(**kwargs)
- if callback:
- callback()
-
- return ret
+ if settings.USE_CELERY and celery:
+ logger.debug('Sending via celery...')
+ return mailer.apply_async(kwargs=kwargs, link=callback)
+ else:
+ logger.debug('Sending without celery')
+ ret = mailer(**kwargs)
+ if callback:
+ callback()
+
+ return ret
def get_english_article(word):
diff --git a/website/notifications/tasks.py b/website/notifications/tasks.py
deleted file mode 100644
index 6b7353ccdc0..00000000000
--- a/website/notifications/tasks.py
+++ /dev/null
@@ -1,227 +0,0 @@
-"""
-Tasks for making even transactional emails consolidated.
-"""
-import itertools
-
-from django.db import connection
-
-from framework.celery_tasks import app as celery_app
-from framework.sentry import log_message
-from osf.models import (
- OSFUser,
- AbstractNode,
- AbstractProvider,
- RegistrationProvider,
- CollectionProvider,
- NotificationDigest,
-)
-from osf.registrations.utils import get_registration_provider_submissions_url
-from osf.utils.permissions import ADMIN
-from website import mails, settings
-from website.notifications.utils import NotificationsDict
-
-
-@celery_app.task(name='website.notifications.tasks.send_users_email', max_retries=0)
-def send_users_email(send_type):
- """Send pending emails.
-
- :param send_type
- :return:
- """
- _send_global_and_node_emails(send_type)
- _send_reviews_moderator_emails(send_type)
-
-
-def _send_global_and_node_emails(send_type):
- """
- Called by `send_users_email`. Send all global and node-related notification emails.
- """
- grouped_emails = get_users_emails(send_type)
- for group in grouped_emails:
- user = OSFUser.load(group['user_id'])
- if not user:
- log_message(f"User with id={group['user_id']} not found")
- continue
- info = group['info']
- notification_ids = [message['_id'] for message in info]
- sorted_messages = group_by_node(info)
- if sorted_messages:
- if not user.is_disabled:
- # If there's only one node in digest we can show it's preferences link in the template.
- notification_nodes = list(sorted_messages['children'].keys())
- node = AbstractNode.load(notification_nodes[0]) if len(
- notification_nodes) == 1 else None
- mails.send_mail(
- to_addr=user.username,
- can_change_node_preferences=bool(node),
- node=node,
- mail=mails.DIGEST,
- name=user.fullname,
- message=sorted_messages,
- )
- remove_notifications(email_notification_ids=notification_ids)
-
-
-def _send_reviews_moderator_emails(send_type):
- """
- Called by `send_users_email`. Send all reviews triggered emails.
- """
- grouped_emails = get_moderators_emails(send_type)
- for group in grouped_emails:
- user = OSFUser.load(group['user_id'])
- info = group['info']
- notification_ids = [message['_id'] for message in info]
- provider = AbstractProvider.objects.get(id=group['provider_id'])
- additional_context = dict()
- if isinstance(provider, RegistrationProvider):
- provider_type = 'registration'
- submissions_url = get_registration_provider_submissions_url(provider)
- withdrawals_url = f'{submissions_url}?state=pending_withdraw'
- notification_settings_url = f'{settings.DOMAIN}registries/{provider._id}/moderation/notifications'
- if provider.brand:
- additional_context = {
- 'logo_url': provider.brand.hero_logo_image,
- 'top_bar_color': provider.brand.primary_color
- }
- elif isinstance(provider, CollectionProvider):
- provider_type = 'collection'
- submissions_url = f'{settings.DOMAIN}collections/{provider._id}/moderation/'
- notification_settings_url = f'{settings.DOMAIN}registries/{provider._id}/moderation/notifications'
- if provider.brand:
- additional_context = {
- 'logo_url': provider.brand.hero_logo_image,
- 'top_bar_color': provider.brand.primary_color
- }
- withdrawals_url = ''
- else:
- provider_type = 'preprint'
- submissions_url = f'{settings.DOMAIN}reviews/preprints/{provider._id}',
- withdrawals_url = ''
- notification_settings_url = f'{settings.DOMAIN}reviews/{provider_type}s/{provider._id}/notifications'
-
- if not user.is_disabled:
- mails.send_mail(
- to_addr=user.username,
- mail=mails.DIGEST_REVIEWS_MODERATORS,
- name=user.fullname,
- message=info,
- provider_name=provider.name,
- reviews_submissions_url=submissions_url,
- notification_settings_url=notification_settings_url,
- reviews_withdrawal_url=withdrawals_url,
- is_reviews_moderator_notification=True,
- is_admin=provider.get_group(ADMIN).user_set.filter(id=user.id).exists(),
- provider_type=provider_type,
- **additional_context
- )
- remove_notifications(email_notification_ids=notification_ids)
-
-
-def get_moderators_emails(send_type):
- """Get all emails for reviews moderators that need to be sent, grouped by users AND providers.
- :param send_type: from NOTIFICATION_TYPES, could be "email_digest" or "email_transactional"
- :return Iterable of dicts of the form:
- [
- 'user_id': 'se8ea',
- 'provider_id': '1',
- 'info': [
- {
- 'message': 'Hana Xie submitted Gravity',
- '_id': NotificationDigest._id,
- }
- ],
- ]
- """
- sql = """
- SELECT json_build_object(
- 'user_id', osf_guid._id,
- 'provider_id', nd.provider_id,
- 'info', json_agg(
- json_build_object(
- 'message', nd.message,
- '_id', nd._id
- )
- )
- )
- FROM osf_notificationdigest AS nd
- LEFT JOIN osf_guid ON nd.user_id = osf_guid.object_id
- WHERE send_type = %s AND (event = 'new_pending_submissions' OR event = 'new_pending_withdraw_requests')
- AND osf_guid.content_type_id = (SELECT id FROM django_content_type WHERE model = 'osfuser')
- GROUP BY osf_guid.id, nd.provider_id
- ORDER BY osf_guid.id ASC
- """
-
- with connection.cursor() as cursor:
- cursor.execute(sql, [send_type, ])
- return itertools.chain.from_iterable(cursor.fetchall())
-
-
-def get_users_emails(send_type):
- """Get all emails that need to be sent.
- NOTE: These do not include reviews triggered emails for moderators.
-
- :param send_type: from NOTIFICATION_TYPES
- :return: Iterable of dicts of the form:
- {
- 'user_id': 'se8ea',
- 'info': [{
- 'message': {
- 'message': 'Freddie commented on your project Open Science',
- 'timestamp': datetime object
- },
- 'node_lineage': ['parent._id', 'node._id'],
- '_id': NotificationDigest._id
- }, ...
- }]
- {
- 'user_id': ...
- }
- }
- """
-
- sql = """
- SELECT json_build_object(
- 'user_id', osf_guid._id,
- 'info', json_agg(
- json_build_object(
- 'message', nd.message,
- 'node_lineage', nd.node_lineage,
- '_id', nd._id
- )
- )
- )
- FROM osf_notificationdigest AS nd
- LEFT JOIN osf_guid ON nd.user_id = osf_guid.object_id
- WHERE send_type = %s
- AND event != 'new_pending_submissions'
- AND event != 'new_pending_withdraw_requests'
- AND osf_guid.content_type_id = (SELECT id FROM django_content_type WHERE model = 'osfuser')
- GROUP BY osf_guid.id
- ORDER BY osf_guid.id ASC
- """
-
- with connection.cursor() as cursor:
- cursor.execute(sql, [send_type, ])
- return itertools.chain.from_iterable(cursor.fetchall())
-
-
-def group_by_node(notifications, limit=15):
- """Take list of notifications and group by node.
-
- :param notifications: List of stored email notifications
- :return:
- """
- emails = NotificationsDict()
- for notification in notifications[:15]:
- emails.add_message(notification['node_lineage'], notification['message'])
- return emails
-
-
-def remove_notifications(email_notification_ids=None):
- """Remove sent emails.
-
- :param email_notification_ids:
- :return:
- """
- if email_notification_ids:
- NotificationDigest.objects.filter(_id__in=email_notification_ids).delete()
diff --git a/website/settings/defaults.py b/website/settings/defaults.py
index badafc32862..5d39c01ab90 100644
--- a/website/settings/defaults.py
+++ b/website/settings/defaults.py
@@ -12,6 +12,8 @@
from collections import OrderedDict
import enum
+from celery.schedules import crontab
+
os_env = os.environ
@@ -140,7 +142,6 @@ def parent_dir(path):
# External services
USE_CDN_FOR_CLIENT_LIBS = True
-USE_EMAIL = True
FROM_EMAIL = 'openscienceframework-noreply@osf.io'
ENABLE_TEST_EMAIL = False
# support email
@@ -550,7 +551,6 @@ class CeleryConfig:
# Modules to import when celery launches
imports = (
'framework.celery_tasks',
- 'framework.email.tasks',
'osf.external.chronos.tasks',
'osf.management.commands.data_storage_usage',
'osf.management.commands.registration_schema_metrics',
@@ -598,149 +598,104 @@ class CeleryConfig:
# 'scripts.analytics.upload',
# )
- # celery.schedule will not be installed when running invoke requirements the first time.
- try:
- from celery.schedules import crontab
- except ImportError:
- pass
- else:
- # Setting up a scheduler, essentially replaces an independent cron job
- # Note: these times must be in UTC
- beat_schedule = {
- '5-minute-emails': {
- 'task': 'website.notifications.tasks.send_users_email',
- 'schedule': crontab(minute='*/5'),
- 'args': ('email_transactional',),
- },
- 'daily-emails': {
- 'task': 'website.notifications.tasks.send_users_email',
- 'schedule': crontab(minute=0, hour=5), # Daily at 12 a.m. EST
- 'args': ('email_digest',),
- },
- # 'refresh_addons': { # Handled by GravyValet now
- # 'task': 'scripts.refresh_addon_tokens',
- # 'schedule': crontab(minute=0, hour=7), # Daily 2:00 a.m
- # 'kwargs': {'dry_run': False, 'addons': {
- # 'box': 60, # https://docs.box.com/docs/oauth-20#section-6-using-the-access-and-refresh-tokens
- # 'googledrive': 14, # https://developers.google.com/identity/protocols/OAuth2#expiration
- # 'mendeley': 14 # http://dev.mendeley.com/reference/topics/authorization_overview.html
- # }},
- # },
- 'retract_registrations': {
- 'task': 'scripts.retract_registrations',
- 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
- 'kwargs': {'dry_run': False},
- },
- 'embargo_registrations': {
- 'task': 'scripts.embargo_registrations',
- 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
- 'kwargs': {'dry_run': False},
- },
- 'add_missing_identifiers_to_preprints': {
- 'task': 'scripts.add_missing_identifiers_to_preprints',
- 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
- 'kwargs': {'dry_run': False},
- },
- 'approve_registrations': {
- 'task': 'scripts.approve_registrations',
- 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
- 'kwargs': {'dry_run': False},
- },
- 'approve_embargo_terminations': {
- 'task': 'scripts.approve_embargo_terminations',
- 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
- 'kwargs': {'dry_run': False},
- },
- 'triggered_mails': {
- 'task': 'scripts.triggered_mails',
- 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
- 'kwargs': {'dry_run': False},
- },
- 'clear_expired_sessions': {
- 'task': 'osf.management.commands.clear_expired_sessions',
- 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
- 'kwargs': {'dry_run': False},
- },
- 'send_queued_mails': {
- 'task': 'scripts.send_queued_mails',
- 'schedule': crontab(minute=0, hour=17), # Daily 12 p.m.
- 'kwargs': {'dry_run': False},
- },
- 'new-and-noteworthy': {
- 'task': 'scripts.populate_new_and_noteworthy_projects',
- 'schedule': crontab(minute=0, hour=7, day_of_week=6), # Saturday 2:00 a.m.
- 'kwargs': {'dry_run': False}
- },
- 'registration_schema_metrics': {
- 'task': 'management.commands.registration_schema_metrics',
- 'schedule': crontab(minute=45, hour=7, day_of_month=3), # Third day of month 2:45 a.m.
- 'kwargs': {'dry_run': False}
- },
- 'daily_reporters_go': {
- 'task': 'management.commands.daily_reporters_go',
- 'schedule': crontab(minute=0, hour=6), # Daily 1:00 a.m.
- },
- 'monthly_reporters_go': {
- 'task': 'management.commands.monthly_reporters_go',
- 'schedule': crontab(minute=30, hour=6, day_of_month=2), # Second day of month 1:30 a.m.
- },
- # 'data_storage_usage': {
- # 'task': 'management.commands.data_storage_usage',
- # 'schedule': crontab(day_of_month=1, minute=30, hour=4), # Last of the month at 11:30 p.m.
- # },
- # 'migrate_pagecounter_data': {
- # 'task': 'management.commands.migrate_pagecounter_data',
- # 'schedule': crontab(minute=0, hour=7), # Daily 2:00 a.m.
- # },
- # 'migrate_deleted_date': {
- # 'task': 'management.commands.migrate_deleted_date',
- # 'schedule': crontab(minute=0, hour=3),
- # 'addon_deleted_date': {
- # 'task': 'management.commands.addon_deleted_date',
- # 'schedule': crontab(minute=0, hour=3), # Daily 11:00 p.m.
- # },
- 'generate_sitemap': {
- 'task': 'scripts.generate_sitemap',
- 'schedule': crontab(minute=0, hour=5), # Daily 12:00 a.m.
- },
- 'deactivate_requested_accounts': {
- 'task': 'management.commands.deactivate_requested_accounts',
- 'schedule': crontab(minute=0, hour=5), # Daily 12:00 a.m.
- },
- 'check_crossref_doi': {
- 'task': 'management.commands.check_crossref_dois',
- 'schedule': crontab(minute=0, hour=4), # Daily 11:00 p.m.
- },
- 'update_institution_project_counts': {
- 'task': 'management.commands.update_institution_project_counts',
- 'schedule': crontab(minute=0, hour=9), # Daily 05:00 a.m. EDT
- },
-# 'archive_registrations_on_IA': {
-# 'task': 'osf.management.commands.archive_registrations_on_IA',
-# 'schedule': crontab(minute=0, hour=5), # Daily 4:00 a.m.
-# 'kwargs': {'dry_run': False}
-# },
- 'delete_withdrawn_or_failed_registration_files': {
- 'task': 'management.commands.delete_withdrawn_or_failed_registration_files',
- 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
- 'kwargs': {
- 'dry_run': False,
- 'batch_size_withdrawn': 10,
- 'batch_size_stuck': 10
- }
- },
- 'monitor_registration_bulk_upload_jobs': {
- 'task': 'api.providers.tasks.monitor_registration_bulk_upload_jobs',
- # 'schedule': crontab(hour='*/3'), # Every 3 hours
- 'schedule': crontab(minute='*/5'), # Every 5 minutes for staging server QA test
- 'kwargs': {'dry_run': False}
- },
- 'approve_registration_updates': {
- 'task': 'osf.management.commands.approve_pending_schema_responses',
- 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
- 'kwargs': {'dry_run': False},
- },
- }
+ # Setting up a scheduler, essentially replaces an independent cron job
+ # Note: these times must be in UTC
+ beat_schedule = {
+ 'retract_registrations': {
+ 'task': 'scripts.retract_registrations',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
+ 'kwargs': {'dry_run': False},
+ },
+ 'embargo_registrations': {
+ 'task': 'scripts.embargo_registrations',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
+ 'kwargs': {'dry_run': False},
+ },
+ 'add_missing_identifiers_to_preprints': {
+ 'task': 'scripts.add_missing_identifiers_to_preprints',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
+ 'kwargs': {'dry_run': False},
+ },
+ 'approve_registrations': {
+ 'task': 'scripts.approve_registrations',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
+ 'kwargs': {'dry_run': False},
+ },
+ 'approve_embargo_terminations': {
+ 'task': 'scripts.approve_embargo_terminations',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
+ 'kwargs': {'dry_run': False},
+ },
+ 'triggered_mails': {
+ 'task': 'scripts.triggered_mails',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
+ 'kwargs': {'dry_run': False},
+ },
+ 'clear_expired_sessions': {
+ 'task': 'osf.management.commands.clear_expired_sessions',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
+ 'kwargs': {'dry_run': False},
+ },
+ 'send_queued_mails': {
+ 'task': 'scripts.send_queued_mails',
+ 'schedule': crontab(minute=0, hour=17), # Daily 12 p.m.
+ 'kwargs': {'dry_run': False},
+ },
+ 'new-and-noteworthy': {
+ 'task': 'scripts.populate_new_and_noteworthy_projects',
+ 'schedule': crontab(minute=0, hour=7, day_of_week=6), # Saturday 2:00 a.m.
+ 'kwargs': {'dry_run': False}
+ },
+ 'registration_schema_metrics': {
+ 'task': 'management.commands.registration_schema_metrics',
+ 'schedule': crontab(minute=45, hour=7, day_of_month=3), # Third day of month 2:45 a.m.
+ 'kwargs': {'dry_run': False}
+ },
+ 'daily_reporters_go': {
+ 'task': 'management.commands.daily_reporters_go',
+ 'schedule': crontab(minute=0, hour=6), # Daily 1:00 a.m.
+ },
+ 'monthly_reporters_go': {
+ 'task': 'management.commands.monthly_reporters_go',
+ 'schedule': crontab(minute=30, hour=6, day_of_month=2), # Second day of month 1:30 a.m.
+ },
+ 'generate_sitemap': {
+ 'task': 'scripts.generate_sitemap',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12:00 a.m.
+ },
+ 'deactivate_requested_accounts': {
+ 'task': 'management.commands.deactivate_requested_accounts',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12:00 a.m.
+ },
+ 'check_crossref_doi': {
+ 'task': 'management.commands.check_crossref_dois',
+ 'schedule': crontab(minute=0, hour=4), # Daily 11:00 p.m.
+ },
+ 'update_institution_project_counts': {
+ 'task': 'management.commands.update_institution_project_counts',
+ 'schedule': crontab(minute=0, hour=9), # Daily 05:00 a.m. EDT
+ },
+ 'delete_withdrawn_or_failed_registration_files': {
+ 'task': 'management.commands.delete_withdrawn_or_failed_registration_files',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
+ 'kwargs': {
+ 'dry_run': False,
+ 'batch_size_withdrawn': 10,
+ 'batch_size_stuck': 10
+ }
+ },
+ 'monitor_registration_bulk_upload_jobs': {
+ 'task': 'api.providers.tasks.monitor_registration_bulk_upload_jobs',
+ # 'schedule': crontab(hour='*/3'), # Every 3 hours
+ 'schedule': crontab(minute='*/5'), # Every 5 minutes for staging server QA test
+ 'kwargs': {'dry_run': False}
+ },
+ 'approve_registration_updates': {
+ 'task': 'osf.management.commands.approve_pending_schema_responses',
+ 'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
+ 'kwargs': {'dry_run': False},
+ },
+ }
# Tasks that need metrics and release requirements
# beat_schedule.update({
diff --git a/website/settings/local-ci.py b/website/settings/local-ci.py
index c63fce5a86a..2cab1ca4252 100644
--- a/website/settings/local-ci.py
+++ b/website/settings/local-ci.py
@@ -44,7 +44,6 @@
SEARCH_ENGINE = 'elastic'
-USE_EMAIL = False
USE_CELERY = False
# Email
diff --git a/website/settings/local-dist.py b/website/settings/local-dist.py
index 212b9926f7e..4124d621450 100644
--- a/website/settings/local-dist.py
+++ b/website/settings/local-dist.py
@@ -57,7 +57,6 @@
ELASTIC_TIMEOUT = 10
# Email
-USE_EMAIL = False
MAIL_SERVER = 'localhost:1025' # For local testing
MAIL_USERNAME = 'osf-smtp'
MAIL_PASSWORD = 'CHANGEME'
From 8c6785a7900ef39b4103ecc095b2b957bd096a2c Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Sun, 27 Jul 2025 17:16:15 -0400
Subject: [PATCH 124/336] update tasks for queued_email system
---
.../views/test_draft_registration_list.py | 1 -
framework/auth/campaigns.py | 9 -
notifications.yaml | 27 ++-
.../commands/check_crossref_dois.py | 3 +-
osf/management/commands/email_all_users.py | 4 +-
osf/management/commands/find_spammy_files.py | 3 +-
osf/migrations/0033_delete_queuedmail.py | 16 ++
osf/models/__init__.py | 1 -
osf/models/institution.py | 3 +-
osf/models/notification_type.py | 1 +
osf/models/queued_mail.py | 162 ---------------
scripts/send_queued_mails.py | 66 ------
scripts/stuck_registration_audit.py | 19 +-
scripts/tests/test_triggered_mails.py | 56 -----
scripts/triggered_mails.py | 50 -----
website/app.py | 1 -
website/archiver/utils.py | 9 +-
website/conferences/views.py | 74 +------
website/mails/listeners.py | 44 ----
website/mails/mails.py | 193 ------------------
website/mails/presends.py | 55 -----
website/notifications/constants.py | 13 +-
website/notifications/listeners.py | 24 +--
website/notifications/utils.py | 18 +-
website/reviews/listeners.py | 9 +-
25 files changed, 89 insertions(+), 772 deletions(-)
create mode 100644 osf/migrations/0033_delete_queuedmail.py
delete mode 100644 osf/models/queued_mail.py
delete mode 100644 scripts/send_queued_mails.py
delete mode 100644 scripts/tests/test_triggered_mails.py
delete mode 100644 scripts/triggered_mails.py
delete mode 100644 website/mails/listeners.py
delete mode 100644 website/mails/presends.py
diff --git a/api_tests/draft_registrations/views/test_draft_registration_list.py b/api_tests/draft_registrations/views/test_draft_registration_list.py
index b90493825ee..cc409555e10 100644
--- a/api_tests/draft_registrations/views/test_draft_registration_list.py
+++ b/api_tests/draft_registrations/views/test_draft_registration_list.py
@@ -428,7 +428,6 @@ def test_admin_can_create_draft(
assert draft.has_permission(user, ADMIN) is True
def test_create_no_project_draft_emails_initiator(self, app, user, url_draft_registrations, payload):
- # Intercepting the send_mail call from website.project.views.contributor.notify_added_contributor
with capture_notifications() as notifications:
app.post_json_api(
f'{url_draft_registrations}?embed=branched_from&embed=initiator',
diff --git a/framework/auth/campaigns.py b/framework/auth/campaigns.py
index a47b3cf637b..74445e6c259 100644
--- a/framework/auth/campaigns.py
+++ b/framework/auth/campaigns.py
@@ -91,15 +91,6 @@ def get_campaigns():
}
})
- newest_campaigns.update({
- 'agu_conference_2023': {
- 'system_tag': CampaignSourceTags.AguConference2023.value,
- 'redirect_url': furl(DOMAIN).add(path='dashboard/').url,
- 'confirmation_email_template': mails.CONFIRM_EMAIL_AGU_CONFERENCE_2023,
- 'login_type': 'native',
- }
- })
-
newest_campaigns.update({
'agu_conference': {
'system_tag': CampaignSourceTags.AguConference.value,
diff --git a/notifications.yaml b/notifications.yaml
index 2e8b08ee6f6..fe5186bb8fd 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -4,6 +4,7 @@
notification_types:
#### User Notifications
- name: user_pending_verification_registered
+ subject: 'Received request to be a contributor'
__docs__: This email is sent when a user requests access to a node and has confirm their identity,
`referrer` is sent an email to forward the confirmation link.
object_content_type_model_name: osfuser
@@ -37,10 +38,12 @@ notification_types:
object_content_type_model_name: osfuser
template: 'website/templates/emails/contributor_added_preprint_node_from_osf.html.mako'
- name: user_external_login_link_success
+ subject: 'OSF Verification Success'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/external_confirm_success.html.mako'
- name: user_confirm_email
+ subject: 'Add a new email to your OSF account'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/confirm.html.mako'
@@ -73,25 +76,28 @@ notification_types:
object_content_type_model_name: osfuser
template: 'website/templates/emails/pending_invite.html.mako'
- name: user_forward_invite_registered
+ subject: 'Please forward to ${fullname}'
__docs__: ...
object_content_type_model_name: osfuser
- template: 'website/templates/emails/forward_invite.html.mako'
+ template: 'website/templates/emails/forward_invite_registered.html.mako'
- name: user_forward_invite
+ subject: 'Please forward to ${fullname}'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/forward_invite.html.mako'
- name: user_initial_confirm_email
- __docs__: ...
+ subject: 'OSF Account Verification'
+ __docs__: 'Sign up confirmation emails for OSF, native campaigns and branded campaigns'
object_content_type_model_name: osfuser
template: 'website/templates/emails/initial_confirm.html.mako'
- name: user_export_data_request
__docs__: ...
object_content_type_model_name: osfuser
- template: 'website/templates/emails/initial_confirm.html.mako'
+ template: 'website/templates/emails/.html.mako'
- name: user_request_deactivation
__docs__: ...
object_content_type_model_name: osfuser
- template: 'website/templates/emails/initial_confirm.html.mako'
+ template: 'website/templates/emails/.html.mako'
- name: user_storage_cap_exceeded_announcement
__docs__: ...
object_content_type_model_name: osfuser
@@ -100,19 +106,17 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/duplicate_accounts_sso_osf4i.html.mako'
- - name: user_external_confirm_success_lik
- __docs__: ...
- object_content_type_model_name: osfuser
- template: 'website/templates/emails/external_confirm_success.html.mako'
- name: user_duplicate_accounts_osf4i
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/duplicate_accounts_sso_osf4i.html.mako'
- name: user_forgot_password
+ subject: 'Reset Password'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/forgot_password.html.mako'
- name: user_forgot_password_institution
+ subject: 'Set Password'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/forgot_password_institution.html.mako'
@@ -133,7 +137,8 @@ notification_types:
object_content_type_model_name: osfuser
template: 'website/templates/emails/registration_bulk_upload_failure_duplicates.html.mako'
- name: user_external_login_email_confirm_link
- __docs__: ...
+ subject: 'OSF Account Verification'
+ __docs__: 'Emails for first-time login through external identity providers.'
object_content_type_model_name: osfuser
template: 'website/templates/emails/external_confirm_link.html.mako'
- name: user_external_login_confirm_email_create
@@ -253,12 +258,14 @@ notification_types:
object_content_type_model_name: abstractnode
template: 'website/templates/emails/access_request_submitted.html.mako'
- name: node_fork_failed
+ subject: 'Your fork has failed'
__docs__: This email is sent when a fork fails to be created, this could be due to addons or network outages or
technical errors.
object_content_type_model_name: abstractnode
template: 'website/templates/emails/fork_failed.html.mako'
- name: node_fork_completed
- __docs__: This email is sent when a fork is successfully created,
+ subject: 'Your fork has completed'
+ __docs__: 'This email is sent when a fork is successfully created,'
object_content_type_model_name: abstractnode
template: 'website/templates/emails/fork_completed.html.mako'
- name: node_schema_response_initiated
diff --git a/osf/management/commands/check_crossref_dois.py b/osf/management/commands/check_crossref_dois.py
index bee66856747..bff7ca7e07f 100644
--- a/osf/management/commands/check_crossref_dois.py
+++ b/osf/management/commands/check_crossref_dois.py
@@ -3,6 +3,7 @@
import requests
import django
+from django.core.mail import send_mail
from django.core.management.base import BaseCommand
from django.utils import timezone
django.setup()
@@ -123,7 +124,7 @@ def report_stuck_dois(dry_run=True):
if preprints_with_pending_dois:
guids = ', '.join(preprints_with_pending_dois.values_list('guids___id', flat=True))
if not dry_run:
- mails.send_mail(
+ send_mail(
to_addr=settings.OSF_SUPPORT_EMAIL,
mail=mails.CROSSREF_DOIS_PENDING,
pending_doi_count=preprints_with_pending_dois.count(),
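This hunk switches check_crossref_dois from the project's mails.send_mail wrapper to Django's stock helper. For reference, django.core.mail.send_mail takes subject, message, from_email and recipient_list; a minimal sketch with placeholder values (none of these are taken from the patch):

    from django.core.mail import send_mail

    # Stock Django signature: send_mail(subject, message, from_email,
    # recipient_list, fail_silently=False, ..., html_message=None).
    send_mail(
        'Crossref DOIs still pending',            # subject (placeholder)
        'Some preprint DOIs have not resolved.',  # plain-text body (placeholder)
        'noreply@example.org',                    # from_email (placeholder)
        ['support@example.org'],                  # recipient_list (placeholder)
        fail_silently=False,
    )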
diff --git a/osf/management/commands/email_all_users.py b/osf/management/commands/email_all_users.py
index f5cbd677fb7..774f8b5af2d 100644
--- a/osf/management/commands/email_all_users.py
+++ b/osf/management/commands/email_all_users.py
@@ -6,6 +6,8 @@
import django
+from django.core.mail import send_mail
+
django.setup()
from django.core.management.base import BaseCommand
@@ -44,7 +46,7 @@ def email_all_users(email_template, dry_run=False, ids=None, start_id=0, offset=
for user in active_users.iterator():
logger.info(f'Sending email to {user.id}')
try:
- mails.send_mail(
+ send_mail(
to_addr=user.email,
mail=template,
given_name=user.given_name or user.fullname,
diff --git a/osf/management/commands/find_spammy_files.py b/osf/management/commands/find_spammy_files.py
index 33d25366ea1..7feeab508fa 100644
--- a/osf/management/commands/find_spammy_files.py
+++ b/osf/management/commands/find_spammy_files.py
@@ -3,6 +3,7 @@
from datetime import timedelta
import logging
+from django.core.mail import send_mail
from django.core.management.base import BaseCommand
from django.utils import timezone
@@ -52,7 +53,7 @@ def find_spammy_files(sniff_r=None, n=None, t=None, to_addrs=None):
if ct:
if to_addrs:
for addr in to_addrs:
- mails.send_mail(
+ send_mail(
mail=mails.SPAM_FILES_DETECTED,
to_addr=addr,
ct=ct,
diff --git a/osf/migrations/0033_delete_queuedmail.py b/osf/migrations/0033_delete_queuedmail.py
new file mode 100644
index 00000000000..febe0843df5
--- /dev/null
+++ b/osf/migrations/0033_delete_queuedmail.py
@@ -0,0 +1,16 @@
+# Generated by Django 4.2.13 on 2025-07-27 21:30
+
+from django.db import migrations
+
+
+class Migration(migrations.Migration):
+
+ dependencies = [
+ ('osf', '0032_alter_notificationsubscription_options_and_more'),
+ ]
+
+ operations = [
+ migrations.DeleteModel(
+ name='QueuedMail',
+ ),
+ ]
diff --git a/osf/models/__init__.py b/osf/models/__init__.py
index d09e350adfe..669059d9c4c 100644
--- a/osf/models/__init__.py
+++ b/osf/models/__init__.py
@@ -84,7 +84,6 @@
RegistrationProvider,
WhitelistedSHAREPreprintProvider,
)
-from .queued_mail import QueuedMail
from .registrations import (
DraftRegistration,
DraftRegistrationLog,
diff --git a/osf/models/institution.py b/osf/models/institution.py
index 5dce3c1df36..737233ca7b8 100644
--- a/osf/models/institution.py
+++ b/osf/models/institution.py
@@ -7,6 +7,7 @@
from django.conf import settings as django_conf_settings
from django.contrib.postgres import fields
+from django.core.mail import send_mail
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver
@@ -221,7 +222,7 @@ def _send_deactivation_email(self):
for user in self.get_institution_users():
try:
attempts += 1
- mails.send_mail(
+ send_mail(
to_addr=user.username,
mail=mails.INSTITUTION_DEACTIVATION,
user=user,
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 66e58281db4..34aee20a357 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -29,6 +29,7 @@ class Type(str, Enum):
ADDONS_BOA_JOB_FAILURE = 'addon_boa_job_failure'
ADDONS_BOA_JOB_COMPLETE = 'addon_boa_job_complete'
+ DESK_ARCHIVE_REGISTRATION_STUCK = 'desk_archive_registration_stuck'
DESK_REQUEST_EXPORT = 'desk_request_export'
DESK_REQUEST_DEACTIVATION = 'desk_request_deactivation'
DESK_OSF_SUPPORT_EMAIL = 'desk_osf_support_email'
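The new DESK_ARCHIVE_REGISTRATION_STUCK entry is consumed later in this patch (see the stuck_registration_audit.py hunk below), where the type is looked up by name and emitted. A minimal sketch of that call pattern, with placeholder address and context values:

    from osf.models import NotificationType

    # Sketch of the emit() pattern used later in this patch; the address and
    # the event_context payload below are placeholders, not the audit script's
    # real values.
    stuck_type = NotificationType.objects.get(
        name=NotificationType.Type.DESK_ARCHIVE_REGISTRATION_STUCK
    )
    stuck_type.emit(
        destination_address='support@example.org',
        event_context={'broken_registrations': []},
    )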
diff --git a/osf/models/queued_mail.py b/osf/models/queued_mail.py
deleted file mode 100644
index 844465d5193..00000000000
--- a/osf/models/queued_mail.py
+++ /dev/null
@@ -1,162 +0,0 @@
-import waffle
-
-from django.db import models
-from django.utils import timezone
-
-from osf.utils.fields import NonNaiveDateTimeField
-from website.mails import Mail, send_mail
-from website.mails import presends
-from website import settings as osf_settings
-
-from osf import features
-from .base import BaseModel, ObjectIDMixin
-from osf.utils.datetime_aware_jsonfield import DateTimeAwareJSONField
-
-
-class QueuedMail(ObjectIDMixin, BaseModel):
- user = models.ForeignKey('OSFUser', db_index=True, null=True, on_delete=models.CASCADE)
- to_addr = models.CharField(max_length=255)
- send_at = NonNaiveDateTimeField(db_index=True, null=False)
-
- # string denoting the template, presend to be used. Has to be an index of queue_mail types
- email_type = models.CharField(max_length=255, db_index=True, null=False)
-
- # dictionary with variables used to populate mako template and store information used in presends
- # Example:
- # self.data = {
- # 'nid' : 'ShIpTo',
- # 'fullname': 'Florence Welch',
- #}
- data = DateTimeAwareJSONField(default=dict, blank=True)
- sent_at = NonNaiveDateTimeField(db_index=True, null=True, blank=True)
-
- def __repr__(self):
- if self.sent_at is not None:
-            return '<QueuedMail {} of type {} sent to {} at {}>'.format(
- self._id, self.email_type, self.to_addr, self.sent_at
- )
-        return '<QueuedMail {} of type {} to be sent to {} at {}>'.format(
- self._id, self.email_type, self.to_addr, self.send_at
- )
-
- def send_mail(self):
- """
- Grabs the data from this email, checks for user subscription to help mails,
-
- constructs the mail object and checks presend. Then attempts to send the email
- through send_mail()
- :return: boolean based on whether email was sent.
- """
- mail_struct = queue_mail_types[self.email_type]
- presend = mail_struct['presend'](self)
- mail = Mail(
- mail_struct['template'],
- subject=mail_struct['subject'],
- categories=mail_struct.get('categories', None)
- )
- self.data['osf_url'] = osf_settings.DOMAIN
- if presend and self.user.is_active and self.user.osf_mailing_lists.get(osf_settings.OSF_HELP_LIST):
- send_mail(self.to_addr or self.user.username, mail, **(self.data or {}))
- self.sent_at = timezone.now()
- self.save()
- return True
- else:
- self.__class__.delete(self)
- return False
-
- def find_sent_of_same_type_and_user(self):
- """
- Queries up for all emails of the same type as self, sent to the same user as self.
- Does not look for queue-up emails.
- :return: a list of those emails
- """
- return self.__class__.objects.filter(email_type=self.email_type, user=self.user).exclude(sent_at=None)
-
-
-def queue_mail(to_addr, mail, send_at, user, **context):
- """
- Queue an email to be sent using send_mail after a specified amount
- of time and if the presend returns True. The presend is attached to
- the template under mail.
-
- :param to_addr: the address email is to be sent to
- :param mail: the type of mail. Struct following template:
- { 'presend': function(),
- 'template': mako template name,
- 'subject': mail subject }
- :param send_at: datetime object of when to send mail
- :param user: user object attached to mail
- :param context: IMPORTANT kwargs to be attached to template.
- Sending mail will fail if needed for template kwargs are
- not parameters.
- :return: the QueuedMail object created
- """
- if waffle.switch_is_active(features.DISABLE_ENGAGEMENT_EMAILS) and mail.get('engagement', False):
- return False
- new_mail = QueuedMail(
- user=user,
- to_addr=to_addr,
- send_at=send_at,
- email_type=mail['template'],
- data=context
- )
- new_mail.save()
- return new_mail
-
-
-# Predefined email templates. Structure:
-#EMAIL_TYPE = {
-# 'template': the mako template used for email_type,
-# 'subject': subject used for the actual email,
-# 'categories': categories to attach to the email using Sendgrid's SMTPAPI.
-# 'engagement': Whether this is an engagement email that can be disabled with the disable_engagement_emails waffle flag
-# 'presend': predicate function that determines whether an email should be sent. May also
-# modify mail.data.
-#}
-
-NO_ADDON = {
- 'template': 'no_addon',
- 'subject': 'Link an add-on to your OSF project',
- 'presend': presends.no_addon,
- 'categories': ['engagement', 'engagement-no-addon'],
- 'engagement': True
-}
-
-NO_LOGIN = {
- 'template': 'no_login',
- 'subject': 'What you\'re missing on the OSF',
- 'presend': presends.no_login,
- 'categories': ['engagement', 'engagement-no-login'],
- 'engagement': True
-}
-
-NEW_PUBLIC_PROJECT = {
- 'template': 'new_public_project',
- 'subject': 'Now, public. Next, impact.',
- 'presend': presends.new_public_project,
- 'categories': ['engagement', 'engagement-new-public-project'],
- 'engagement': True
-}
-
-
-WELCOME_OSF4M = {
- 'template': 'welcome_osf4m',
- 'subject': 'The benefits of sharing your presentation',
- 'presend': presends.welcome_osf4m,
- 'categories': ['engagement', 'engagement-welcome-osf4m'],
- 'engagement': True
-}
-
-NO_ADDON_TYPE = 'no_addon'
-NO_LOGIN_TYPE = 'no_login'
-NEW_PUBLIC_PROJECT_TYPE = 'new_public_project'
-WELCOME_OSF4M_TYPE = 'welcome_osf4m'
-
-
-# Used to keep relationship from stored string 'email_type' to the predefined queued_email objects.
-queue_mail_types = {
- NO_ADDON_TYPE: NO_ADDON,
- NO_LOGIN_TYPE: NO_LOGIN,
- NEW_PUBLIC_PROJECT_TYPE: NEW_PUBLIC_PROJECT,
- WELCOME_OSF4M_TYPE: WELCOME_OSF4M
-}
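
For context on the queued-mail API removed above, here is a minimal usage sketch assembled from the deleted code: queue_mail() expects a struct with 'presend', 'template', and 'subject', and the template name must also be registered in queue_mail_types so that QueuedMail.send_mail() can resolve it later. REMINDER, its template name, and the `user` variable are hypothetical stand-ins for illustration, not part of the codebase.

    from datetime import timedelta
    from django.utils import timezone
    from osf.models.queued_mail import queue_mail

    # Hypothetical mail struct; its shape follows the queue_mail() docstring above.
    REMINDER = {
        'template': 'reminder',          # would also need an entry in queue_mail_types
        'subject': 'A reminder from the OSF',
        'presend': lambda email: email.user.is_active,   # gate evaluated at send time
        'categories': ['engagement', 'engagement-reminder'],
        'engagement': True,              # honors the DISABLE_ENGAGEMENT_EMAILS switch
    }

    queue_mail(
        to_addr=user.username,           # `user` is an assumed OSFUser instance
        mail=REMINDER,
        send_at=timezone.now() + timedelta(weeks=1),
        user=user,
        fullname=user.fullname,          # extra kwargs are stored in QueuedMail.data
    )
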
diff --git a/scripts/send_queued_mails.py b/scripts/send_queued_mails.py
deleted file mode 100644
index 7c70c7685a0..00000000000
--- a/scripts/send_queued_mails.py
+++ /dev/null
@@ -1,66 +0,0 @@
-import logging
-
-import django
-from django.db import transaction
-from django.utils import timezone
-django.setup()
-
-from framework.celery_tasks import app as celery_app
-
-from osf.models.queued_mail import QueuedMail
-from website.app import init_app
-from website import settings
-
-from scripts.utils import add_file_logger
-
-
-logger = logging.getLogger(__name__)
-logging.basicConfig(level=logging.INFO)
-
-
-def main(dry_run=True):
- # find all emails to be sent, pops the top one for each user(to obey the once
- # a week requirement), checks to see if one has been sent this week, and if
- # not send the email, otherwise leave it in the queue
-
- user_queue = {}
- for email in find_queued_mails_ready_to_be_sent():
- user_queue.setdefault(email.user._id, []).append(email)
-
- emails_to_be_sent = pop_and_verify_mails_for_each_user(user_queue)
-
- logger.info(f'Emails being sent at {timezone.now().isoformat()}')
-
- for mail in emails_to_be_sent:
- if not dry_run:
- with transaction.atomic():
- try:
- sent_ = mail.send_mail()
- message = f'Email of type {mail.email_type} sent to {mail.to_addr}' if sent_ else \
- f'Email of type {mail.email_type} failed to be sent to {mail.to_addr}'
- logger.info(message)
- except Exception as error:
- logger.error(f'Email of type {mail.email_type} to be sent to {mail.to_addr} caused an ERROR')
- logger.exception(error)
- pass
- else:
- logger.info(f'Email of type {mail.email_type} will be sent to {mail.to_addr}')
-
-
-def find_queued_mails_ready_to_be_sent():
- return QueuedMail.objects.filter(send_at__lt=timezone.now(), sent_at__isnull=True)
-
-def pop_and_verify_mails_for_each_user(user_queue):
- for user_emails in user_queue.values():
- mail = user_emails[0]
- mails_past_week = mail.user.queuedmail_set.filter(sent_at__gt=timezone.now() - settings.WAIT_BETWEEN_MAILS)
- if not mails_past_week.count():
- yield mail
-
-
-@celery_app.task(name='scripts.send_queued_mails')
-def run_main(dry_run=True):
- init_app(routes=False)
- if not dry_run:
- add_file_logger(logger, __file__)
- main(dry_run=dry_run)
diff --git a/scripts/stuck_registration_audit.py b/scripts/stuck_registration_audit.py
index b5445873faf..07a5d9a68c6 100644
--- a/scripts/stuck_registration_audit.py
+++ b/scripts/stuck_registration_audit.py
@@ -15,7 +15,7 @@
from framework.auth import Auth
from framework.celery_tasks import app as celery_app
from osf.management.commands import force_archive as fa
-from osf.models import ArchiveJob, Registration
+from osf.models import ArchiveJob, Registration, NotificationType
from website.archiver import ARCHIVER_INITIATED
from website.settings import ARCHIVE_TIMEOUT_TIMEDELTA, ADDONS_REQUESTED
@@ -97,13 +97,16 @@ def main():
dict_writer.writeheader()
dict_writer.writerows(broken_registrations)
- mails.send_mail(
- mail=mails.ARCHIVE_REGISTRATION_STUCK_DESK,
- to_addr=settings.OSF_SUPPORT_EMAIL,
- broken_registrations=broken_registrations,
- attachment_name=filename,
- attachment_content=output.getvalue(),
- can_change_preferences=False,
+ NotificationType.objects.get(
+ name=NotificationType.Type.DESK_ARCHIVE_REGISTRATION_STUCK
+ ).emit(
+ destination_address=settings.OSF_SUPPORT_EMAIL,
+ event_context={
+ 'broken_registrations': broken_registrations,
+ 'attachment_name': filename,
+            'attachment_content': output.getvalue(),
+ 'can_change_preferences': False
+ }
)
logger.info(f'{len(broken_registrations)} broken registrations found')
diff --git a/scripts/tests/test_triggered_mails.py b/scripts/tests/test_triggered_mails.py
deleted file mode 100644
index b0b94a7f7c5..00000000000
--- a/scripts/tests/test_triggered_mails.py
+++ /dev/null
@@ -1,56 +0,0 @@
-from unittest import mock
-from datetime import timedelta
-
-from django.utils import timezone
-
-from tests.base import OsfTestCase
-from osf_tests.factories import UserFactory
-
-from scripts.triggered_mails import main, find_inactive_users_with_no_inactivity_email_sent_or_queued
-from website import mails
-
-
-class TestTriggeredMails(OsfTestCase):
-
- def setUp(self):
- super().setUp()
- self.user = UserFactory()
- self.user.date_last_login = timezone.now()
- self.user.save()
-
- @mock.patch('website.mails.queue_mail')
- def test_dont_trigger_no_login_mail(self, mock_queue):
- self.user.date_last_login = timezone.now() - timedelta(seconds=6)
- self.user.save()
- main(dry_run=False)
- assert not mock_queue.called
-
- @mock.patch('website.mails.queue_mail')
- def test_trigger_no_login_mail(self, mock_queue):
- self.user.date_last_login = timezone.now() - timedelta(weeks=6)
- self.user.save()
- main(dry_run=False)
- mock_queue.assert_called_with(
- user=mock.ANY,
- fullname=self.user.fullname,
- to_addr=self.user.username,
- mail={'callback': mock.ANY, 'template': 'no_login', 'subject': mock.ANY},
- send_at=mock.ANY,
- )
-
- def test_find_inactive_users_with_no_inactivity_email_sent_or_queued(self):
- user_active = UserFactory(fullname='Spot')
- user_inactive = UserFactory(fullname='Nucha')
- user_already_received_mail = UserFactory(fullname='Pep')
- user_active.date_last_login = timezone.now() - timedelta(seconds=6)
- user_inactive.date_last_login = timezone.now() - timedelta(weeks=6)
- user_already_received_mail.date_last_login = timezone.now() - timedelta(weeks=6)
- user_active.save()
- user_inactive.save()
- user_already_received_mail.save()
- mails.queue_mail(to_addr=user_already_received_mail.username,
- send_at=timezone.now(),
- user=user_already_received_mail,
- mail=mails.NO_LOGIN)
- users = find_inactive_users_with_no_inactivity_email_sent_or_queued()
- assert len(users) == 1
diff --git a/scripts/triggered_mails.py b/scripts/triggered_mails.py
deleted file mode 100644
index 3e0c4fea73a..00000000000
--- a/scripts/triggered_mails.py
+++ /dev/null
@@ -1,50 +0,0 @@
-import logging
-
-from django.db import transaction
-from django.db.models import Q
-from django.utils import timezone
-
-from framework.celery_tasks import app as celery_app
-from osf.models import OSFUser
-from osf.models.queued_mail import NO_LOGIN_TYPE, NO_LOGIN, QueuedMail, queue_mail
-from website.app import init_app
-from website import settings
-
-from scripts.utils import add_file_logger
-
-logger = logging.getLogger(__name__)
-logging.basicConfig(level=logging.INFO)
-
-
-def main(dry_run=True):
- for user in find_inactive_users_with_no_inactivity_email_sent_or_queued():
- if dry_run:
- logger.warning('Dry run mode')
- logger.warning(f'Email of type no_login queued to {user.username}')
- if not dry_run:
- with transaction.atomic():
- queue_mail(
- to_addr=user.username,
- mail=NO_LOGIN,
- send_at=timezone.now(),
- user=user,
- fullname=user.fullname,
- osf_support_email=settings.OSF_SUPPORT_EMAIL,
- )
-
-
-def find_inactive_users_with_no_inactivity_email_sent_or_queued():
- users_sent_ids = QueuedMail.objects.filter(email_type=NO_LOGIN_TYPE).values_list('user__guids___id')
- return (OSFUser.objects
- .filter(
- (Q(date_last_login__lt=timezone.now() - settings.NO_LOGIN_WAIT_TIME) & ~Q(tags__name='osf4m')) |
- Q(date_last_login__lt=timezone.now() - settings.NO_LOGIN_OSF4M_WAIT_TIME, tags__name='osf4m'),
- is_active=True)
- .exclude(guids___id__in=users_sent_ids))
-
-@celery_app.task(name='scripts.triggered_mails')
-def run_main(dry_run=True):
- init_app(routes=False)
- if not dry_run:
- add_file_logger(logger, __file__)
- main(dry_run=dry_run)
diff --git a/website/app.py b/website/app.py
index 5db655a2164..7d9842348e4 100644
--- a/website/app.py
+++ b/website/app.py
@@ -19,7 +19,6 @@
from framework.transactions import handlers as transaction_handlers
# Imports necessary to connect signals
from website.archiver import listeners # noqa
-from website.mails import listeners # noqa
from website.notifications import listeners # noqa
from website.identifiers import listeners # noqa
from website.reviews import listeners # noqa
diff --git a/website/archiver/utils.py b/website/archiver/utils.py
index 72bffee47f8..9768c43a894 100644
--- a/website/archiver/utils.py
+++ b/website/archiver/utils.py
@@ -5,7 +5,6 @@
from django.db.models import CharField, OuterRef, Subquery
from framework.auth import Auth
from framework.utils import sanitize_html
-from osf.models.notification_type import NotificationType
from website import settings
from website.archiver import (
@@ -27,6 +26,8 @@ def normalize_unicode_filenames(filename):
def send_archiver_size_exceeded_mails(src, user, stat_result, url):
+ from osf.models.notification_type import NotificationType
+
NotificationType.objects.get(
name=NotificationType.Type.DESK_ARCHIVE_JOB_EXCEEDED
).emit(
@@ -51,6 +52,8 @@ def send_archiver_size_exceeded_mails(src, user, stat_result, url):
def send_archiver_copy_error_mails(src, user, results, url):
+ from osf.models.notification_type import NotificationType
+
NotificationType.objects.get(
name=NotificationType.Type.DESK_ARCHIVE_JOB_COPY_ERROR
).emit(
@@ -76,6 +79,8 @@ def send_archiver_copy_error_mails(src, user, results, url):
)
def send_archiver_file_not_found_mails(src, user, results, url):
+ from osf.models.notification_type import NotificationType
+
NotificationType.objects.get(
name=NotificationType.Type.DESK_ARCHIVE_JOB_FILE_NOT_FOUND
).emit(
@@ -100,6 +105,8 @@ def send_archiver_file_not_found_mails(src, user, results, url):
)
def send_archiver_uncaught_error_mails(src, user, results, url):
+ from osf.models.notification_type import NotificationType
+
NotificationType.objects.get(
name=NotificationType.Type.DESK_ARCHIVE_JOB_UNCAUGHT_ERROR
).emit(
diff --git a/website/conferences/views.py b/website/conferences/views.py
index cf7dbfd6d3b..1460d4dd78e 100644
--- a/website/conferences/views.py
+++ b/website/conferences/views.py
@@ -1,13 +1,11 @@
from rest_framework import status as http_status
import logging
-from flask import request
-import waffle
from django.db import transaction, connection
from django.contrib.contenttypes.models import ContentType
from framework.auth import get_or_create_user
-from framework.exceptions import HTTPError, ServiceDiscontinuedError
+from framework.exceptions import HTTPError
from framework.flask import redirect
from framework.transactions.handlers import no_auto_transaction
from osf import features
@@ -16,8 +14,6 @@
from website.conferences import utils
from website.conferences.message import ConferenceMessage, ConferenceError
from website.ember_osf_web.decorators import ember_flag_is_active
-from website.mails import CONFERENCE_SUBMITTED, CONFERENCE_INACTIVE, CONFERENCE_FAILED, CONFERENCE_DEPRECATION
-from website.mails import send_mail
from website.util import web_url_for
from website.util.metrics import CampaignSourceTags
@@ -30,17 +26,6 @@ def meeting_hook():
"""
message = ConferenceMessage()
- if waffle.flag_is_active(request, features.DISABLE_MEETINGS):
- send_mail(
- message.sender_email,
- CONFERENCE_DEPRECATION,
- fullname=message.sender_display,
- support_email=settings.OSF_SUPPORT_EMAIL,
- can_change_preferences=False,
- logo=settings.OSF_MEETINGS_LOGO,
- )
- raise ServiceDiscontinuedError()
-
try:
message.verify()
except ConferenceError as error:
@@ -54,14 +39,6 @@ def meeting_hook():
raise HTTPError(http_status.HTTP_406_NOT_ACCEPTABLE)
if not conference.active:
- send_mail(
- message.sender_email,
- CONFERENCE_INACTIVE,
- fullname=message.sender_display,
- presentations_url=web_url_for('conference_view', _absolute=True),
- can_change_preferences=False,
- logo=settings.OSF_MEETINGS_LOGO,
- )
raise HTTPError(http_status.HTTP_406_NOT_ACCEPTABLE)
add_poster_by_email(conference=conference, message=message)
@@ -72,16 +49,6 @@ def add_poster_by_email(conference, message):
:param Conference conference:
:param ConferenceMessage message:
"""
- # Fail if no attachments
- if not message.attachments:
- return send_mail(
- message.sender_email,
- CONFERENCE_FAILED,
- fullname=message.sender_display,
- can_change_preferences=False,
- logo=settings.OSF_MEETINGS_LOGO
- )
-
with transaction.atomic():
user, user_created = get_or_create_user(
message.sender_display,
@@ -97,16 +64,6 @@ def add_poster_by_email(conference, message):
user.update_date_last_login()
user.save()
- # must save the user first before accessing user._id
- set_password_url = web_url_for(
- 'reset_password_get',
- uid=user._id,
- token=user.verification_key_v2['token'],
- _absolute=True,
- )
- else:
- set_password_url = None
-
# Always create a new meeting node
node = Node.objects.create(
title=message.subject,
@@ -125,35 +82,6 @@ def add_poster_by_email(conference, message):
utils.upload_attachments(user, node, message.attachments)
- download_url = node.web_url_for(
- 'addon_view_or_download_file',
- path=message.attachments[0].filename,
- provider='osfstorage',
- action='download',
- _absolute=True,
- )
-
- # Send confirmation email
- send_mail(
- message.sender_email,
- CONFERENCE_SUBMITTED,
- conf_full_name=conference.name,
- conf_view_url=web_url_for(
- 'conference_results',
- meeting=message.conference_name,
- _absolute=True,
- ),
- fullname=message.sender_display,
- user_created=user_created,
- set_password_url=set_password_url,
- profile_url=user.absolute_url,
- node_url=node.absolute_url,
- file_url=download_url,
- presentation_type=message.conference_category.lower(),
- is_spam=message.is_spam,
- can_change_preferences=False,
- logo=settings.OSF_MEETINGS_LOGO
- )
def conference_data(meeting):
try:
diff --git a/website/mails/listeners.py b/website/mails/listeners.py
deleted file mode 100644
index 3f411d52f87..00000000000
--- a/website/mails/listeners.py
+++ /dev/null
@@ -1,44 +0,0 @@
-"""Functions that listen for event signals and queue up emails.
-All triggered emails live here.
-"""
-
-from django.utils import timezone
-
-from website import settings
-from framework.auth import signals as auth_signals
-from website.project import signals as project_signals
-
-
-@auth_signals.unconfirmed_user_created.connect
-def queue_no_addon_email(user):
- """Queue an email for user who has not connected an addon after
- `settings.NO_ADDON_WAIT_TIME` months of signing up for the OSF.
- """
- from osf.models.queued_mail import queue_mail, NO_ADDON
- queue_mail(
- to_addr=user.username,
- mail=NO_ADDON,
- send_at=timezone.now() + settings.NO_ADDON_WAIT_TIME,
- user=user,
- fullname=user.fullname
- )
-
-@project_signals.privacy_set_public.connect
-def queue_first_public_project_email(user, node, meeting_creation):
- """Queue and email after user has made their first
- non-OSF4M project public.
- """
- from osf.models.queued_mail import queue_mail, QueuedMail, NEW_PUBLIC_PROJECT_TYPE, NEW_PUBLIC_PROJECT
- if not meeting_creation:
- sent_mail = QueuedMail.objects.filter(user=user, email_type=NEW_PUBLIC_PROJECT_TYPE)
- if not sent_mail.exists():
- queue_mail(
- to_addr=user.username,
- mail=NEW_PUBLIC_PROJECT,
- send_at=timezone.now() + settings.NEW_PUBLIC_PROJECT_WAIT_TIME,
- user=user,
- nid=node._id,
- fullname=user.fullname,
- project_title=node.title,
- osf_support_email=settings.OSF_SUPPORT_EMAIL,
- )
diff --git a/website/mails/mails.py b/website/mails/mails.py
index 84e82d4632a..83ab3afc613 100644
--- a/website/mails/mails.py
+++ b/website/mails/mails.py
@@ -19,14 +19,10 @@
"""
import os
import logging
-import waffle
from mako.lookup import TemplateLookup, Template
-from framework.email import tasks
-from osf import features
from website import settings
-from django.core.mail import EmailMessage, get_connection
logger = logging.getLogger(__name__)
@@ -38,11 +34,6 @@
HTML_EXT = '.html.mako'
-DISABLED_MAILS = [
- 'welcome',
- 'welcome_osf4i'
-]
-
class Mail:
"""An email object.
@@ -75,119 +66,6 @@ def render_message(tpl_name, **context):
tpl = _tpl_lookup.get_template(tpl_name)
return tpl.render(**context)
-
-def send_to_mailhog(subject, message, from_email, to_email, attachment_name=None, attachment_content=None):
- email = EmailMessage(
- subject=subject,
- body=message,
- from_email=from_email,
- to=[to_email],
- connection=get_connection(
- backend='django.core.mail.backends.smtp.EmailBackend',
- host=settings.MAILHOG_HOST,
- port=settings.MAILHOG_PORT,
- username='',
- password='',
- use_tls=False,
- use_ssl=False,
- )
- )
- email.content_subtype = 'html'
-
- if attachment_name and attachment_content:
- email.attach(attachment_name, attachment_content)
-
- try:
- email.send()
- except ConnectionRefusedError:
- logger.debug('Mailhog is not running. Please start it to send emails.')
- return
-
-
-def send_mail(
- to_addr,
- mail,
- from_addr=None,
- bcc_addr=None,
- reply_to=None,
- mailer=None,
- celery=True,
- username=None,
- password=None,
- callback=None,
- attachment_name=None,
- attachment_content=None,
- **context):
- """
- Send an email from the OSF.
- Example:
- from website import mails
-
- mails.send_email('foo@bar.com', mails.TEST, name="Foo")
-
- :param str to_addr: The recipient's email address
- :param str bcc_addr: The BCC senders's email address (or list of addresses)
- :param str reply_to: The sender's email address will appear in the reply-to header
- :param Mail mail: The mail object
- :param str mimetype: Either 'plain' or 'html'
- :param function callback: celery task to execute after send_mail completes
- :param **context: Context vars for the message template
-
- .. note:
- Uses celery if available
- """
- if waffle.switch_is_active(features.DISABLE_ENGAGEMENT_EMAILS) and mail.engagement:
- return False
-
- from_addr = from_addr or settings.FROM_EMAIL
- mailer = mailer or tasks.send_email
- subject = mail.subject(**context)
- message = mail.html(**context)
- # Don't use ttls and login in DEBUG_MODE
- ttls = login = not settings.DEBUG_MODE
- logger.debug('Sending email...')
- logger.debug(f'To: {to_addr}\nFrom: {from_addr}\nSubject: {subject}\nMessage: {message}')
-
- if waffle.switch_is_active(features.ENABLE_MAILHOG):
- logger.debug('Intercepting email: sending via MailHog')
- send_to_mailhog(
- subject=subject,
- message=message,
- from_email=from_addr,
- to_email=to_addr,
- attachment_name=attachment_name,
- attachment_content=attachment_content
- )
-
- kwargs = dict(
- from_addr=from_addr,
- to_addr=to_addr,
- subject=subject,
- message=message,
- ttls=ttls,
- login=login,
- username=username,
- password=password,
- categories=mail.categories,
- attachment_name=attachment_name,
- attachment_content=attachment_content,
- bcc_addr=bcc_addr,
- reply_to=reply_to,
- )
-
- logger.debug('Preparing to send...')
- if settings.USE_CELERY and celery:
- logger.debug('Sending via celery...')
- return mailer.apply_async(kwargs=kwargs, link=callback)
- else:
- logger.debug('Sending without celery')
- ret = mailer(**kwargs)
- if callback:
- callback()
-
- return ret
-
-
def get_english_article(word):
"""
Decide whether to use 'a' or 'an' for a given English word.
@@ -199,51 +77,10 @@ def get_english_article(word):
# Predefined Emails
-
-TEST = Mail('test', subject='A test email to ${name}', categories=['test'])
-
-# Emails for first-time login through external identity providers.
-EXTERNAL_LOGIN_CONFIRM_EMAIL_CREATE = Mail(
- 'external_confirm_create',
- subject='OSF Account Verification'
-)
-
-FORK_COMPLETED = Mail(
- 'fork_completed',
- subject='Your fork has completed'
-)
-
-FORK_FAILED = Mail(
- 'fork_failed',
- subject='Your fork has failed'
-)
-
-EXTERNAL_LOGIN_CONFIRM_EMAIL_LINK = Mail(
- 'external_confirm_link',
- subject='OSF Account Verification'
-)
-EXTERNAL_LOGIN_LINK_SUCCESS = Mail(
- 'external_confirm_success',
- subject='OSF Account Verification Success'
-)
-
-# Sign up confirmation emails for OSF, native campaigns and branded campaigns
-INITIAL_CONFIRM_EMAIL = Mail(
- 'initial_confirm',
- subject='OSF Account Verification'
-)
-CONFIRM_EMAIL = Mail(
- 'confirm',
- subject='Add a new email to your OSF account'
-)
CONFIRM_EMAIL_ERPC = Mail(
'confirm_erpc',
subject='OSF Account Verification, Election Research Preacceptance Competition'
)
-CONFIRM_EMAIL_AGU_CONFERENCE_2023 = Mail(
- 'confirm_agu_conference_2023',
- subject='OSF Account Verification, from the American Geophysical Union Conference'
-)
CONFIRM_EMAIL_AGU_CONFERENCE = Mail(
'confirm_agu_conference',
subject='OSF Account Verification, from the American Geophysical Union Conference'
@@ -340,15 +177,6 @@ def get_english_article(word):
'contributor_added_access_request',
subject='Your access request to an OSF project has been approved'
)
-FORWARD_INVITE = Mail('forward_invite', subject='Please forward to ${fullname}')
-FORWARD_INVITE_REGISTERED = Mail('forward_invite_registered', subject='Please forward to ${fullname}')
-
-FORGOT_PASSWORD = Mail('forgot_password', subject='Reset Password')
-FORGOT_PASSWORD_INSTITUTION = Mail('forgot_password_institution', subject='Set Password')
-PASSWORD_RESET = Mail('password_reset', subject='Your OSF password has been reset')
-PENDING_VERIFICATION = Mail('pending_invite', subject='Your account is almost ready!')
-PENDING_VERIFICATION_REGISTERED = Mail('pending_registered', subject='Received request to be a contributor')
-
REQUEST_EXPORT = Mail('support_request', subject='[via OSF] Export Request')
REQUEST_DEACTIVATION = Mail('support_request', subject='[via OSF] Deactivation Request')
@@ -360,22 +188,6 @@ def get_english_article(word):
subject='[auto] Spam files audit'
)
-CONFERENCE_SUBMITTED = Mail(
- 'conference_submitted',
- subject='Project created on OSF',
-)
-CONFERENCE_INACTIVE = Mail(
- 'conference_inactive',
- subject='OSF Error: Conference inactive',
-)
-CONFERENCE_FAILED = Mail(
- 'conference_failed',
- subject='OSF Error: No files attached',
-)
-CONFERENCE_DEPRECATION = Mail(
- 'conference_deprecation',
- subject='Meeting Service Discontinued',
-)
DIGEST = Mail(
'digest', subject='OSF Notifications',
@@ -387,11 +199,6 @@ def get_english_article(word):
subject='Recent submissions to ${provider_name}',
)
-TRANSACTIONAL = Mail(
- 'transactional', subject='OSF: ${subject}',
- categories=['notifications', 'notifications-transactional']
-)
-
# Retraction related Mail objects
PENDING_RETRACTION_ADMIN = Mail(
'pending_retraction_admin',
diff --git a/website/mails/presends.py b/website/mails/presends.py
deleted file mode 100644
index 3a3175c99ee..00000000000
--- a/website/mails/presends.py
+++ /dev/null
@@ -1,55 +0,0 @@
-from django.utils import timezone
-
-from website import settings
-
-def no_addon(email):
- return len([addon for addon in email.user.get_addons() if addon.config.short_name != 'osfstorage']) == 0
-
-def no_login(email):
- from osf.models.queued_mail import QueuedMail, NO_LOGIN_TYPE
- sent = QueuedMail.objects.filter(user=email.user, email_type=NO_LOGIN_TYPE).exclude(_id=email._id)
- if sent.exists():
- return False
- return email.user.date_last_login < timezone.now() - settings.NO_LOGIN_WAIT_TIME
-
-def new_public_project(email):
- """ Will check to make sure the project that triggered this presend is still public
- before sending the email. It also checks to make sure this is the first (and only)
- new public project email to be sent
-
- :param email: QueuedMail object, with 'nid' in its data field
- :return: boolean based on whether the email should be sent
- """
-
- # In line import to prevent circular importing
- from osf.models import AbstractNode
-
- node = AbstractNode.load(email.data['nid'])
-
- if not node:
- return False
- public = email.find_sent_of_same_type_and_user()
- return node.is_public and not len(public)
-
-
-def welcome_osf4m(email):
- """ presend has two functions. First is to make sure that the user has not
- converted to a regular OSF user by logging in. Second is to populate the
- data field with downloads by finding the file/project (node_settings) and
- counting downloads of all files within that project
-
- :param email: QueuedMail object with data field including fid
- :return: boolean based on whether the email should be sent
- """
- # In line import to prevent circular importing
- from addons.osfstorage.models import OsfStorageFileNode
- if email.user.date_last_login:
- if email.user.date_last_login > timezone.now() - settings.WELCOME_OSF4M_WAIT_TIME_GRACE:
- return False
- upload = OsfStorageFileNode.load(email.data['fid'])
- if upload:
- email.data['downloads'] = upload.get_download_count()
- else:
- email.data['downloads'] = 0
- email.save()
- return True
diff --git a/website/notifications/constants.py b/website/notifications/constants.py
index 66bb575b765..35e3559d252 100644
--- a/website/notifications/constants.py
+++ b/website/notifications/constants.py
@@ -1,17 +1,14 @@
-NODE_SUBSCRIPTIONS_AVAILABLE = {
- 'node_file_updated': 'Files updated'
-}
# Note: if the subscription starts with 'global_', it will be treated like a default
# subscription. If no notification type has been assigned, the user subscription
# will default to 'email_transactional'.
-USER_SUBSCRIPTIONS_AVAILABLE = {
- 'global_file_updated': 'Files updated',
- 'global_reviews': 'Preprint submissions updated'
-}
+USER_SUBSCRIPTIONS_AVAILABLE = [
+ 'user_file_updated',
+ 'user_reviews'
+]
PROVIDER_SUBSCRIPTIONS_AVAILABLE = {
- 'new_pending_submissions': 'New preprint submissions for moderators to review.'
+ 'provider_new_pending_submissions': 'New preprint submissions for moderators to review.'
}
# Note: the python value None mean inherit from parent
diff --git a/website/notifications/listeners.py b/website/notifications/listeners.py
index 21aed1df9e3..ed9a936492f 100644
--- a/website/notifications/listeners.py
+++ b/website/notifications/listeners.py
@@ -1,6 +1,4 @@
import logging
-from website.notifications.exceptions import InvalidSubscriptionError
-from website.notifications.utils import subscribe_user_to_notifications, subscribe_user_to_global_notifications
from website.project.signals import contributor_added, project_created
from framework.auth.signals import user_confirmed
@@ -10,25 +8,15 @@
def subscribe_creator(node):
if node.is_collection or node.is_deleted:
return None
- try:
- subscribe_user_to_notifications(node, node.creator)
- except InvalidSubscriptionError as err:
- user = node.creator._id if node.creator else 'None'
- logger.warning(f'Skipping subscription of user {user} to node {node._id}')
- logger.warning(f'Reason: {str(err)}')
+ from website.notifications.utils import subscribe_user_to_notifications
+ subscribe_user_to_notifications(node, node.creator)
@contributor_added.connect
def subscribe_contributor(node, contributor, auth=None, *args, **kwargs):
- try:
- subscribe_user_to_notifications(node, contributor)
- except InvalidSubscriptionError as err:
- logger.warning(f'Skipping subscription of user {contributor} to node {node._id}')
- logger.warning(f'Reason: {str(err)}')
+ from website.notifications.utils import subscribe_user_to_notifications
+ subscribe_user_to_notifications(node, contributor)
@user_confirmed.connect
def subscribe_confirmed_user(user):
- try:
- subscribe_user_to_global_notifications(user)
- except InvalidSubscriptionError as err:
- logger.warning(f'Skipping subscription of user {user} to global subscriptions')
- logger.warning(f'Reason: {str(err)}')
+ from website.notifications.utils import subscribe_user_to_global_notifications
+ subscribe_user_to_global_notifications(user)
diff --git a/website/notifications/utils.py b/website/notifications/utils.py
index 38707ac24a6..d9ceadfc39b 100644
--- a/website/notifications/utils.py
+++ b/website/notifications/utils.py
@@ -40,8 +40,10 @@ def find_subscription_type(subscription):
"""Find subscription type string within specific subscription.
Essentially removes extraneous parts of the string to get the type.
"""
- subs_available = list(constants.USER_SUBSCRIPTIONS_AVAILABLE.keys())
- subs_available.extend(list(constants.NODE_SUBSCRIPTIONS_AVAILABLE.keys()))
+    # Copy the constant so repeated calls do not mutate the module-level list.
+    subs_available = list(constants.USER_SUBSCRIPTIONS_AVAILABLE)
+    subs_available.append('node_file_updated')
for available in subs_available:
if available in subscription:
return available
@@ -279,7 +281,7 @@ def format_data(user, nodes):
# user is contributor on a component of the project/node
if can_read:
- node_sub_available = list(constants.NODE_SUBSCRIPTIONS_AVAILABLE.keys())
+ node_sub_available = ['node_file_updated']
subscriptions = get_all_node_subscriptions(user, node, user_subscriptions=user_subscriptions).filter(event_name__in=node_sub_available)
for subscription in subscriptions:
@@ -314,7 +316,7 @@ def format_data(user, nodes):
def format_user_subscriptions(user):
""" Format user-level subscriptions (e.g. comment replies across the OSF) for user settings page"""
- user_subs_available = list(constants.USER_SUBSCRIPTIONS_AVAILABLE.keys())
+ user_subs_available = constants.USER_SUBSCRIPTIONS_AVAILABLE
subscriptions = [
serialize_event(
user, subscription,
@@ -338,8 +340,8 @@ def format_file_subscription(user, node_id, path, provider):
return serialize_event(user, node=node, event_description='file_updated')
-all_subs = constants.NODE_SUBSCRIPTIONS_AVAILABLE.copy()
-all_subs.update(constants.USER_SUBSCRIPTIONS_AVAILABLE)
+all_subs = ['node_file_updated']
+all_subs += constants.USER_SUBSCRIPTIONS_AVAILABLE
def serialize_event(user, subscription=None, node=None, event_description=None):
"""
@@ -464,10 +466,8 @@ def subscribe_user_to_notifications(node, user):
if getattr(node, 'is_registration', False):
raise InvalidSubscriptionError('Registrations are invalid targets for subscriptions')
- events = constants.NODE_SUBSCRIPTIONS_AVAILABLE
-
if user.is_registered:
- for event in events:
+        for event in ['node_file_updated']:
subscription, _ = NotificationSubscription.objects.get_or_create(
user=user,
notification_type__name=event
diff --git a/website/reviews/listeners.py b/website/reviews/listeners.py
index a48d601e071..6fa873e53a9 100644
--- a/website/reviews/listeners.py
+++ b/website/reviews/listeners.py
@@ -1,6 +1,3 @@
-from django.contrib.contenttypes.models import ContentType
-from website.profile.utils import get_profile_image_url
-from osf.models import NotificationSubscription, NotificationType
from website.settings import DOMAIN
from website.reviews import signals as reviews_signals
@@ -9,12 +6,15 @@
def reviews_withdraw_requests_notification_moderators(self, timestamp, context, user, resource):
context['referrer_fullname'] = user.fullname
provider = resource.provider
+ from django.contrib.contenttypes.models import ContentType
+ from osf.models import NotificationSubscription, NotificationType
provider_subscription, _ = NotificationSubscription.objects.get_or_create(
notification_type__name=NotificationType.Type.PROVIDER_REVIEWS_WITHDRAWAL_REQUESTED,
object_id=provider.id,
content_type=ContentType.objects.get_for_model(provider.__class__),
)
+ from website.profile.utils import get_profile_image_url
context['message'] = f'has requested withdrawal of "{resource.title}".'
context['profile_image_url'] = get_profile_image_url(user)
@@ -33,12 +33,15 @@ def reviews_withdraw_requests_notification_moderators(self, timestamp, context,
def reviews_withdrawal_requests_notification(self, timestamp, context):
preprint = context['reviewable']
preprint_word = preprint.provider.preprint_word
+ from django.contrib.contenttypes.models import ContentType
+ from osf.models import NotificationSubscription, NotificationType
provider_subscription, _ = NotificationSubscription.objects.get_or_create(
notification_type__name=NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS,
object_id=preprint.provider.id,
content_type=ContentType.objects.get_for_model(preprint.provider.__class__),
)
+ from website.profile.utils import get_profile_image_url
context['message'] = f'has requested withdrawal of the {preprint_word} "{preprint.title}".'
context['profile_image_url'] = get_profile_image_url(context['requester'])
From 3d80365bc41719310adaa6cf0cee43d2784cdc46 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Sun, 27 Jul 2025 18:04:22 -0400
Subject: [PATCH 125/336] improve automatic subscription methods
---
addons/base/views.py | 9 ++-
api/preprints/serializers.py | 2 +-
api_tests/mailhog/test_mailhog.py | 8 +-
notifications.yaml | 24 +++++-
...ion_provider_notification_subscriptions.py | 40 ----------
...ion_provider_notification_subscriptions.py | 45 -----------
osf/models/mixins.py | 6 +-
osf/models/node.py | 26 +++++--
osf/models/notification.py | 2 +-
osf/models/provider.py | 21 ++++--
osf/models/registrations.py | 3 +-
osf_tests/test_collection_submission.py | 6 --
osf_tests/test_schema_responses.py | 17 ++---
osf_tests/utils.py | 14 ++--
tests/test_events.py | 43 +++++------
website/mails/mails.py | 14 ----
website/notifications/emails.py | 28 +------
website/notifications/events/files.py | 20 ++++-
website/notifications/listeners.py | 28 +++++--
website/notifications/utils.py | 74 +++++--------------
website/project/views/contributor.py | 12 +--
21 files changed, 173 insertions(+), 269 deletions(-)
delete mode 100644 osf/management/commands/populate_collection_provider_notification_subscriptions.py
delete mode 100644 osf/management/commands/populate_registration_provider_notification_subscriptions.py
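
Before the diff: a minimal sketch of the subscription pattern this patch standardizes on, pieced together from the osf/models/provider.py, osf_tests/utils.py, and addons/base/views.py hunks below. Model and enum names come from the diff itself; the initialized Django environment, the `provider` object, the `some_moderator` recipient, and the event_context keys are assumptions for illustration.

    from django.contrib.contenttypes.models import ContentType
    from osf.models import NotificationSubscription, NotificationType

    def ensure_default_subscriptions(provider):
        # Providers now declare defaults as NotificationType.Type values rather
        # than legacy event-name strings tied to NotificationSubscriptionLegacy.
        for notification_type in provider.DEFAULT_SUBSCRIPTIONS:
            NotificationSubscription.objects.get_or_create(
                notification_type__name=notification_type,
                object_id=provider.id,
                content_type=ContentType.objects.get_for_model(provider),
            )

    # Delivery goes through the type itself instead of a website.mails.Mail object.
    NotificationType.objects.get(
        name=NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
    ).emit(
        user=some_moderator,                       # assumed OSFUser recipient
        event_context={'can_change_preferences': False},
    )
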
diff --git a/addons/base/views.py b/addons/base/views.py
index c621658287c..4547112e44b 100644
--- a/addons/base/views.py
+++ b/addons/base/views.py
@@ -582,7 +582,7 @@ def create_waterbutler_log(payload, **kwargs):
if payload.get('errors'):
notification_type = NotificationType.Type.FILE_OPERATION_FAILED
- NotificationType.objects.get(name=notification_type.value).emit(
+ NotificationType.objects.get(name=notification_type).emit(
user=user,
event_context={
'action': payload['action'],
@@ -608,7 +608,12 @@ def create_waterbutler_log(payload, **kwargs):
if target_node and payload['action'] != 'download_file':
update_storage_usage_with_size(payload)
- file_signals.file_updated.send(target=node, user=user, event_type=action, payload=payload)
+ file_signals.file_updated.send(
+ target=node,
+ user=user,
+ event_type=action,
+ payload=payload
+ )
match f'node_{action}':
case NotificationType.Type.NODE_FILE_ADDED:
diff --git a/api/preprints/serializers.py b/api/preprints/serializers.py
index d22bb00ab81..b5cdd0f1f93 100644
--- a/api/preprints/serializers.py
+++ b/api/preprints/serializers.py
@@ -468,7 +468,7 @@ def update(self, preprint, validated_data):
preprint,
contributor=author,
auth=auth,
- email_template='preprint',
+ notification_type='preprint',
)
return preprint
diff --git a/api_tests/mailhog/test_mailhog.py b/api_tests/mailhog/test_mailhog.py
index 997947f9588..573df3a4fbe 100644
--- a/api_tests/mailhog/test_mailhog.py
+++ b/api_tests/mailhog/test_mailhog.py
@@ -1,7 +1,6 @@
import requests
import pytest
from django.core.mail import send_mail
-from website.mails import TEST
from waffle.testutils import override_switch
from osf import features
from website import settings
@@ -31,7 +30,12 @@ def test_mailhog_received_mail(self):
mailhog_v2 = f'{settings.MAILHOG_API_HOST}/api/v2/messages'
requests.delete(mailhog_v1)
- send_mail('to_addr@mail.com', TEST, name='Mailhog')
+ send_mail(
+ 'test email',
+ 'test message',
+ from_email=settings.OSF_CONTACT_EMAIL,
+ recipient_list=['to_addr@mail.com',]
+ )
res = requests.get(mailhog_v2).json()
assert res['count'] == 1
assert res['items'][0]['Content']['Headers']['To'][0] == 'to_addr@mail.com'
diff --git a/notifications.yaml b/notifications.yaml
index fe5186bb8fd..65ffa263181 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -93,11 +93,17 @@ notification_types:
- name: user_export_data_request
__docs__: ...
object_content_type_model_name: osfuser
- template: 'website/templates/emails/.html.mako'
+ template: 'website/templates/emails/support_request.html.mako'
- name: user_request_deactivation
+ subject: '[via OSF] Deactivation Request'
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/support_request.html.mako'
+ - name: user_request_deactivation_complete
+ subject: '[via OSF] OSF account deactivated'
__docs__: ...
object_content_type_model_name: osfuser
- template: 'website/templates/emails/.html.mako'
+ template: 'website/templates/emails/request_deactivation_complete.html.mako'
- name: user_storage_cap_exceeded_announcement
__docs__: ...
object_content_type_model_name: osfuser
@@ -153,6 +159,16 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/spam_user_banned.html.mako'
+ - name: user_file_operation_success
+ subject: 'Your ${action} has finished'
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/file_operation_success.html.mako'
+ - name: user_file_operation_failed
+ subject: 'Your ${action} has failed'
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/file_operation_failed.html.mako'
#### PROVIDER
- name: provider_new_pending_submissions
@@ -336,11 +352,11 @@ notification_types:
- name: addon_boa_job_failure
__docs__: ...
object_content_type_model_name: desk
- template: 'website/templates/emails/addon_boa_job_failure.html.mako'
+ template: 'website/templates/emails/addons_boa_job_failure.html.mako'
- name: addon_boa_job_complete
__docs__: ...
object_content_type_model_name: desk
- template: 'website/templates/emails/addon_boa_job_complete.html.mako'
+ template: 'website/templates/emails/addons_boa_job_complete.html.mako'
- name: desk_archive_job_copy_error
__docs__: Archive job failed due to copy error. Sent to support desk.
object_content_type_model_name: desk
diff --git a/osf/management/commands/populate_collection_provider_notification_subscriptions.py b/osf/management/commands/populate_collection_provider_notification_subscriptions.py
deleted file mode 100644
index c3a21eb8d20..00000000000
--- a/osf/management/commands/populate_collection_provider_notification_subscriptions.py
+++ /dev/null
@@ -1,40 +0,0 @@
-import logging
-
-from django.core.management.base import BaseCommand
-from osf.models import NotificationSubscriptionLegacy, CollectionProvider
-
-logger = logging.getLogger(__file__)
-
-
-def populate_collection_provider_notification_subscriptions():
- for provider in CollectionProvider.objects.all():
- provider_admins = provider.get_group('admin').user_set.all()
- provider_moderators = provider.get_group('moderator').user_set.all()
-
- for subscription in provider.DEFAULT_SUBSCRIPTIONS:
- instance, created = NotificationSubscriptionLegacy.objects.get_or_create(
- _id=f'{provider._id}_{subscription}',
- event_name=subscription,
- provider=provider
- )
-
- if created:
- logger.info(f'{provider._id}_{subscription} NotificationSubscription object has been created')
- else:
- logger.info(f'{provider._id}_{subscription} NotificationSubscription object exists')
-
- for user in provider_admins | provider_moderators:
- # add user to subscription list but set their notification to none by default
- instance.add_user_to_subscription(user, 'email_transactional', save=True)
- logger.info(f'User {user._id} is subscribed to {provider._id}_{subscription}')
-
-
-class Command(BaseCommand):
- help = """
- Creates NotificationSubscriptions for existing RegistrationProvider objects
- and adds RegistrationProvider moderators/admins to subscriptions
- """
-
- # Management command handler
- def handle(self, *args, **options):
- populate_collection_provider_notification_subscriptions()
diff --git a/osf/management/commands/populate_registration_provider_notification_subscriptions.py b/osf/management/commands/populate_registration_provider_notification_subscriptions.py
deleted file mode 100644
index db4b44acba5..00000000000
--- a/osf/management/commands/populate_registration_provider_notification_subscriptions.py
+++ /dev/null
@@ -1,45 +0,0 @@
-import logging
-
-from django.contrib.auth.models import Group
-from django.core.management.base import BaseCommand
-from osf.models import RegistrationProvider, NotificationSubscriptionLegacy
-
-logger = logging.getLogger(__file__)
-
-
-def populate_registration_provider_notification_subscriptions():
- for provider in RegistrationProvider.objects.all():
- try:
- provider_admins = provider.get_group('admin').user_set.all()
- provider_moderators = provider.get_group('moderator').user_set.all()
- except Group.DoesNotExist:
- logger.warning(f'Unable to find groups for provider "{provider._id}", assuming there are no subscriptions to create.')
- continue
-
- for subscription in provider.DEFAULT_SUBSCRIPTIONS:
- instance, created = NotificationSubscriptionLegacy.objects.get_or_create(
- _id=f'{provider._id}_{subscription}',
- event_name=subscription,
- provider=provider
- )
-
- if created:
- logger.info(f'{provider._id}_{subscription} NotificationSubscription object has been created')
- else:
- logger.info(f'{provider._id}_{subscription} NotificationSubscription object exists')
-
- for user in provider_admins | provider_moderators:
- # add user to subscription list but set their notification to none by default
- instance.add_user_to_subscription(user, 'email_transactional', save=True)
- logger.info(f'User {user._id} is subscribed to {provider._id}_{subscription}')
-
-
-class Command(BaseCommand):
- help = """
- Creates NotificationSubscriptions for existing RegistrationProvider objects
- and adds RegistrationProvider moderators/admins to subscriptions
- """
-
- # Management command handler
- def handle(self, *args, **options):
- populate_registration_provider_notification_subscriptions()
diff --git a/osf/models/mixins.py b/osf/models/mixins.py
index 3cbb2283aab..33463e09924 100644
--- a/osf/models/mixins.py
+++ b/osf/models/mixins.py
@@ -1029,7 +1029,9 @@ class Meta:
reviews_comments_private = models.BooleanField(null=True, blank=True)
reviews_comments_anonymous = models.BooleanField(null=True, blank=True)
- DEFAULT_SUBSCRIPTIONS = ['new_pending_submissions']
+ DEFAULT_SUBSCRIPTIONS = [
+ NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
+ ]
@property
def is_reviewed(self):
@@ -1463,7 +1465,7 @@ def add_contributor(self, contributor, permissions=None, visible=True,
self,
contributor=contributor,
auth=auth,
- email_template=send_email,
+ notification_type=send_email,
permissions=permissions
)
diff --git a/osf/models/node.py b/osf/models/node.py
index fb7a7f1e102..f4fdb2c2122 100644
--- a/osf/models/node.py
+++ b/osf/models/node.py
@@ -1339,11 +1339,17 @@ def subscribe_contributors_to_node(self):
and send emails to users that they have been added to the project.
(DraftNodes are hidden until registration).
"""
+ from . import NotificationType
+
for user in self.contributors.filter(is_registered=True):
perm = self.contributor_set.get(user=user).permission
- project_signals.contributor_added.send(self,
- contributor=user,
- auth=None, email_template='default', permissions=perm)
+ project_signals.contributor_added.send(
+ self,
+ contributor=user,
+ auth=None,
+ notification_type=NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
+ permissions=perm
+ )
def register_node(self, schema, auth, draft_registration, parent=None, child_ids=None, provider=None, manual_guid=None):
"""Make a frozen copy of a node.
@@ -1671,7 +1677,12 @@ def fork_node(self, auth, title=None, parent=None):
forked.save()
# Need to call this after save for the notifications to be created with the _primary_key
- project_signals.contributor_added.send(forked, contributor=user, auth=auth, email_template='false')
+ project_signals.contributor_added.send(
+ forked,
+ contributor=user,
+ auth=auth,
+ notification_type=None
+ )
return forked
@@ -1780,7 +1791,12 @@ def use_as_template(self, auth, changes=None, top_level=True, parent=None):
new.save(suppress_log=True)
# Need to call this after save for the notifications to be created with the _primary_key
- project_signals.contributor_added.send(new, contributor=auth.user, auth=auth, email_template='false')
+ project_signals.contributor_added.send(
+ new,
+ contributor=auth.user,
+ auth=auth,
+ notification_type=None
+ )
# Log the creation
new.add_log(
diff --git a/osf/models/notification.py b/osf/models/notification.py
index 1b749af2b9b..fb9922078e4 100644
--- a/osf/models/notification.py
+++ b/osf/models/notification.py
@@ -46,7 +46,7 @@ def send(
f"\nto={recipient_address}"
f"\ntype={self.subscription.notification_type.name}"
f"\ncontext={self.event_context}"
- f"\nemail_context={self.email_context}"
+ f"\nemail_context={email_context}"
)
elif protocol_type == 'email':
email.send_email_with_send_grid(
diff --git a/osf/models/provider.py b/osf/models/provider.py
index b8dacc174bf..38b0affb035 100644
--- a/osf/models/provider.py
+++ b/osf/models/provider.py
@@ -2,6 +2,7 @@
import requests
from django.apps import apps
+from django.contrib.contenttypes.models import ContentType
from django.contrib.postgres import fields
from django.core.exceptions import ValidationError
from django.db import connection
@@ -14,12 +15,12 @@
from guardian.models import GroupObjectPermissionBase, UserObjectPermissionBase
from framework import sentry
+from . import NotificationType, NotificationSubscription
from .base import BaseModel, TypedObjectIDMixin
from .mixins import ReviewProviderMixin
from .brand import Brand
from .citation import CitationStyle
from .licenses import NodeLicense
-from .notifications import NotificationSubscriptionLegacy
from .storage import ProviderAssetFile
from .subject import Subject
from osf.utils.datetime_aware_jsonfield import DateTimeAwareJSONField
@@ -252,7 +253,9 @@ def setup_share_source(self, provider_home_page):
class CollectionProvider(AbstractProvider):
- DEFAULT_SUBSCRIPTIONS = ['new_pending_submissions']
+ DEFAULT_SUBSCRIPTIONS = [
+ NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
+ ]
class Meta:
permissions = (
@@ -292,7 +295,11 @@ class RegistrationProvider(AbstractProvider):
REVIEW_STATES = RegistrationModerationStates
STATE_FIELD_NAME = 'moderation_state'
- DEFAULT_SUBSCRIPTIONS = ['new_pending_submissions', 'new_pending_withdraw_requests']
+ DEFAULT_SUBSCRIPTIONS = [
+ NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS,
+ NotificationType.Type.PROVIDER_NEW_PENDING_WITHDRAW_REQUESTS,
+ ]
# A list of dictionaries describing new fields that providers want to surface on their registrations
# Each entry must provide a 'field_name' key. In the future, other keys may be supported to enable
@@ -464,10 +471,10 @@ def create_provider_auth_groups(sender, instance, created, **kwargs):
def create_provider_notification_subscriptions(sender, instance, created, **kwargs):
if created:
for subscription in instance.DEFAULT_SUBSCRIPTIONS:
- NotificationSubscriptionLegacy.objects.get_or_create(
- _id=f'{instance._id}_{subscription}',
- event_name=subscription,
- provider=instance
+ NotificationSubscription.objects.get_or_create(
+ notification_type__name=subscription,
+ object_id=instance.id,
+ content_type=ContentType.objects.get_for_model(instance)
)
diff --git a/osf/models/registrations.py b/osf/models/registrations.py
index e62bf5f14bf..e4f4106b563 100644
--- a/osf/models/registrations.py
+++ b/osf/models/registrations.py
@@ -21,6 +21,7 @@
from osf.exceptions import NodeStateError, DraftRegistrationStateError
from osf.external.internet_archive.tasks import archive_to_ia, update_ia_metadata
from osf.metrics import RegistriesModerationMetrics
+from . import NotificationType
from .action import RegistrationAction
from .archive import ArchiveJob
from .contributor import DraftRegistrationContributor
@@ -1316,7 +1317,7 @@ def create_from_node(cls, user, schema, node=None, data=None, provider=None):
draft,
contributor=user,
auth=None,
- email_template='draft_registration',
+ notification_type=NotificationType.Type.USER_CONTRIBUTOR_ADDED_DRAFT_REGISTRATION,
permissions=initiator_permissions
)
diff --git a/osf_tests/test_collection_submission.py b/osf_tests/test_collection_submission.py
index d2dd906b692..e169d3c5582 100644
--- a/osf_tests/test_collection_submission.py
+++ b/osf_tests/test_collection_submission.py
@@ -12,7 +12,6 @@
from osf.utils.workflows import CollectionSubmissionStates
from framework.exceptions import PermissionsError
from api_tests.utils import UserRoles
-from osf.management.commands.populate_collection_provider_notification_subscriptions import populate_collection_provider_notification_subscriptions
from django.utils import timezone
from tests.utils import capture_notifications
@@ -150,10 +149,6 @@ class TestModeratedCollectionSubmission:
MOCK_NOW = timezone.now()
- @pytest.fixture(autouse=True)
- def setup(self):
- populate_collection_provider_notification_subscriptions()
-
def test_submit(self, moderated_collection_submission):
# .submit on post_save
assert moderated_collection_submission.state == CollectionSubmissionStates.PENDING
@@ -179,7 +174,6 @@ def test_notify_moderators_pending(self, node, moderated_collection):
collection=moderated_collection,
creator=node.creator,
)
- populate_collection_provider_notification_subscriptions()
collection_submission.save()
assert len(notifications) == 2
assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_SUBMITTED
diff --git a/osf_tests/test_schema_responses.py b/osf_tests/test_schema_responses.py
index 3b3af1458cf..1226c24c353 100644
--- a/osf_tests/test_schema_responses.py
+++ b/osf_tests/test_schema_responses.py
@@ -1,4 +1,3 @@
-from unittest import mock
import pytest
from api.providers.workflows import Workflows
@@ -11,8 +10,6 @@
from osf_tests.utils import get_default_test_schema, _ensure_subscriptions
from tests.utils import capture_notifications
-from website.notifications import emails
-
from transitions import MachineError
# See osf_tests.utils.default_test_schema for block types and valid answers
@@ -870,13 +867,11 @@ def test_moderators_notified_on_admin_approval(self, revised_response, admin_use
revised_response.save()
revised_response.pending_approvers.add(admin_user)
- store_emails = emails.store_emails
- with mock.patch.object(emails, 'store_emails', autospec=True) as mock_store:
- mock_store.side_effect = store_emails
+ with capture_notifications() as notifications:
revised_response.approve(user=admin_user)
-
- assert mock_store.called
- assert mock_store.call_args[0][0] == [moderator._id]
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[0]['kwargs']['user'] == moderator
def test_no_moderator_notification_on_admin_approval_of_initial_response(
self, initial_response, admin_user):
@@ -884,9 +879,9 @@ def test_no_moderator_notification_on_admin_approval_of_initial_response(
initial_response.save()
initial_response.pending_approvers.add(admin_user)
- with mock.patch.object(emails, 'store_emails', autospec=True) as mock_store:
+ with capture_notifications() as notifications:
initial_response.approve(user=admin_user)
- assert not mock_store.called
+ assert not notifications
def test_moderator_accept(self, initial_response, moderator):
initial_response.approvals_state_machine.set_state(ApprovalStates.PENDING_MODERATION)
diff --git a/osf_tests/utils.py b/osf_tests/utils.py
index b3f3c92bc88..ecfd046d1b2 100644
--- a/osf_tests/utils.py
+++ b/osf_tests/utils.py
@@ -3,6 +3,8 @@
import functools
from unittest import mock
+from django.contrib.contenttypes.models import ContentType
+
from framework.auth import Auth
from django.utils import timezone
from google.cloud.storage import Client, Bucket, Blob
@@ -16,7 +18,7 @@
Sanction,
RegistrationProvider,
RegistrationSchema,
- NotificationSubscriptionLegacy
+ NotificationSubscription
)
from osf.utils.migrations import create_schema_blocks_for_atomic_schema
@@ -228,11 +230,11 @@ def _ensure_subscriptions(provider):
This has led to observed race conditions and probabalistic test failures.
Avoid that.
'''
- for subscription in provider.DEFAULT_SUBSCRIPTIONS:
- NotificationSubscriptionLegacy.objects.get_or_create(
- _id=f'{provider._id}_{subscription}',
- event_name=subscription,
- provider=provider
+ for notification_type in provider.DEFAULT_SUBSCRIPTIONS:
+ NotificationSubscription.objects.get_or_create(
+            notification_type__name=notification_type,
+ object_id=provider.id,
+ content_type=ContentType.objects.get_for_model(provider)
)
def assert_notification_correctness(send_mail_mock, expected_template, expected_recipients):
diff --git a/tests/test_events.py b/tests/test_events.py
index e98119e61b9..812eb1608a1 100644
--- a/tests/test_events.py
+++ b/tests/test_events.py
@@ -135,7 +135,7 @@ def setUp(self):
self.user_2 = factories.AuthUserFactory()
self.project = factories.ProjectFactory(creator=self.user_1)
# subscription
- self.sub = factories.NotificationSubscriptionLegacyFactory(
+ self.sub = factories.NotificationSubscriptionFactory(
_id=self.project._id + 'file_updated',
owner=self.project,
event_name='file_updated',
@@ -161,7 +161,7 @@ def setUp(self):
self.user = factories.UserFactory()
self.consolidate_auth = Auth(user=self.user)
self.project = factories.ProjectFactory()
- self.project_subscription = factories.NotificationSubscriptionLegacyFactory(
+ self.project_subscription = factories.NotificationSubscriptionFactory(
_id=self.project._id + '_file_updated',
owner=self.project,
event_name='file_updated'
@@ -253,7 +253,7 @@ def setUp(self):
self.user_2 = factories.AuthUserFactory()
self.project = factories.ProjectFactory(creator=self.user_1)
# subscription
- self.sub = factories.NotificationSubscriptionLegacyFactory(
+ self.sub = factories.NotificationSubscriptionFactory(
_id=self.project._id + 'file_updated',
owner=self.project,
event_name='file_updated',
@@ -307,21 +307,21 @@ def setUp(self):
)
# Subscriptions
# for parent node
- self.sub = factories.NotificationSubscriptionLegacyFactory(
+ self.sub = factories.NotificationSubscriptionFactory(
_id=self.project._id + '_file_updated',
owner=self.project,
event_name='file_updated'
)
self.sub.save()
# for private node
- self.private_sub = factories.NotificationSubscriptionLegacyFactory(
+ self.private_sub = factories.NotificationSubscriptionFactory(
_id=self.private_node._id + '_file_updated',
owner=self.private_node,
event_name='file_updated'
)
self.private_sub.save()
# for file subscription
- self.file_sub = factories.NotificationSubscriptionLegacyFactory(
+ self.file_sub = factories.NotificationSubscriptionFactory(
_id='{pid}_{wbid}_file_updated'.format(
pid=self.project._id,
wbid=self.event.waterbutler_id
@@ -337,8 +337,7 @@ def test_info_formed_correct(self):
# assert 'moved file "{}".' == self.event.html_message
# assert 'created folder "Three/".' == self.event.text_message
- @mock.patch('website.notifications.emails.store_emails')
- def test_user_performing_action_no_email(self, mock_store):
+ def test_user_performing_action_no_email(self):
# Move Event: Makes sure user who performed the action is not
# included in the notifications
self.sub.email_digest.add(self.user_2)
@@ -346,16 +345,13 @@ def test_user_performing_action_no_email(self, mock_store):
self.event.perform()
assert 0 == mock_store.call_count
- @mock.patch('website.notifications.emails.store_emails')
- def test_perform_store_called_once(self, mock_store):
- # Move Event: Tests that store_emails is called once from perform
+ def test_perform_store_called_once(self):
self.sub.email_transactional.add(self.user_1)
self.sub.save()
self.event.perform()
assert 1 == mock_store.call_count
- @mock.patch('website.notifications.emails.store_emails')
- def test_perform_store_one_of_each(self, mock_store):
+ def test_perform_store_one_of_each(self):
# Move Event: Tests that store_emails is called 3 times, one in
# each category
self.sub.email_transactional.add(self.user_1)
@@ -372,8 +368,7 @@ def test_perform_store_one_of_each(self, mock_store):
self.event.perform()
assert 3 == mock_store.call_count
- @mock.patch('website.notifications.emails.store_emails')
- def test_remove_user_sent_once(self, mock_store):
+ def test_remove_user_sent_once(self):
# Move Event: Tests removed user is removed once. Regression
self.project.add_contributor(self.user_3, permissions=WRITE, auth=self.auth)
self.project.save()
@@ -402,21 +397,21 @@ def setUp(self):
)
# Subscriptions
# for parent node
- self.sub = factories.NotificationSubscriptionLegacyFactory(
+ self.sub = factories.NotificationSubscriptionFactory(
_id=self.project._id + '_file_updated',
owner=self.project,
event_name='file_updated'
)
self.sub.save()
# for private node
- self.private_sub = factories.NotificationSubscriptionLegacyFactory(
+ self.private_sub = factories.NotificationSubscriptionFactory(
_id=self.private_node._id + '_file_updated',
owner=self.private_node,
event_name='file_updated'
)
self.private_sub.save()
# for file subscription
- self.file_sub = factories.NotificationSubscriptionLegacyFactory(
+ self.file_sub = factories.NotificationSubscriptionFactory(
_id='{pid}_{wbid}_file_updated'.format(
pid=self.project._id,
wbid=self.event.waterbutler_id
@@ -436,8 +431,7 @@ def test_info_correct(self):
' in Consolidate to "Two/Paper13.txt" in OSF'
' Storage in Consolidate.') == self.event.text_message
- @mock.patch('website.notifications.emails.store_emails')
- def test_copied_one_of_each(self, mock_store):
+ def test_copied_one_of_each(self):
# Copy Event: Tests that store_emails is called 2 times, two with
# permissions, one without
self.sub.email_transactional.add(self.user_1)
@@ -454,8 +448,7 @@ def test_copied_one_of_each(self, mock_store):
self.event.perform()
assert 2 == mock_store.call_count
- @mock.patch('website.notifications.emails.store_emails')
- def test_user_performing_action_no_email(self, mock_store):
+ def test_user_performing_action_no_email(self):
# Move Event: Makes sure user who performed the action is not
# included in the notifications
self.sub.email_digest.add(self.user_2)
@@ -484,21 +477,21 @@ def setUp(self):
)
# Subscriptions
# for parent node
- self.sub = factories.NotificationSubscriptionLegacyFactory(
+ self.sub = factories.NotificationSubscriptionFactory(
_id=self.project._id + '_file_updated',
owner=self.project,
event_name='file_updated'
)
self.sub.save()
# for private node
- self.private_sub = factories.NotificationSubscriptionLegacyFactory(
+ self.private_sub = factories.NotificationSubscriptionFactory(
_id=self.private_node._id + '_file_updated',
owner=self.private_node,
event_name='file_updated'
)
self.private_sub.save()
# for file subscription
- self.file_sub = factories.NotificationSubscriptionLegacyFactory(
+ self.file_sub = factories.NotificationSubscriptionFactory(
_id='{pid}_{wbid}_file_updated'.format(
pid=self.project._id,
wbid=self.event.waterbutler_id
diff --git a/website/mails/mails.py b/website/mails/mails.py
index 83ab3afc613..0dec8be81a2 100644
--- a/website/mails/mails.py
+++ b/website/mails/mails.py
@@ -178,9 +178,6 @@ def get_english_article(word):
subject='Your access request to an OSF project has been approved'
)
REQUEST_EXPORT = Mail('support_request', subject='[via OSF] Export Request')
-REQUEST_DEACTIVATION = Mail('support_request', subject='[via OSF] Deactivation Request')
-
-REQUEST_DEACTIVATION_COMPLETE = Mail('request_deactivation_complete', subject='[via OSF] OSF account deactivated')
SPAM_USER_BANNED = Mail('spam_user_banned', subject='[OSF] Account flagged as spam')
SPAM_FILES_DETECTED = Mail(
@@ -188,17 +185,6 @@ def get_english_article(word):
subject='[auto] Spam files audit'
)
-
-DIGEST = Mail(
- 'digest', subject='OSF Notifications',
- categories=['notifications', 'notifications-digest']
-)
-
-DIGEST_REVIEWS_MODERATORS = Mail(
- 'digest_reviews_moderators',
- subject='Recent submissions to ${provider_name}',
-)
-
# Retraction related Mail objects
PENDING_RETRACTION_ADMIN = Mail(
'pending_retraction_admin',
diff --git a/website/notifications/emails.py b/website/notifications/emails.py
index 9c34867ad3a..7a22ba8954c 100644
--- a/website/notifications/emails.py
+++ b/website/notifications/emails.py
@@ -5,7 +5,7 @@
from osf.models import AbstractNode, NotificationSubscription, NotificationType
from osf.models.notifications import NotificationDigest
-from osf.utils.permissions import ADMIN, READ
+from osf.utils.permissions import READ
from website import mails
from website.notifications import constants
from website.notifications import utils
@@ -32,32 +32,6 @@ def notify(event, user, node, timestamp, **context):
event_context=context
)
-def notify_mentions(event, user, node, timestamp, **context):
- OSFUser = apps.get_model('osf', 'OSFUser')
- recipient_ids = context.get('new_mentions', [])
- recipients = OSFUser.objects.filter(guids___id__in=recipient_ids)
- sent_users = notify_global_event(event, user, node, timestamp, recipients, context=context)
- return sent_users
-
-def notify_global_event(event, sender_user, node, timestamp, recipients, template=None, context=None):
- event_type = utils.find_subscription_type(event)
- sent_users = []
- if not context:
- context = {}
-
- for recipient in recipients:
- subscriptions = get_user_subscriptions(recipient, event_type)
- context['is_creator'] = recipient == node.creator
- if node.provider:
- context['has_psyarxiv_chronos_text'] = node.has_permission(recipient, ADMIN) and 'psyarxiv' in node.provider.name.lower()
- for notification_type in subscriptions:
- if (notification_type != 'none' and subscriptions[notification_type] and recipient._id in subscriptions[notification_type]):
- store_emails([recipient._id], notification_type, event, sender_user, node, timestamp, template=template, **context)
- sent_users.append(recipient._id)
-
- return sent_users
-
-
def store_emails(recipient_ids, notification_type, event, user, node, timestamp, abstract_provider=None, template=None, **context):
"""Store notification emails
diff --git a/website/notifications/events/files.py b/website/notifications/events/files.py
index 6a7c7cab3d9..9de4f342daf 100644
--- a/website/notifications/events/files.py
+++ b/website/notifications/events/files.py
@@ -19,7 +19,7 @@
RegistryError,
)
from website.notifications.events import utils as event_utils
-from osf.models import AbstractNode, NodeLog, Preprint
+from osf.models import AbstractNode, NodeLog, Preprint, NotificationType
from addons.base.signals import file_updated as signal
@@ -278,12 +278,28 @@ def perform(self):
)
# Move the document from one subscription to another because the old one isn't needed
- utils.move_subscription(rm_users, self.event_type, self.source_node, self.event_type, self.node)
+ utils.move_subscription(
+ rm_users,
+ self.event_type,
+ self.source_node,
+ self.event_type,
+ self.node
+ )
+
# Notify each user
for notification in NOTIFICATION_TYPES:
if notification == 'none':
continue
if moved[notification]:
+ NotificationType.objects.get(
+ name=NotificationType.Type.NODE_ADDON_FILE_MOVED,
+ ).emit(
+ user=self.user,
+ event_context={
+ 'profile_image_url': self.profile_image_url,
+ 'url': self.url
+ }
+ )
emails.store_emails(moved[notification], notification, 'file_updated', self.user, self.node,
self.timestamp, message=self.html_message,
profile_image_url=self.profile_image_url, url=self.url)
diff --git a/website/notifications/listeners.py b/website/notifications/listeners.py
index ed9a936492f..c2e82c872db 100644
--- a/website/notifications/listeners.py
+++ b/website/notifications/listeners.py
@@ -1,22 +1,36 @@
import logging
+
+from osf import apps
+from osf.models import NotificationType, Node
from website.project.signals import contributor_added, project_created
from framework.auth.signals import user_confirmed
logger = logging.getLogger(__name__)
@project_created.connect
-def subscribe_creator(node):
- if node.is_collection or node.is_deleted:
+def subscribe_creator(resource):
+ if resource.is_collection or resource.is_deleted:
return None
from website.notifications.utils import subscribe_user_to_notifications
- subscribe_user_to_notifications(node, node.creator)
+ subscribe_user_to_notifications(resource, resource.creator)
@contributor_added.connect
-def subscribe_contributor(node, contributor, auth=None, *args, **kwargs):
+def subscribe_contributor(resource, contributor, auth=None, *args, **kwargs):
from website.notifications.utils import subscribe_user_to_notifications
- subscribe_user_to_notifications(node, contributor)
+ if isinstance(resource, Node) == 'osf.node':
+ if resource.is_collection or resource.is_deleted:
+ return None
+ subscribe_user_to_notifications(resource, contributor)
@user_confirmed.connect
def subscribe_confirmed_user(user):
- from website.notifications.utils import subscribe_user_to_global_notifications
- subscribe_user_to_global_notifications(user)
+ NotificationSubscription = apps.get_model('osf.NotificationSubscription')
+ user_events = [
+ NotificationType.Type.USER_FILE_UPDATED,
+ NotificationType.Type.USER_REVIEWS,
+ ]
+ for user_event in user_events:
+ NotificationSubscription.objects.get_or_create(
+ user=user,
+ notification_type=user_event
+ )
diff --git a/website/notifications/utils.py b/website/notifications/utils.py
index d9ceadfc39b..3b41c3435c0 100644
--- a/website/notifications/utils.py
+++ b/website/notifications/utils.py
@@ -5,7 +5,7 @@
from django.db.models import Q
from framework.postcommit_tasks.handlers import run_postcommit
-from osf.models import NotificationSubscription
+from osf.models import NotificationSubscription, NotificationType
from osf.utils.permissions import READ
from website.notifications import constants
from website.notifications.exceptions import InvalidSubscriptionError
@@ -95,10 +95,10 @@ def remove_supplemental_node(node):
@app.task(max_retries=5, default_retry_delay=60)
def remove_subscription_task(node_id):
AbstractNode = apps.get_model('osf.AbstractNode')
- NotificationSubscriptionLegacy = apps.get_model('osf.NotificationSubscriptionLegacy')
+ NotificationSubscription = apps.get_model('osf.NotificationSubscription')
node = AbstractNode.load(node_id)
- NotificationSubscriptionLegacy.objects.filter(node=node).delete()
+ NotificationSubscription.objects.filter(node=node).delete()
parent = node.parent_node
if parent and parent.child_node_subscriptions:
@@ -172,11 +172,11 @@ def move_subscription(remove_users, source_event, source_node, new_event, new_no
:param new_node: Instance of Node
:return: Returns a NOTIFICATION_TYPES list of removed users without permissions
"""
- NotificationSubscriptionLegacy = apps.get_model('osf.NotificationSubscriptionLegacy')
+ NotificationSubscription = apps.get_model('osf.NotificationSubscription')
OSFUser = apps.get_model('osf.OSFUser')
if source_node == new_node:
return
- old_sub = NotificationSubscriptionLegacy.load(to_subscription_key(source_node._id, source_event))
+ old_sub = NotificationSubscription.load(to_subscription_key(source_node._id, source_event))
if not old_sub:
return
elif old_sub:
@@ -236,8 +236,8 @@ def check_project_subscriptions_are_all_none(user, node):
def get_all_user_subscriptions(user, extra=None):
""" Get all Subscription objects that the user is subscribed to"""
- NotificationSubscriptionLegacy = apps.get_model('osf.NotificationSubscriptionLegacy')
- queryset = NotificationSubscriptionLegacy.objects.filter(
+ NotificationSubscription = apps.get_model('osf.NotificationSubscription')
+ queryset = NotificationSubscription.objects.filter(
Q(none=user.pk) |
Q(email_digest=user.pk) |
Q(email_transactional=user.pk)
@@ -391,14 +391,13 @@ def get_parent_notification_type(node, event, user):
:return: str notification type (e.g. 'email_transactional')
"""
AbstractNode = apps.get_model('osf.AbstractNode')
- NotificationSubscriptionLegacy = apps.get_model('osf.NotificationSubscriptionLegacy')
if node and isinstance(node, AbstractNode) and node.parent_node and node.parent_node.has_permission(user, READ):
parent = node.parent_node
key = to_subscription_key(parent._id, event)
try:
- subscription = NotificationSubscriptionLegacy.objects.get(_id=key)
- except NotificationSubscriptionLegacy.DoesNotExist:
+ subscription = NotificationSubscription.objects.get(_id=key)
+ except NotificationSubscription.DoesNotExist:
return get_parent_notification_type(parent, event, user)
for notification_type in constants.NOTIFICATION_TYPES:
@@ -424,60 +423,25 @@ def get_global_notification_type(global_subscription, user):
return notification_type
-def check_if_all_global_subscriptions_are_none(user):
- # This function predates comment mentions, which is a global_ notification that cannot be disabled
- # Therefore, an actual check would never return True.
- # If this changes, an optimized query would look something like:
- # not NotificationSubscriptionLegacy.objects.filter(Q(event_name__startswith='global_') & (Q(email_digest=user.pk)|Q(email_transactional=user.pk))).exists()
- return False
-
-
-def subscribe_user_to_global_notifications(user):
- NotificationSubscriptionLegacy = apps.get_model('osf.NotificationSubscriptionLegacy')
- notification_type = 'email_transactional'
- user_events = constants.USER_SUBSCRIPTIONS_AVAILABLE
- for user_event in user_events:
- user_event_id = to_subscription_key(user._id, user_event)
-
- # get_or_create saves on creation
- subscription, created = NotificationSubscriptionLegacy.objects.get_or_create(_id=user_event_id, user=user, event_name=user_event)
- subscription.add_user_to_subscription(user, notification_type)
- subscription.save()
-
-
def subscribe_user_to_notifications(node, user):
""" Update the notification settings for the creator or contributors
:param user: User to subscribe to notifications
"""
- Preprint = apps.get_model('osf.Preprint')
- DraftRegistration = apps.get_model('osf.DraftRegistration')
- if isinstance(node, Preprint):
- raise InvalidSubscriptionError('Preprints are invalid targets for subscriptions at this time.')
-
- if isinstance(node, DraftRegistration):
- raise InvalidSubscriptionError('DraftRegistrations are invalid targets for subscriptions at this time.')
-
- if node.is_collection:
- raise InvalidSubscriptionError('Collections are invalid targets for subscriptions')
-
- if node.is_deleted:
- raise InvalidSubscriptionError('Deleted Nodes are invalid targets for subscriptions')
if getattr(node, 'is_registration', False):
raise InvalidSubscriptionError('Registrations are invalid targets for subscriptions')
if user.is_registered:
- for event in ['node_file_updated',]:
- subscription, _ = NotificationSubscription.objects.get_or_create(
- user=user,
- notification_type__name=event
- )
- subscription, _ = NotificationSubscription.objects.get_or_create(
- user=user,
- notification_type__name=event,
- object_id=node.id,
- content_type=ContentType.objects.get_for_model(node)
- )
+ NotificationSubscription.objects.get_or_create(
+ user=user,
+ notification_type__name=NotificationType.Type.USER_FILE_UPDATED,
+ )
+ NotificationSubscription.objects.get_or_create(
+ user=user,
+ notification_type__name=NotificationType.Type.NODE_FILE_UPDATED,
+ object_id=node.id,
+ content_type=ContentType.objects.get_for_model(node)
+ )
def format_user_and_project_subscriptions(user):
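A minimal caller-side sketch of the rewritten helper above (assumes a configured OSF/Django environment; node and user are placeholder objects). It mirrors the get_or_create keys used in this hunk, so the node-scoped row should be findable afterwards:

from django.contrib.contenttypes.models import ContentType
from osf.models import NotificationSubscription, NotificationType
from website.notifications.utils import subscribe_user_to_notifications

def node_file_subscription_exists(node, user) -> bool:
    # Sketch only: creates the subscriptions, then checks for the
    # node-scoped row added by subscribe_user_to_notifications above.
    subscribe_user_to_notifications(node, user)
    return NotificationSubscription.objects.filter(
        user=user,
        notification_type__name=NotificationType.Type.NODE_FILE_UPDATED,
        object_id=node.id,
        content_type=ContentType.objects.get_for_model(node),
    ).exists()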
diff --git a/website/project/views/contributor.py b/website/project/views/contributor.py
index 0800afaf8ca..50ff050ceea 100644
--- a/website/project/views/contributor.py
+++ b/website/project/views/contributor.py
@@ -619,7 +619,7 @@ def check_email_throttle(node, contributor, throttle=None):
return False # No previous sent notification, not throttled
@contributor_added.connect
-def notify_added_contributor(node, contributor, email_template, auth=None, *args, **kwargs):
+def notify_added_contributor(node, contributor, notification_type, auth=None, *args, **kwargs):
"""Send a notification to a contributor who was just added to a node.
Handles:
@@ -631,15 +631,15 @@ def notify_added_contributor(node, contributor, email_template, auth=None, *args
node (AbstractNode): The node to which the contributor was added.
contributor (OSFUser): The user being added.
auth (Auth, optional): Authorization context.
- email_template (str, optional): Template identifier.
+        notification_type (str, optional): Notification type identifier.
"""
if check_email_throttle_claim_email(node, contributor):
return
- if email_template == 'false':
+ if not notification_type:
return
# Default values
- notification_type = email_template or NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ notification_type = notification_type or NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
logo = settings.OSF_LOGO
# Use match for notification type/logic
@@ -659,13 +659,13 @@ def notify_added_contributor(node, contributor, email_template, auth=None, *args
notification_type = NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_PREPRINT_NODE_FROM_OSF
logo = settings.OSF_PREPRINTS_LOGO
else:
- raise NotImplementedError(f'email_template: {email_template} not implemented.')
+ raise NotImplementedError(f'notification_type: {notification_type} not implemented.')
NotificationType.objects.get(name=notification_type).emit(
user=contributor,
event_context={
'user': contributor.id,
- 'node': node.id,
+ 'node': node.title,
'referrer_name': getattr(getattr(auth, 'user', None), 'fullname', '') if auth else '',
'is_initiator': getattr(getattr(auth, 'user', None), 'id', None) == contributor.id if auth else False,
'all_global_subscriptions_none': False,
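A hedged sketch of how a caller might fire the signal after this rename (keyword names follow the new handler signature; node, contributor, and auth are placeholders, and passing the type through the signal is an assumption about call sites not shown here):

from osf.models import NotificationType
from website.project.signals import contributor_added

# A NotificationType.Type value (or a falsy value to suppress the email)
# replaces the old email_template string.
contributor_added.send(
    node,
    contributor=contributor,
    auth=auth,
    notification_type=NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
)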
From 2e98358973a9136f02b196421bcb90b2142de561 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Sun, 27 Jul 2025 19:10:49 -0400
Subject: [PATCH 126/336] remove child node subscription list
---
api/subscriptions/serializers.py | 22 +---
...e_abstractnode_child_node_subscriptions.py | 17 +++
osf/models/node.py | 4 -
osf/models/notifications.py | 13 --
website/notifications/events/files.py | 115 +++---------------
website/notifications/listeners.py | 6 +-
website/notifications/utils.py | 13 +-
website/templates/emails/empty.html.mako | 1 -
website/templates/emails/test.html.mako | 1 -
9 files changed, 48 insertions(+), 144 deletions(-)
create mode 100644 osf/migrations/0034_remove_abstractnode_child_node_subscriptions.py
delete mode 100644 website/templates/emails/empty.html.mako
delete mode 100644 website/templates/emails/test.html.mako
diff --git a/api/subscriptions/serializers.py b/api/subscriptions/serializers.py
index ede0782ae65..1b7e6449833 100644
--- a/api/subscriptions/serializers.py
+++ b/api/subscriptions/serializers.py
@@ -1,9 +1,7 @@
-from django.contrib.contenttypes.models import ContentType
from rest_framework import serializers as ser
from api.nodes.serializers import RegistrationProviderRelationshipField
from api.collections_providers.fields import CollectionProviderRelationshipField
from api.preprints.serializers import PreprintProviderRelationshipField
-from osf.models import Node
from website.util import api_v2_url
@@ -23,7 +21,10 @@ class SubscriptionSerializer(JSONAPISerializer):
help_text='The id of the subscription fixed for backward compatibility',
)
event_name = ser.CharField(read_only=True)
- frequency = FrequencyField(source='message_frequency', required=True)
+ frequency = FrequencyField(
+ source='message_frequency',
+ required=True,
+ )
class Meta:
type_ = 'subscription'
@@ -36,20 +37,7 @@ def get_absolute_url(self, obj):
return obj.absolute_api_v2_url
def update(self, instance, validated_data):
- user = self.context['request'].user
- frequency = validated_data.get('frequency') or 'none'
- instance.message_frequency = frequency
-
- if frequency != 'none' and instance.content_type == ContentType.objects.get_for_model(Node):
- node = Node.objects.get(
- id=instance.id,
- content_type=instance.content_type,
- )
- user_subs = node.parent_node.child_node_subscriptions
- if node._id not in user_subs.setdefault(user._id, []):
- user_subs[user._id].append(node._id)
- node.parent_node.save()
-
+        instance.message_frequency = validated_data.get('frequency')
return instance
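For reference, a sketch of the JSON:API body this serializer now accepts; the endpoint path and test-client call are assumptions, while 'subscription' and 'frequency' come from the Meta type_ and the FrequencyField above (declared with source='message_frequency'):

payload = {
    'data': {
        'id': subscription_id,            # placeholder id
        'type': 'subscription',
        'attributes': {
            'frequency': 'none',          # a FrequencyChoices value
        },
    },
}
# e.g. app.patch_json_api(f'/v2/subscriptions/{subscription_id}/', payload, auth=user.auth)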
diff --git a/osf/migrations/0034_remove_abstractnode_child_node_subscriptions.py b/osf/migrations/0034_remove_abstractnode_child_node_subscriptions.py
new file mode 100644
index 00000000000..79bd4ec9243
--- /dev/null
+++ b/osf/migrations/0034_remove_abstractnode_child_node_subscriptions.py
@@ -0,0 +1,17 @@
+# Generated by Django 4.2.13 on 2025-07-27 23:06
+
+from django.db import migrations
+
+
+class Migration(migrations.Migration):
+
+ dependencies = [
+ ('osf', '0033_delete_queuedmail'),
+ ]
+
+ operations = [
+ migrations.RemoveField(
+ model_name='abstractnode',
+ name='child_node_subscriptions',
+ ),
+ ]
diff --git a/osf/models/node.py b/osf/models/node.py
index f4fdb2c2122..4ea827e5b4a 100644
--- a/osf/models/node.py
+++ b/osf/models/node.py
@@ -318,10 +318,6 @@ class AbstractNode(DirtyFieldsMixin, TypedModel, AddonModelMixin, IdentifierMixi
) SELECT {fields} FROM "{nodelicenserecord}"
WHERE id = (SELECT node_license_id FROM ascendants WHERE node_license_id IS NOT NULL) LIMIT 1;""")
- # Dictionary field mapping user id to a list of nodes in node.nodes which the user has subscriptions for
-    # {<user_id>: [<node._id>, <node._id>, ...] }
- # TODO: Can this be a reference instead of data?
- child_node_subscriptions = DateTimeAwareJSONField(default=dict, blank=True)
_contributors = models.ManyToManyField(OSFUser,
through=Contributor,
related_name='nodes')
diff --git a/osf/models/notifications.py b/osf/models/notifications.py
index 41ec120b4ee..80703f1620f 100644
--- a/osf/models/notifications.py
+++ b/osf/models/notifications.py
@@ -70,12 +70,6 @@ def add_user_to_subscription(self, user, notification_type, save=True):
if nt == notification_type:
getattr(self, nt).add(user)
- if notification_type != 'none' and isinstance(self.owner, Node) and self.owner.parent_node:
- user_subs = self.owner.parent_node.child_node_subscriptions
- if self.owner._id not in user_subs.setdefault(user._id, []):
- user_subs[user._id].append(self.owner._id)
- self.owner.parent_node.save()
-
if save:
# Do not clean legacy objects
self.save(clean=False)
@@ -87,13 +81,6 @@ def remove_user_from_subscription(self, user, save=True):
except ValueError:
pass
- if isinstance(self.owner, Node) and self.owner.parent_node:
- try:
- self.owner.parent_node.child_node_subscriptions.get(user._id, []).remove(self.owner._id)
- self.owner.parent_node.save()
- except ValueError:
- pass
-
if save:
self.save()
diff --git a/website/notifications/events/files.py b/website/notifications/events/files.py
index 9de4f342daf..aa8aca2f32b 100644
--- a/website/notifications/events/files.py
+++ b/website/notifications/events/files.py
@@ -9,16 +9,12 @@
from furl import furl
import markupsafe
-from website.notifications import emails
-from website.notifications.constants import NOTIFICATION_TYPES
-from website.notifications import utils
from website.notifications.events.base import (
register,
Event,
event_registry,
RegistryError,
)
-from website.notifications.events import utils as event_utils
from osf.models import AbstractNode, NodeLog, Preprint, NotificationType
from addons.base.signals import file_updated as signal
@@ -236,82 +232,17 @@ def perform(self):
if self.node == self.source_node:
super().perform()
return
- # File
- if self.payload['destination']['kind'] != 'folder':
- moved, warn, rm_users = event_utils.categorize_users(
- self.user,
- self.event_type,
- self.source_node,
- self.event_type,
- self.node
- )
- warn_message = f'{self.html_message} You are no longer tracking that file based on the settings you selected for the component.'
- remove_message = (
- f'{self.html_message} Your subscription has been removed due to '
- 'insufficient permissions in the new component.'
- )
- # Folder
- else:
- # Gets all the files in a folder to look for permissions conflicts
- files = event_utils.get_file_subs_from_folder(
- self.addon,
- self.user,
- self.payload['destination']['kind'],
- self.payload['destination']['path'],
- self.payload['destination']['name']
- )
- # Bins users into different permissions
- moved, warn, rm_users = event_utils.compile_user_lists(
- files,
- self.user,
- self.source_node,
- self.node
- )
- # For users that don't have individual file subscription but has permission on the new node
- warn_message = f'{self.html_message} You are no longer tracking that folder or files within based on the settings you selected for the component.'
- # For users without permission on the new node
- remove_message = (
- f'{self.html_message} Your subscription has been removed for the '
- 'folder, or a file within, due to insufficient permissions in the new '
- 'component.'
- )
-
- # Move the document from one subscription to another because the old one isn't needed
- utils.move_subscription(
- rm_users,
- self.event_type,
- self.source_node,
- self.event_type,
- self.node
+ NotificationType.objects.get(
+ name=NotificationType.Type.NODE_ADDON_FILE_MOVED,
+ ).emit(
+ user=self.user,
+ event_context={
+ 'profile_image_url': self.profile_image_url,
+ 'url': self.url
+ }
)
- # Notify each user
- for notification in NOTIFICATION_TYPES:
- if notification == 'none':
- continue
- if moved[notification]:
- NotificationType.objects.get(
- name=NotificationType.Type.NODE_ADDON_FILE_MOVED,
- ).emit(
- user=self.user,
- event_context={
- 'profile_image_url': self.profile_image_url,
- 'url': self.url
- }
- )
- emails.store_emails(moved[notification], notification, 'file_updated', self.user, self.node,
- self.timestamp, message=self.html_message,
- profile_image_url=self.profile_image_url, url=self.url)
- if warn[notification]:
- emails.store_emails(warn[notification], notification, 'file_updated', self.user, self.node,
- self.timestamp, message=warn_message, profile_image_url=self.profile_image_url,
- url=self.url)
- if rm_users[notification]:
- emails.store_emails(rm_users[notification], notification, 'file_updated', self.user, self.source_node,
- self.timestamp, message=remove_message,
- profile_image_url=self.profile_image_url, url=self.source_url)
-
@register(NodeLog.FILE_COPIED)
class AddonFileCopied(ComplexFileEvent):
@@ -324,26 +255,16 @@ def perform(self):
together because they both don't have a subscription to a
newly copied file.
"""
- remove_message = self.html_message + ' You do not have permission in the new component.'
if self.node == self.source_node:
super().perform()
return
- if self.payload['destination']['kind'] != 'folder':
- moved, warn, rm_users = event_utils.categorize_users(self.user, self.event_type, self.source_node,
- self.event_type, self.node)
- else:
- files = event_utils.get_file_subs_from_folder(self.addon, self.user, self.payload['destination']['kind'],
- self.payload['destination']['path'],
- self.payload['destination']['name'])
- moved, warn, rm_users = event_utils.compile_user_lists(files, self.user, self.source_node, self.node)
- for notification in NOTIFICATION_TYPES:
- if notification == 'none':
- continue
- if moved[notification] or warn[notification]:
- users = list(set(moved[notification]).union(set(warn[notification])))
- emails.store_emails(users, notification, 'file_updated', self.user, self.node, self.timestamp,
- message=self.html_message, profile_image_url=self.profile_image_url, url=self.url)
- if rm_users[notification]:
- emails.store_emails(rm_users[notification], notification, 'file_updated', self.user, self.source_node,
- self.timestamp, message=remove_message,
- profile_image_url=self.profile_image_url, url=self.source_url)
+
+ NotificationType.objects.get(
+ name=NotificationType.Type.NODE_ADDON_FILE_MOVED,
+ ).emit(
+ user=self.user,
+ event_context={
+ 'profile_image_url': self.profile_image_url,
+ 'url': self.url
+ }
+ )
diff --git a/website/notifications/listeners.py b/website/notifications/listeners.py
index c2e82c872db..4447fa971d7 100644
--- a/website/notifications/listeners.py
+++ b/website/notifications/listeners.py
@@ -1,7 +1,6 @@
import logging
from osf import apps
-from osf.models import NotificationType, Node
from website.project.signals import contributor_added, project_created
from framework.auth.signals import user_confirmed
@@ -17,7 +16,9 @@ def subscribe_creator(resource):
@contributor_added.connect
def subscribe_contributor(resource, contributor, auth=None, *args, **kwargs):
from website.notifications.utils import subscribe_user_to_notifications
- if isinstance(resource, Node) == 'osf.node':
+ from osf.models import Node
+
+ if isinstance(resource, Node):
if resource.is_collection or resource.is_deleted:
return None
subscribe_user_to_notifications(resource, contributor)
@@ -25,6 +26,7 @@ def subscribe_contributor(resource, contributor, auth=None, *args, **kwargs):
@user_confirmed.connect
def subscribe_confirmed_user(user):
NotificationSubscription = apps.get_model('osf.NotificationSubscription')
+ NotificationType = apps.get_model('osf.NotificationType')
user_events = [
NotificationType.Type.USER_FILE_UPDATED,
NotificationType.Type.USER_REVIEWS,
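A small sketch (OSF/Django environment assumed) of inspecting the per-user defaults that subscribe_confirmed_user now creates; the helper name is illustrative only:

from osf.models import NotificationSubscription

def default_user_subscription_names(user):
    # After confirmation this should include user_file_updated and
    # user_reviews, per the handler above.
    return list(
        NotificationSubscription.objects
        .filter(user=user)
        .values_list('notification_type__name', flat=True)
    )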
diff --git a/website/notifications/utils.py b/website/notifications/utils.py
index 3b41c3435c0..331b2162acf 100644
--- a/website/notifications/utils.py
+++ b/website/notifications/utils.py
@@ -96,16 +96,11 @@ def remove_supplemental_node(node):
def remove_subscription_task(node_id):
AbstractNode = apps.get_model('osf.AbstractNode')
NotificationSubscription = apps.get_model('osf.NotificationSubscription')
-
node = AbstractNode.load(node_id)
- NotificationSubscription.objects.filter(node=node).delete()
- parent = node.parent_node
-
- if parent and parent.child_node_subscriptions:
- for user_id in parent.child_node_subscriptions:
- if node._id in parent.child_node_subscriptions[user_id]:
- parent.child_node_subscriptions[user_id].remove(node._id)
- parent.save()
+ NotificationSubscription.objects.filter(
+ object_id=node.id,
+ content_type=ContentType.objects.get_for_model(node),
+ ).delete()
@run_postcommit(once_per_request=False, celery=True)
diff --git a/website/templates/emails/empty.html.mako b/website/templates/emails/empty.html.mako
deleted file mode 100644
index c78480affe2..00000000000
--- a/website/templates/emails/empty.html.mako
+++ /dev/null
@@ -1 +0,0 @@
-
This message is coming from an Institutional administrator within your Institution.
@@ -14,12 +14,12 @@
% endif
- Want more information? Visit ${settings.DOMAIN} to learn about OSF, or
+ Want more information? Visit ${domain} to learn about OSF, or
https://cos.io/ for information about its supporting organization, the Center
for Open Science.
- User: ${user.fullname} (${user.username}) [${user._id}]
+ User: ${user_fullname} (${user_username}) [${user__id}]
- Tried to register ${src.title} (${url}) [${src._id}], but the archive task failed when copying files.
+ Tried to register ${src_title} (${url}) [${src__id}], but the archive task failed when copying files.
A report is included below:
diff --git a/website/templates/emails/archive_copy_error_user.html.mako b/website/templates/emails/archive_copy_error_user.html.mako
index 310bd4f5a6b..ce7566f322c 100644
--- a/website/templates/emails/archive_copy_error_user.html.mako
+++ b/website/templates/emails/archive_copy_error_user.html.mako
@@ -5,12 +5,12 @@
- We cannot archive ${src.title} at this time because there were errors copying files from some of the linked third-party services. It's possible that this is due to temporary unavailability of one or more of these services and that retrying the registration may resolve this issue. Our development team is investigating this failure. We're sorry for any inconvenience this may have caused.
+ We cannot archive ${src_title} at this time because there were errors copying files from some of the linked third-party services. It's possible that this is due to temporary unavailability of one or more of these services and that retrying the registration may resolve this issue. Our development team is investigating this failure. We're sorry for any inconvenience this may have caused.
- User: ${user.fullname} (${user.username}) [${user._id}]
+ User: ${user_fullname} (${user_username}) [${user__id}]
- Tried to register ${src.title} (${url}), but the resulting archive would have exceeded our caps for disk usage (${settings.MAX_ARCHIVE_SIZE / 1024 ** 3}GB).
+ Tried to register ${src_title} (${url}), but the resulting archive would have exceeded our caps for disk usage (${settings.MAX_ARCHIVE_SIZE / 1024 ** 3}GB).
A report is included below:
diff --git a/website/templates/emails/archive_size_exceeded_user.html.mako b/website/templates/emails/archive_size_exceeded_user.html.mako
index d30498bc222..ef852bed8d4 100644
--- a/website/templates/emails/archive_size_exceeded_user.html.mako
+++ b/website/templates/emails/archive_size_exceeded_user.html.mako
@@ -4,12 +4,12 @@
- We cannot archive ${src.title} at this time because the projected size of the registration exceeds our usage limits. You should receive a followup email from our support team shortly. We're sorry for any inconvenience this may have caused.
+ We cannot archive ${src_title} at this time because the projected size of the registration exceeds our usage limits. You should receive a followup email from our support team shortly. We're sorry for any inconvenience this may have caused.
%def>
From 9e8bd85811ebda68ff4c13582e9680c206d034fa Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Mon, 28 Jul 2025 18:50:36 -0400
Subject: [PATCH 133/336] fix schema response tests
---
.../views/test_request_actions_create.py | 2 +-
api_tests/users/views/test_user_confirm.py | 2 +-
api_tests/users/views/test_user_settings.py | 2 +-
.../test_user_settings_reset_password.py | 2 +-
notifications.yaml | 6 +++
osf/utils/machines.py | 2 +-
osf_tests/test_collection.py | 2 +-
osf_tests/test_node.py | 39 ++++++++++---------
osf_tests/test_reviewable.py | 1 -
osf_tests/test_schema_responses.py | 4 +-
osf_tests/test_user.py | 2 +-
scripts/osfstorage/usage_audit.py | 3 +-
website/mails/mails.py | 12 ------
website/settings/defaults.py | 7 ----
14 files changed, 38 insertions(+), 48 deletions(-)
diff --git a/api_tests/requests/views/test_request_actions_create.py b/api_tests/requests/views/test_request_actions_create.py
index a8b71da01f4..ff277ac0233 100644
--- a/api_tests/requests/views/test_request_actions_create.py
+++ b/api_tests/requests/views/test_request_actions_create.py
@@ -199,7 +199,7 @@ def test_email_sent_on_approve(self, app, admin, url, node_request):
with capture_notifications() as notifications:
res = app.post_json_api(url, payload, auth=admin.auth)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ assert notifications[0]['type'] == NotificationType.Type.USER_CONTRIBUTOR_ADDED_ACCESS_REQUEST
assert res.status_code == 201
node_request.reload()
assert initial_state != node_request.machine_state
diff --git a/api_tests/users/views/test_user_confirm.py b/api_tests/users/views/test_user_confirm.py
index bb2acee47c9..72c35091890 100644
--- a/api_tests/users/views/test_user_confirm.py
+++ b/api_tests/users/views/test_user_confirm.py
@@ -170,7 +170,7 @@ def test_post_success_link(self, app, confirm_url, user_with_email_verification)
assert res.status_code == 201
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
+ assert notifications[0]['type'] == NotificationType.Type.USER_EXTERNAL_LOGIN_LINK_SUCCESS
user.reload()
assert user.external_identity['ORCID']['0000-0000-0000-0000'] == 'VERIFIED'
diff --git a/api_tests/users/views/test_user_settings.py b/api_tests/users/views/test_user_settings.py
index 847576d9913..530b8455c3c 100644
--- a/api_tests/users/views/test_user_settings.py
+++ b/api_tests/users/views/test_user_settings.py
@@ -60,7 +60,7 @@ def test_post(self, app, user_one, user_two, url, payload):
with capture_notifications() as notification:
res = app.post_json_api(url, payload, auth=user_one.auth)
assert len(notification) == 1
- assert notification[0]['type'] == NotificationType.Type.USER_ACCOUNT_EXPORT_FORM
+ assert notification[0]['type'] == NotificationType.Type.USER_REQUEST_EXPORT
assert res.status_code == 204
user_one.reload()
assert user_one.email_last_sent is not None
diff --git a/api_tests/users/views/test_user_settings_reset_password.py b/api_tests/users/views/test_user_settings_reset_password.py
index 0dbdbaec996..2a9c0e272af 100644
--- a/api_tests/users/views/test_user_settings_reset_password.py
+++ b/api_tests/users/views/test_user_settings_reset_password.py
@@ -36,7 +36,7 @@ def test_get(self, app, url, user_one):
with capture_notifications() as notification:
res = app.get(url)
assert len(notification) == 1
- assert notification[0]['type'] == NotificationType.Type.RESET_PASSWORD_CONFIRMATION
+ assert notification[0]['type'] == NotificationType.Type.USER_PASSWORD_RESET
assert res.status_code == 200
user_one.reload()
diff --git a/notifications.yaml b/notifications.yaml
index 7082541e164..c93f214a042 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -51,7 +51,13 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/forgot_password.html.mako'
+ - name: user_welcome
+ subject: 'Welcome to OSF'
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/welcome.html.mako'
- name: user_welcome_osf4i
+ subject: 'Welcome to OSF'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/welcome_osf4i.html.mako'
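A minimal sketch of how a yaml entry such as user_welcome is consumed, mirroring the emit() calls elsewhere in this series; it assumes the yaml has been loaded into NotificationType rows and that 'domain' matches a variable in welcome.html.mako:

from osf.models import NotificationType

def send_welcome(user, domain):
    # Subject and template come from the notifications.yaml definition above.
    NotificationType.objects.get(name='user_welcome').emit(
        user=user,
        event_context={'domain': domain},
    )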
diff --git a/osf/utils/machines.py b/osf/utils/machines.py
index aadc8d7849a..b686afc6c43 100644
--- a/osf/utils/machines.py
+++ b/osf/utils/machines.py
@@ -179,7 +179,7 @@ def notify_withdraw(self, ev):
# If there is no preprint request action, it means the withdrawal is directly initiated by admin/moderator
context['force_withdrawal'] = True
- context['requester_fullname'] = requester.fullname
+ context['requester_fullname'] = self.machineable.creator.fullname
for contributor in self.machineable.contributors.all():
context['contributor_fullname'] = contributor.fullname
if context.get('requester_fullname', None):
diff --git a/osf_tests/test_collection.py b/osf_tests/test_collection.py
index 0e39c011f65..912a0e5ec93 100644
--- a/osf_tests/test_collection.py
+++ b/osf_tests/test_collection.py
@@ -133,7 +133,7 @@ def test_node_removed_from_collection_on_privacy_change_notify(self, auth, provi
with capture_notifications() as notifications:
provider_collected_node.set_privacy('private', auth=auth)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[0]['type'] == NotificationType.Type.COLLECTION_SUBMISSION_REMOVED_PRIVATE
@mock.patch('osf.models.node.Node.check_privacy_change_viability', mock.Mock()) # mocks the storage usage limits
def test_node_removed_from_collection_on_privacy_change_no_provider(self, auth, collected_node, bookmark_collection):
diff --git a/osf_tests/test_node.py b/osf_tests/test_node.py
index f00f822704a..3b04ceba292 100644
--- a/osf_tests/test_node.py
+++ b/osf_tests/test_node.py
@@ -34,7 +34,7 @@
NodeRelation,
Registration,
DraftRegistration,
- CollectionSubmission
+ CollectionSubmission, NotificationType
)
from addons.wiki.models import WikiPage, WikiVersion
@@ -42,6 +42,7 @@
from osf.exceptions import ValidationError, ValidationValueError, UserStateError
from osf.utils.workflows import DefaultStates, CollectionSubmissionStates
from framework.auth.core import Auth
+from tests.utils import capture_notifications
from osf_tests.factories import (
AuthUserFactory,
@@ -2125,23 +2126,25 @@ def test_set_privacy(self, node, auth):
assert node.logs.first().action == NodeLog.MADE_PRIVATE
assert last_logged_before_method_call != node.last_logged
- @mock.patch('osf.models.queued_mail.queue_mail')
- def test_set_privacy_sends_mail_default(self, mock_queue, node, auth):
- node.set_privacy('private', auth=auth)
- node.set_privacy('public', auth=auth)
- assert mock_queue.call_count == 1
-
- @mock.patch('osf.models.queued_mail.queue_mail')
- def test_set_privacy_sends_mail(self, mock_queue, node, auth):
- node.set_privacy('private', auth=auth)
- node.set_privacy('public', auth=auth, meeting_creation=False)
- assert mock_queue.call_count == 1
-
- @mock.patch('osf.models.queued_mail.queue_mail')
- def test_set_privacy_skips_mail_if_meeting(self, mock_queue, node, auth):
- node.set_privacy('private', auth=auth)
- node.set_privacy('public', auth=auth, meeting_creation=True)
- assert bool(mock_queue.called) is False
+ def test_set_privacy_sends_mail_default(self, node, auth):
+ with capture_notifications() as notifications:
+ node.set_privacy('private', auth=auth)
+ node.set_privacy('public', auth=auth)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+
+ def test_set_privacy_sends_mail(self, node, auth):
+ with capture_notifications() as notifications:
+ node.set_privacy('private', auth=auth)
+ node.set_privacy('public', auth=auth, meeting_creation=False)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+
+ def test_set_privacy_skips_mail_if_meeting(self, node, auth):
+ with capture_notifications() as notifications:
+ node.set_privacy('private', auth=auth)
+ node.set_privacy('public', auth=auth, meeting_creation=True)
+ assert not notifications
def test_set_privacy_can_not_cancel_pending_embargo_for_registration(self, node, user, auth):
registration = RegistrationFactory(project=node)
diff --git a/osf_tests/test_reviewable.py b/osf_tests/test_reviewable.py
index eb3783b71bc..08be5390d98 100644
--- a/osf_tests/test_reviewable.py
+++ b/osf_tests/test_reviewable.py
@@ -41,7 +41,6 @@ def test_reject_resubmission_sends_emails(self):
is_published=False
)
assert preprint.machine_state == DefaultStates.INITIAL.value
-
with capture_notifications() as notifications:
preprint.run_submit(user)
assert len(notifications) == 1
diff --git a/osf_tests/test_schema_responses.py b/osf_tests/test_schema_responses.py
index 7b6250f8f25..f3f831224c6 100644
--- a/osf_tests/test_schema_responses.py
+++ b/osf_tests/test_schema_responses.py
@@ -859,7 +859,7 @@ def test_accept_notification_sent_on_admin_approval(self, revised_response, admi
with capture_notifications() as notifications:
revised_response.approve(user=admin_user)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[0]['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED
def test_moderators_notified_on_admin_approval(self, revised_response, admin_user, moderator):
revised_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
@@ -869,7 +869,7 @@ def test_moderators_notified_on_admin_approval(self, revised_response, admin_use
with capture_notifications() as notifications:
revised_response.approve(user=admin_user)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[0]['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED
assert notifications[0]['kwargs']['user'] == moderator
def test_no_moderator_notification_on_admin_approval_of_initial_response(
diff --git a/osf_tests/test_user.py b/osf_tests/test_user.py
index 8a8a6f29d72..7025b5a3d2e 100644
--- a/osf_tests/test_user.py
+++ b/osf_tests/test_user.py
@@ -904,7 +904,7 @@ def test_set_password_notify_default(self, user):
user.save()
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PASSWORD_CHANGED
+ assert notifications[0]['type'] == NotificationType.Type.USER_PASSWORD_RESET
def test_set_password_no_notify(self, user):
old_password = 'password'
diff --git a/scripts/osfstorage/usage_audit.py b/scripts/osfstorage/usage_audit.py
index 8a8ffb6c1f1..c50e3f57640 100644
--- a/scripts/osfstorage/usage_audit.py
+++ b/scripts/osfstorage/usage_audit.py
@@ -25,6 +25,7 @@
from website.app import init_app
from website.settings.defaults import GBs
+from django.core.mail import send_mail
from scripts import utils as scripts_utils
# App must be init'd before django models are imported
@@ -110,7 +111,7 @@ def main(send_email=False):
if lines:
if send_email:
logger.info('Sending email...')
- mails.send_mail('support+scripts@osf.io', mails.EMPTY, body='\n'.join(lines), subject='Script: OsfStorage usage audit', can_change_preferences=False,)
+            send_mail(subject='Script: OsfStorage usage audit', message='\n'.join(lines), from_email=None, recipient_list=['support+scripts@osf.io'])
else:
logger.info(f'send_email is False, not sending email')
logger.info(f'{len(lines)} offending project(s) and user(s) found')
diff --git a/website/mails/mails.py b/website/mails/mails.py
index 8ab4ddcabd5..126f4ef8dfc 100644
--- a/website/mails/mails.py
+++ b/website/mails/mails.py
@@ -223,18 +223,6 @@ def get_english_article(word):
subject='Registration of ' + UNESCAPE + ' complete'
)
-WELCOME = Mail(
- 'welcome',
- subject='Welcome to OSF',
- engagement=True
-)
-
-WELCOME_OSF4I = Mail(
- 'welcome_osf4i',
- subject='Welcome to OSF',
- engagement=True
-)
-
DUPLICATE_ACCOUNTS_OSF4I = Mail(
'duplicate_accounts_sso_osf4i',
subject='Duplicate OSF Accounts'
diff --git a/website/settings/defaults.py b/website/settings/defaults.py
index 5d39c01ab90..a68414b6763 100644
--- a/website/settings/defaults.py
+++ b/website/settings/defaults.py
@@ -457,7 +457,6 @@ class CeleryConfig:
med_pri_modules = {
'framework.email.tasks',
- 'scripts.send_queued_mails',
'scripts.triggered_mails',
'website.mailchimp_utils',
'website.notifications.tasks',
@@ -567,7 +566,6 @@ class CeleryConfig:
'scripts.approve_registrations',
'scripts.approve_embargo_terminations',
'scripts.triggered_mails',
- 'scripts.send_queued_mails',
'scripts.generate_sitemap',
'scripts.premigrate_created_modified',
'scripts.add_missing_identifiers_to_preprints',
@@ -636,11 +634,6 @@ class CeleryConfig:
'schedule': crontab(minute=0, hour=5), # Daily 12 a.m
'kwargs': {'dry_run': False},
},
- 'send_queued_mails': {
- 'task': 'scripts.send_queued_mails',
- 'schedule': crontab(minute=0, hour=17), # Daily 12 p.m.
- 'kwargs': {'dry_run': False},
- },
'new-and-noteworthy': {
'task': 'scripts.populate_new_and_noteworthy_projects',
'schedule': crontab(minute=0, hour=7, day_of_week=6), # Saturday 2:00 a.m.
From b89846f28c69eda62e23d0feb045642db39ed837 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 29 Jul 2025 08:24:20 -0400
Subject: [PATCH 134/336] fix machine actions
---
osf/utils/machines.py | 4 +++-
tests/test_preprints.py | 4 ++--
2 files changed, 5 insertions(+), 3 deletions(-)
diff --git a/osf/utils/machines.py b/osf/utils/machines.py
index b686afc6c43..03f5da9b967 100644
--- a/osf/utils/machines.py
+++ b/osf/utils/machines.py
@@ -175,11 +175,13 @@ def notify_withdraw(self, ev):
trigger='accept'
)
requester = preprint_request_action.target.creator
+
except PreprintRequestAction.DoesNotExist:
# If there is no preprint request action, it means the withdrawal is directly initiated by admin/moderator
context['force_withdrawal'] = True
+ requester = self.machineable.creator
- context['requester_fullname'] = self.machineable.creator.fullname
+ context['requester_fullname'] = requester.fullname
for contributor in self.machineable.contributors.all():
context['contributor_fullname'] = contributor.fullname
if context.get('requester_fullname', None):
diff --git a/tests/test_preprints.py b/tests/test_preprints.py
index 9f16edc1e58..6f1eda5876b 100644
--- a/tests/test_preprints.py
+++ b/tests/test_preprints.py
@@ -1998,12 +1998,12 @@ def test_creator_gets_email(self):
with capture_notifications() as notifications:
self.preprint.set_published(True, auth=Auth(self.user), save=True)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_REVIEWS_SUBMISSION_CONFIRMATION
with capture_notifications() as notifications:
self.preprint_branded.set_published(True, auth=Auth(self.user), save=True)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_REVIEWS_SUBMISSION_CONFIRMATION
class TestPreprintOsfStorage(OsfTestCase):
From 8e9126aefb94439d90b27d60cc21030dd43996eb Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 29 Jul 2025 08:32:10 -0400
Subject: [PATCH 135/336] fix institution deactivation notifications
---
notifications.yaml | 6 ++++++
osf/models/institution.py | 29 ++++++++++++-----------------
osf_tests/test_institution.py | 6 +++---
website/mails/mails.py | 10 ----------
4 files changed, 21 insertions(+), 30 deletions(-)
diff --git a/notifications.yaml b/notifications.yaml
index c93f214a042..c296c9e41a4 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -61,6 +61,11 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/welcome_osf4i.html.mako'
+ - name: user_institution_deactivation
+ subject: "Your OSF login has changed - here's what you need to know!"
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/institution_deactivation.html.mako'
- name: user_invite_preprints_osf
__docs__: ...
object_content_type_model_name: osfuser
@@ -111,6 +116,7 @@ notification_types:
object_content_type_model_name: osfuser
template: 'website/templates/emails/request_deactivation_complete.html.mako'
- name: user_storage_cap_exceeded_announcement
+ subject: 'Action Required to avoid disruption to your OSF project'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/storage_cap_exceeded_announcement.html.mako'
diff --git a/osf/models/institution.py b/osf/models/institution.py
index 737233ca7b8..afb9c259a7e 100644
--- a/osf/models/institution.py
+++ b/osf/models/institution.py
@@ -7,7 +7,6 @@
from django.conf import settings as django_conf_settings
from django.contrib.postgres import fields
-from django.core.mail import send_mail
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver
@@ -15,6 +14,7 @@
from django.utils import timezone
from framework import sentry
+from osf.models.notification_type import NotificationType
from .base import BaseModel, ObjectIDMixin
from .contributor import InstitutionalContributor
from .institution_affiliation import InstitutionAffiliation
@@ -23,7 +23,6 @@
from .storage import InstitutionAssetFile
from .validators import validate_email
from osf.utils.fields import NonNaiveDateTimeField, LowercaseEmailField
-from website import mails
from website import settings as website_settings
logger = logging.getLogger(__name__)
@@ -220,21 +219,17 @@ def _send_deactivation_email(self):
attempts = 0
success = 0
for user in self.get_institution_users():
- try:
- attempts += 1
- send_mail(
- to_addr=user.username,
- mail=mails.INSTITUTION_DEACTIVATION,
- user=user,
- forgot_password_link=f'{website_settings.DOMAIN}{forgot_password}',
- osf_support_email=website_settings.OSF_SUPPORT_EMAIL
- )
- except Exception as e:
- logger.error(f'Failed to send institution deactivation email to user [{user._id}] at [{self._id}]')
- sentry.log_exception(e)
- continue
- else:
- success += 1
+ attempts += 1
+ NotificationType.objects.get(
+ name=NotificationType.Type.USER_INSTITUTION_DEACTIVATION
+ ).emit(
+ user=user,
+ event_context={
+ 'forgot_password_link': f'{website_settings.DOMAIN}{forgot_password}',
+ 'osf_support_email': website_settings.OSF_SUPPORT_EMAIL
+ }
+ )
+ success += 1
logger.info(f'Institution deactivation notification email has been '
f'sent to [{success}/{attempts}] users for [{self._id}]')
diff --git a/osf_tests/test_institution.py b/osf_tests/test_institution.py
index d4442ad8590..98ee5b0bfbb 100644
--- a/osf_tests/test_institution.py
+++ b/osf_tests/test_institution.py
@@ -157,8 +157,8 @@ def test_send_deactivation_email_call_count(self):
with capture_notifications() as notifications:
institution._send_deactivation_email()
assert len(notifications) == 2
- assert notifications[0]['type'] == NotificationType.Type.NODE_REQUEST_ACCESS_DENIED
- assert notifications[1]['type'] == NotificationType.Type.NODE_REQUEST_ACCESS_DENIED
+ assert notifications[0]['type'] == NotificationType.Type.USER_INSTITUTION_DEACTIVATION
+ assert notifications[1]['type'] == NotificationType.Type.USER_INSTITUTION_DEACTIVATION
def test_send_deactivation_email_call_args(self):
institution = InstitutionFactory()
@@ -168,7 +168,7 @@ def test_send_deactivation_email_call_args(self):
with capture_notifications() as notifications:
institution._send_deactivation_email()
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_REQUEST_ACCESS_DENIED
+ assert notifications[0]['type'] == NotificationType.Type.USER_INSTITUTION_DEACTIVATION
def test_deactivate_inactive_institution_noop(self):
institution = InstitutionFactory()
diff --git a/website/mails/mails.py b/website/mails/mails.py
index 126f4ef8dfc..033f23fc819 100644
--- a/website/mails/mails.py
+++ b/website/mails/mails.py
@@ -280,16 +280,6 @@ def get_english_article(word):
subject='Updated Terms of Use for COS Websites and Services',
)
-STORAGE_CAP_EXCEEDED_ANNOUNCEMENT = Mail(
- 'storage_cap_exceeded_announcement',
- subject='Action Required to avoid disruption to your OSF project',
-)
-
-INSTITUTION_DEACTIVATION = Mail(
- 'institution_deactivation',
- subject='Your OSF login has changed - here\'s what you need to know!'
-)
-
REGISTRATION_BULK_UPLOAD_PRODUCT_OWNER = Mail(
'registration_bulk_upload_product_owner',
subject='Registry Could Not Bulk Upload Registrations'
From df19159a8199e100a7b91ac3768cdcc9cae99264 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 29 Jul 2025 09:10:21 -0400
Subject: [PATCH 136/336] fix preprint moderation
---
.../views/test_preprint_contributors_list.py | 8 ++++----
.../registrations/views/test_registration_detail.py | 5 +----
api_tests/users/views/test_user_settings.py | 2 +-
.../users/views/test_user_settings_reset_password.py | 2 +-
osf/models/mixins.py | 9 ++++++---
osf/models/notification_subscription.py | 11 +++--------
osf/models/notification_type.py | 5 ++---
website/notifications/utils.py | 2 +-
8 files changed, 19 insertions(+), 25 deletions(-)
diff --git a/api_tests/preprints/views/test_preprint_contributors_list.py b/api_tests/preprints/views/test_preprint_contributors_list.py
index a719589563c..4dbbea685f9 100644
--- a/api_tests/preprints/views/test_preprint_contributors_list.py
+++ b/api_tests/preprints/views/test_preprint_contributors_list.py
@@ -1421,7 +1421,7 @@ def test_add_contributor_signal_if_preprint(
)
assert res.status_code == 201
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_CONTRIBUTOR_ADDED_OSF_PREPRINT
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONTRIBUTOR_ADDED_PREPRINT
def test_add_unregistered_contributor_sends_email(
self, app, user, url_preprint_contribs):
@@ -1440,7 +1440,7 @@ def test_add_unregistered_contributor_sends_email(
auth=user.auth
)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_CONTRIBUTOR_ADDED_OSF_PREPRINT
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONTRIBUTOR_ADDED_PREPRINT
assert res.status_code == 201
def test_add_unregistered_contributor_signal_if_preprint(self, app, user, url_preprint_contribs):
@@ -1460,7 +1460,7 @@ def test_add_unregistered_contributor_signal_if_preprint(self, app, user, url_pr
)
assert res.status_code == 201
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_CONTRIBUTOR_ADDED_OSF_PREPRINT
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONTRIBUTOR_ADDED_PREPRINT
def test_add_contributor_invalid_send_email_param(self, app, user, url_preprint_contribs):
url = f'{url_preprint_contribs}?send_email=true'
@@ -1541,7 +1541,7 @@ def test_contributor_added_signal_not_specified(self, app, user, url_preprint_co
)
assert res.status_code == 201
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_CONTRIBUTOR_ADDED_OSF_PREPRINT
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONTRIBUTOR_ADDED_PREPRINT
def test_contributor_added_not_sent_if_unpublished(
self, app, user, preprint_unpublished):
diff --git a/api_tests/registrations/views/test_registration_detail.py b/api_tests/registrations/views/test_registration_detail.py
index 1be2d14c3be..04aba5ac394 100644
--- a/api_tests/registrations/views/test_registration_detail.py
+++ b/api_tests/registrations/views/test_registration_detail.py
@@ -752,10 +752,7 @@ def test_initiate_withdraw_registration_fails(
assert res.status_code == 400
def test_initiate_withdrawal_success(self, app, user, public_registration, public_url, public_payload):
- with capture_notifications() as notifications:
- res = app.put_json_api(public_url, public_payload, auth=user.auth)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.USER_REVIEWS
+ res = app.put_json_api(public_url, public_payload, auth=user.auth)
assert res.status_code == 200
assert res.json['data']['attributes']['pending_withdrawal'] is True
public_registration.refresh_from_db()
diff --git a/api_tests/users/views/test_user_settings.py b/api_tests/users/views/test_user_settings.py
index 530b8455c3c..927b7892d71 100644
--- a/api_tests/users/views/test_user_settings.py
+++ b/api_tests/users/views/test_user_settings.py
@@ -60,7 +60,7 @@ def test_post(self, app, user_one, user_two, url, payload):
with capture_notifications() as notification:
res = app.post_json_api(url, payload, auth=user_one.auth)
assert len(notification) == 1
- assert notification[0]['type'] == NotificationType.Type.USER_REQUEST_EXPORT
+ assert notification[0]['type'] == NotificationType.Type.DESK_REQUEST_EXPORT
assert res.status_code == 204
user_one.reload()
assert user_one.email_last_sent is not None
diff --git a/api_tests/users/views/test_user_settings_reset_password.py b/api_tests/users/views/test_user_settings_reset_password.py
index 2a9c0e272af..d69eb87a692 100644
--- a/api_tests/users/views/test_user_settings_reset_password.py
+++ b/api_tests/users/views/test_user_settings_reset_password.py
@@ -36,7 +36,7 @@ def test_get(self, app, url, user_one):
with capture_notifications() as notification:
res = app.get(url)
assert len(notification) == 1
- assert notification[0]['type'] == NotificationType.Type.USER_PASSWORD_RESET
+ assert notification[0]['type'] == NotificationType.Type.USER_FORGOT_PASSWORD
assert res.status_code == 200
user_one.reload()
diff --git a/osf/models/mixins.py b/osf/models/mixins.py
index 9405574970d..d224c61ac7c 100644
--- a/osf/models/mixins.py
+++ b/osf/models/mixins.py
@@ -1102,12 +1102,15 @@ def add_user_to_subscription(self, user, subscription):
)
def remove_user_from_subscription(self, user, subscription):
+ notification_type = NotificationType.objects.get(
+ name=subscription,
+ )
subscriptions = NotificationSubscription.objects.filter(
- user=user,
- notification_type=NotificationType.objects.get(name=subscription),
+ notification_type=notification_type,
+ user=user
)
if subscriptions:
- subscriptions.get().remove_user_from_subscription(user)
+ subscriptions.get().remove_user_from_subscription()
class TaxonomizableMixin(models.Model):
diff --git a/osf/models/notification_subscription.py b/osf/models/notification_subscription.py
index 665c67029ff..7dc79047a13 100644
--- a/osf/models/notification_subscription.py
+++ b/osf/models/notification_subscription.py
@@ -2,7 +2,7 @@
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.core.exceptions import ValidationError
-from osf.models.notification_type import get_default_frequency_choices, FrequencyChoices
+from osf.models.notification_type import get_default_frequency_choices
from osf.models.notification import Notification
from .base import BaseModel
@@ -100,12 +100,7 @@ def _id(self):
case _:
raise NotImplementedError()
- def remove_user_from_subscription(self, user):
+ def remove_user_from_subscription(self):
"""
"""
- from osf.models.notification_subscription import NotificationSubscription
- notification, _ = NotificationSubscription.objects.update_or_create(
- user=user,
- notification_type=self,
- defaults={'message_frequency': FrequencyChoices.NONE.value}
- )
+ self.delete()
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 7e7bf72fd6e..6cbb3f1d2df 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -261,11 +261,10 @@ def remove_user_from_subscription(self, user):
"""
"""
from osf.models.notification_subscription import NotificationSubscription
- notification, _ = NotificationSubscription.objects.update_or_create(
+ notification, _ = NotificationSubscription.objects.filter(
user=user,
notification_type=self,
- defaults={'message_frequency': FrequencyChoices.NONE.value}
- )
+ ).delete()
def __str__(self) -> str:
return self.name
diff --git a/website/notifications/utils.py b/website/notifications/utils.py
index b86792f348a..7ccfcf88ede 100644
--- a/website/notifications/utils.py
+++ b/website/notifications/utils.py
@@ -85,7 +85,7 @@ def remove_contributor_from_subscriptions(node, user):
)
for subscription in node_subscriptions:
- subscription.remove_user_from_subscription(user)
+ subscription.remove_user_from_subscription()
@signals.node_deleted.connect
From ddd1cbb2cf91c3bc97556c747c41bdb8675db7f1 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 29 Jul 2025 10:51:21 -0400
Subject: [PATCH 137/336] update node contributor view to include preprints as
resource
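
NodeContributorsCreateSerializer is renamed to ResourceContributorsCreateSerializer
and its create()/validate_data() now operate on the generic `resource` pulled from
the serializer context, so nodes, preprints, registrations and draft registrations
share one create path. Contributor-added emails for preprints are only queued when
the preprint is published, and the preprint tests now expect
PREPRINT_CONTRIBUTOR_ADDED_DEFAULT rather than the provider-level type.

A minimal sketch of the selection logic, condensed from the serializer hunk below
(the helper name pick_notification_type is illustrative only; the real code inlines
this inside create()):

    from osf.models import NotificationType

    def pick_notification_type(resource, email_preference, contributor, email):
        # Unpublished preprints suppress the contributor-added email entirely.
        is_published = getattr(resource, 'is_published', False)
        if not (email or (contributor and contributor.is_registered)):
            return False
        return {
            'false': False,
            'default': NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
            'draft_registration': NotificationType.Type.DRAFT_REGISTRATION_CONTRIBUTOR_ADDED_DEFAULT,
            'preprint': NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT if is_published else False,
        }[email_preference]
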
---
api/draft_registrations/serializers.py | 4 +-
api/nodes/serializers.py | 19 ++++---
api/nodes/views.py | 4 +-
api/preprints/serializers.py | 4 +-
api/registrations/serializers.py | 4 +-
.../views/test_node_contributors_list.py | 4 +-
.../views/test_preprint_contributors_list.py | 53 ++++++++++---------
osf/models/mixins.py | 5 +-
osf/models/registrations.py | 5 +-
9 files changed, 54 insertions(+), 48 deletions(-)
diff --git a/api/draft_registrations/serializers.py b/api/draft_registrations/serializers.py
index 84d0c48423c..6975393eb02 100644
--- a/api/draft_registrations/serializers.py
+++ b/api/draft_registrations/serializers.py
@@ -13,7 +13,7 @@
NodeLicenseSerializer,
NodeLicenseRelationshipField,
NodeContributorsSerializer,
- NodeContributorsCreateSerializer,
+ ResourceContributorsCreateSerializer,
NodeContributorDetailSerializer,
RegistrationSchemaRelationshipField,
)
@@ -233,7 +233,7 @@ def get_absolute_url(self, obj):
)
-class DraftRegistrationContributorsCreateSerializer(NodeContributorsCreateSerializer, DraftRegistrationContributorsSerializer):
+class DraftRegistrationContributorsCreateSerializer(ResourceContributorsCreateSerializer, DraftRegistrationContributorsSerializer):
"""
Overrides DraftRegistrationContributorsSerializer to add email, full_name, send_email, and non-required index and users field.
diff --git a/api/nodes/serializers.py b/api/nodes/serializers.py
index 80cc400b73d..c52787f2a6c 100644
--- a/api/nodes/serializers.py
+++ b/api/nodes/serializers.py
@@ -1206,7 +1206,7 @@ def get_unregistered_contributor(self, obj):
return unclaimed_records.get('name', None)
-class NodeContributorsCreateSerializer(NodeContributorsSerializer):
+class ResourceContributorsCreateSerializer(NodeContributorsSerializer):
"""
Overrides NodeContributorsSerializer to add email, full_name, send_email, and non-required index and users field.
"""
@@ -1228,13 +1228,13 @@ class NodeContributorsCreateSerializer(NodeContributorsSerializer):
def get_proposed_permissions(self, validated_data):
return validated_data.get('permission') or osf_permissions.DEFAULT_CONTRIBUTOR_PERMISSIONS
- def validate_data(self, node, user_id=None, full_name=None, email=None, index=None):
+ def validate_data(self, resource, user_id=None, full_name=None, email=None, index=None):
if not user_id and not full_name:
raise exceptions.ValidationError(detail='A user ID or full name must be provided to add a contributor.')
if user_id and email:
raise exceptions.ValidationError(detail='Do not provide an email when providing this user_id.')
- if index is not None and index > len(node.contributors):
- raise exceptions.ValidationError(detail=f'{index} is not a valid contributor index for node with id {node._id}')
+ if index is not None and index > len(resource.contributors):
+ raise exceptions.ValidationError(detail=f'{index} is not a valid contributor index for node with id {resource._id}')
def create(self, validated_data):
id = validated_data.get('_id')
@@ -1242,7 +1242,7 @@ def create(self, validated_data):
index = None
if '_order' in validated_data:
index = validated_data.pop('_order')
- node = self.context['resource']
+ resource = self.context['resource']
auth = Auth(self.context['request'].user)
full_name = validated_data.get('full_name')
bibliographic = validated_data.get('bibliographic')
@@ -1250,29 +1250,29 @@ def create(self, validated_data):
permissions = self.get_proposed_permissions(validated_data)
self.validate_data(
- node,
+ resource,
user_id=id,
full_name=full_name,
email=email,
index=index,
)
-
if email_preference not in self.email_preferences:
raise exceptions.ValidationError(detail=f'{email_preference} is not a valid email preference.')
contributor = OSFUser.load(id)
if email or (contributor and contributor.is_registered):
+ is_published = getattr(resource, 'is_published', False)
notification_type = {
'false': False,
'default': NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
'draft_registration': NotificationType.Type.DRAFT_REGISTRATION_CONTRIBUTOR_ADDED_DEFAULT,
- 'preprint': NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT,
+ 'preprint': NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT if is_published else False,
}[email_preference]
else:
notification_type = False
try:
- contributor_obj = node.add_contributor_registered_or_not(
+ contributor_obj = resource.add_contributor_registered_or_not(
auth=auth,
user_id=id,
email=email,
@@ -1288,7 +1288,6 @@ def create(self, validated_data):
raise exceptions.NotFound(detail=e.args[0])
return contributor_obj
-
class NodeContributorDetailSerializer(NodeContributorsSerializer):
"""
Overrides node contributor serializer to add additional methods
diff --git a/api/nodes/views.py b/api/nodes/views.py
index 50ba08cb7fe..a105634ca82 100644
--- a/api/nodes/views.py
+++ b/api/nodes/views.py
@@ -110,7 +110,7 @@
NodeContributorsSerializer,
NodeContributorDetailSerializer,
NodeInstitutionsRelationshipSerializer,
- NodeContributorsCreateSerializer,
+ ResourceContributorsCreateSerializer,
NodeViewOnlyLinkSerializer,
NodeViewOnlyLinkUpdateSerializer,
NodeSettingsSerializer,
@@ -442,7 +442,7 @@ def get_serializer_class(self):
if self.request.method == 'PUT' or self.request.method == 'PATCH' or self.request.method == 'DELETE':
return NodeContributorDetailSerializer
elif self.request.method == 'POST':
- return NodeContributorsCreateSerializer
+ return ResourceContributorsCreateSerializer
else:
return NodeContributorsSerializer
diff --git a/api/preprints/serializers.py b/api/preprints/serializers.py
index c0e867510a5..6455dfb8328 100644
--- a/api/preprints/serializers.py
+++ b/api/preprints/serializers.py
@@ -29,7 +29,7 @@
NodeLicenseSerializer,
NodeContributorsSerializer,
NodeStorageProviderSerializer,
- NodeContributorsCreateSerializer,
+ ResourceContributorsCreateSerializer,
NodeContributorDetailSerializer,
get_license_details,
NodeTagField,
@@ -588,7 +588,7 @@ def get_absolute_url(self, obj):
)
-class PreprintContributorsCreateSerializer(NodeContributorsCreateSerializer, PreprintContributorsSerializer):
+class PreprintContributorsCreateSerializer(ResourceContributorsCreateSerializer, PreprintContributorsSerializer):
"""
Overrides PreprintContributorsSerializer to add email, full_name, send_email, and non-required index and users field.
diff --git a/api/registrations/serializers.py b/api/registrations/serializers.py
index 786d76ddccb..10a96a3735b 100644
--- a/api/registrations/serializers.py
+++ b/api/registrations/serializers.py
@@ -24,7 +24,7 @@
NodeLinksSerializer,
NodeLicenseSerializer,
NodeContributorDetailSerializer,
- NodeContributorsCreateSerializer,
+ ResourceContributorsCreateSerializer,
RegistrationProviderRelationshipField,
get_license_details,
)
@@ -934,7 +934,7 @@ def update(self, instance, validated_data):
)
-class RegistrationContributorsCreateSerializer(NodeContributorsCreateSerializer, RegistrationContributorsSerializer):
+class RegistrationContributorsCreateSerializer(ResourceContributorsCreateSerializer, RegistrationContributorsSerializer):
"""
Overrides RegistrationContributorsSerializer to add email, full_name, send_email, and non-required index and users field.
diff --git a/api_tests/nodes/views/test_node_contributors_list.py b/api_tests/nodes/views/test_node_contributors_list.py
index 6983307b1fc..9a85bfddad2 100644
--- a/api_tests/nodes/views/test_node_contributors_list.py
+++ b/api_tests/nodes/views/test_node_contributors_list.py
@@ -4,7 +4,7 @@
import random
from api.base.settings.defaults import API_BASE
-from api.nodes.serializers import NodeContributorsCreateSerializer
+from api.nodes.serializers import ResourceContributorsCreateSerializer
from framework.auth.core import Auth
from osf.models.notification_type import NotificationType
from osf_tests.factories import (
@@ -1153,7 +1153,7 @@ class TestNodeContributorCreateValidation(NodeCRUDTestCase):
@pytest.fixture()
def create_serializer(self):
- return NodeContributorsCreateSerializer
+ return ResourceContributorsCreateSerializer
@pytest.fixture()
def validate_data(self, create_serializer):
diff --git a/api_tests/preprints/views/test_preprint_contributors_list.py b/api_tests/preprints/views/test_preprint_contributors_list.py
index 4dbbea685f9..26716899a61 100644
--- a/api_tests/preprints/views/test_preprint_contributors_list.py
+++ b/api_tests/preprints/views/test_preprint_contributors_list.py
@@ -5,7 +5,7 @@
from django.utils import timezone
from api.base.settings.defaults import API_BASE
-from api.nodes.serializers import NodeContributorsCreateSerializer
+from api.nodes.serializers import ResourceContributorsCreateSerializer
from framework.auth.core import Auth
from osf.models import PreprintLog, NotificationType
from osf_tests.factories import (
@@ -1294,19 +1294,19 @@ class TestPreprintContributorCreateValidation(NodeCRUDTestCase):
@pytest.fixture()
def validate_data(self):
- return NodeContributorsCreateSerializer.validate_data
+ return ResourceContributorsCreateSerializer.validate_data
def test_add_contributor_validation(self, preprint_published, validate_data):
# test_add_contributor_validation_user_id
validate_data(
- NodeContributorsCreateSerializer(),
+ ResourceContributorsCreateSerializer(),
preprint_published,
user_id='abcde')
# test_add_contributor_validation_user_id_fullname
validate_data(
- NodeContributorsCreateSerializer(),
+ ResourceContributorsCreateSerializer(),
preprint_published,
user_id='abcde',
full_name='Kanye')
@@ -1314,7 +1314,7 @@ def test_add_contributor_validation(self, preprint_published, validate_data):
# test_add_contributor_validation_user_id_email
with pytest.raises(exceptions.ValidationError):
validate_data(
- NodeContributorsCreateSerializer(),
+ ResourceContributorsCreateSerializer(),
preprint_published,
user_id='abcde',
email='kanye@west.com')
@@ -1322,7 +1322,7 @@ def test_add_contributor_validation(self, preprint_published, validate_data):
# test_add_contributor_validation_user_id_fullname_email
with pytest.raises(exceptions.ValidationError):
validate_data(
- NodeContributorsCreateSerializer(),
+ ResourceContributorsCreateSerializer(),
preprint_published,
user_id='abcde',
full_name='Kanye',
@@ -1330,20 +1330,20 @@ def test_add_contributor_validation(self, preprint_published, validate_data):
# test_add_contributor_validation_fullname
validate_data(
- NodeContributorsCreateSerializer(),
+ ResourceContributorsCreateSerializer(),
preprint_published,
full_name='Kanye')
# test_add_contributor_validation_email
with pytest.raises(exceptions.ValidationError):
validate_data(
- NodeContributorsCreateSerializer(),
+ ResourceContributorsCreateSerializer(),
preprint_published,
email='kanye@west.com')
# test_add_contributor_validation_fullname_email
validate_data(
- NodeContributorsCreateSerializer(),
+ ResourceContributorsCreateSerializer(),
preprint_published,
full_name='Kanye',
email='kanye@west.com')
@@ -1421,7 +1421,7 @@ def test_add_contributor_signal_if_preprint(
)
assert res.status_code == 201
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONTRIBUTOR_ADDED_PREPRINT
+ assert notifications[0]['type'] == NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT
def test_add_unregistered_contributor_sends_email(
self, app, user, url_preprint_contribs):
@@ -1440,7 +1440,7 @@ def test_add_unregistered_contributor_sends_email(
auth=user.auth
)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONTRIBUTOR_ADDED_PREPRINT
+ assert notifications[0]['type'] == NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT
assert res.status_code == 201
def test_add_unregistered_contributor_signal_if_preprint(self, app, user, url_preprint_contribs):
@@ -1460,7 +1460,7 @@ def test_add_unregistered_contributor_signal_if_preprint(self, app, user, url_pr
)
assert res.status_code == 201
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONTRIBUTOR_ADDED_PREPRINT
+ assert notifications[0]['type'] == NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT
def test_add_contributor_invalid_send_email_param(self, app, user, url_preprint_contribs):
url = f'{url_preprint_contribs}?send_email=true'
@@ -1541,22 +1541,23 @@ def test_contributor_added_signal_not_specified(self, app, user, url_preprint_co
)
assert res.status_code == 201
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_CONTRIBUTOR_ADDED_PREPRINT
+ assert notifications[0]['type'] == NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT
- def test_contributor_added_not_sent_if_unpublished(
- self, app, user, preprint_unpublished):
- url = f'/{API_BASE}preprints/{preprint_unpublished._id}/contributors/?send_email=preprint'
- payload = {
- 'data': {
- 'type': 'contributors',
- 'attributes': {
- 'full_name': 'Kanye West',
- 'email': 'kanye@west.com'
- }
- }
- }
+ def test_contributor_added_not_sent_if_unpublished(self, app, user, preprint_unpublished):
with capture_notifications() as notifications:
- res = app.post_json_api(url, payload, auth=user.auth)
+ res = app.post_json_api(
+ f'/{API_BASE}preprints/{preprint_unpublished._id}/contributors/?send_email=preprint',
+ {
+ 'data': {
+ 'type': 'contributors',
+ 'attributes': {
+ 'full_name': 'Jalen Hurt',
+ 'email': 'one@eagles.com'
+ }
+ }
+ },
+ auth=user.auth
+ )
assert not notifications
assert res.status_code == 201
diff --git a/osf/models/mixins.py b/osf/models/mixins.py
index d224c61ac7c..0dc9f1c1361 100644
--- a/osf/models/mixins.py
+++ b/osf/models/mixins.py
@@ -1428,7 +1428,10 @@ def add_contributor(
if isinstance(self, AbstractNode):
notification_type = NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
elif isinstance(self, Preprint):
- notification_type = NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT
+ if self.is_published:
+ notification_type = NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT
+ else:
+ notification_type = False
elif isinstance(self, DraftRegistration):
notification_type = NotificationType.Type.DRAFT_REGISTRATION_CONTRIBUTOR_ADDED_DEFAULT
diff --git a/osf/models/registrations.py b/osf/models/registrations.py
index 1ef8689643f..5663ccae063 100644
--- a/osf/models/registrations.py
+++ b/osf/models/registrations.py
@@ -653,7 +653,10 @@ def retract_registration(self, user, justification=None, save=True, moderator_in
f'User {user} does not have moderator privileges on Provider {self.provider}')
retraction = self._initiate_retraction(
- user, justification, moderator_initiated=moderator_initiated)
+ user,
+ justification,
+ moderator_initiated=moderator_initiated
+ )
self.retraction = retraction
self.registered_from.add_log(
action=NodeLog.RETRACTION_INITIATED,
From c1c12bd07adbb7498021f916bd4eca1a34688098 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 29 Jul 2025 13:24:36 -0400
Subject: [PATCH 138/336] clean up file notification events
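
File events stop going through website.notifications.emails: the event base class
emits a NotificationType directly, the node_file_*/node_addon_file_* types are
renamed to resource-agnostic file_*/addon_file_* names (plus a new folder_created
type), and file_updated signal receivers read the action from payload['action']
instead of a separate event_type argument. The old notify/store_emails/
compile_subscriptions helpers and website/notifications/events/utils.py are removed.

Roughly, the new emit path (condensed from the base.py hunk below, not a verbatim
copy; the class name FileEvent stands in for the event base class):

    from osf.models import NotificationType

    class FileEvent:
        def perform(self):
            # Look the notification type up by the raw waterbutler action
            # ('file_added', 'addon_file_moved', ...) and emit it directly.
            NotificationType.objects.get(name=self.action).emit(
                user=self.user,
                event_context={
                    'profile_image_url': self.profile_image_url,
                    'action': self.action,
                },
            )

    # Signal receivers now derive the event type themselves, e.g.:
    #     event_type = payload['action']
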
---
addons/base/views.py | 34 +++---
notifications.yaml | 65 ++++++-----
osf/models/notification_type.py | 16 +--
tests/test_events.py | 154 ++++++++++----------------
website/mails/mails.py | 10 --
website/notifications/emails.py | 104 +----------------
website/notifications/events/base.py | 17 +--
website/notifications/events/files.py | 6 +-
website/notifications/events/utils.py | 141 -----------------------
website/notifications/utils.py | 8 +-
website/project/views/comment.py | 3 +-
11 files changed, 137 insertions(+), 421 deletions(-)
delete mode 100644 website/notifications/events/utils.py
diff --git a/addons/base/views.py b/addons/base/views.py
index 4547112e44b..4c3d01bacdf 100644
--- a/addons/base/views.py
+++ b/addons/base/views.py
@@ -611,25 +611,24 @@ def create_waterbutler_log(payload, **kwargs):
file_signals.file_updated.send(
target=node,
user=user,
- event_type=action,
payload=payload
)
- match f'node_{action}':
- case NotificationType.Type.NODE_FILE_ADDED:
- notification = NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_ADDED)
- case NotificationType.Type.NODE_FILE_REMOVED:
- notification = NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_REMOVED)
- case NotificationType.Type.NODE_FILE_UPDATED:
- notification = NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_UPDATED)
- case NotificationType.Type.NODE_ADDON_FILE_RENAMED:
- notification = NotificationType.objects.get(name=NotificationType.Type.NODE_ADDON_FILE_RENAMED)
- case NotificationType.Type.NODE_ADDON_FILE_COPIED:
- notification = NotificationType.objects.get(name=NotificationType.Type.NODE_ADDON_FILE_COPIED)
- case NotificationType.Type.NODE_ADDON_FILE_REMOVED:
- notification = NotificationType.objects.get(name=NotificationType.Type.NODE_ADDON_FILE_REMOVED)
- case NotificationType.Type.NODE_ADDON_FILE_MOVED:
- notification = NotificationType.objects.get(name=NotificationType.Type.NODE_ADDON_FILE_MOVED)
+ match action:
+ case NotificationType.Type.FILE_ADDED:
+ notification = NotificationType.objects.get(name=NotificationType.Type.FILE_ADDED)
+ case NotificationType.Type.FILE_REMOVED:
+ notification = NotificationType.objects.get(name=NotificationType.Type.FILE_REMOVED)
+ case NotificationType.Type.FILE_UPDATED:
+ notification = NotificationType.objects.get(name=NotificationType.Type.FILE_UPDATED)
+ case NotificationType.Type.ADDON_FILE_RENAMED:
+ notification = NotificationType.objects.get(name=NotificationType.Type.ADDON_FILE_RENAMED)
+ case NotificationType.Type.ADDON_FILE_COPIED:
+ notification = NotificationType.objects.get(name=NotificationType.Type.ADDON_FILE_COPIED)
+ case NotificationType.Type.ADDON_FILE_REMOVED:
+ notification = NotificationType.objects.get(name=NotificationType.Type.ADDON_FILE_REMOVED)
+ case NotificationType.Type.ADDON_FILE_MOVED:
+ notification = NotificationType.objects.get(name=NotificationType.Type.ADDON_FILE_MOVED)
case _:
raise NotImplementedError(f'action {action} not implemented')
@@ -647,12 +646,13 @@ def create_waterbutler_log(payload, **kwargs):
@file_signals.file_updated.connect
-def addon_delete_file_node(self, target, user, event_type, payload):
+def addon_delete_file_node(self, target, user, payload):
""" Get addon BaseFileNode(s), move it into the TrashedFileNode collection
and remove it from StoredFileNode.
Required so that the guids of deleted addon files are not re-pointed when an
addon file or folder is moved or renamed.
"""
+ event_type = payload['action']
if event_type == 'file_removed' and payload.get('provider', None) != 'osfstorage':
provider = payload['provider']
path = payload['metadata']['path']
diff --git a/notifications.yaml b/notifications.yaml
index c296c9e41a4..62f636b8546 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -243,7 +243,7 @@ notification_types:
template: 'website/templates/emails/reviews_resubmission_confirmation.html.mako'
#### NODE
- - name: node_file_updated
+ - name: node_wiki_updated
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/file_updated.html.mako'
@@ -251,34 +251,6 @@ notification_types:
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/file_updated.html.mako'
- - name: node_file_added
- __docs__: ...
- object_content_type_model_name: abstractnode
- template: 'website/templates/emails/file_updated.html.mako'
- - name: node_file_removed
- __docs__: ...
- object_content_type_model_name: abstractnode
- template: 'website/templates/emails/file_updated.html.mako'
- - name: node_addon_file_renamed
- __docs__: ...
- object_content_type_model_name: abstractnode
- template: 'website/templates/emails/file_updated.html.mako'
- - name: node_addon_file_copied
- __docs__: ...
- object_content_type_model_name: abstractnode
- template: 'website/templates/emails/file_updated.html.mako'
- - name: node_addon_file_moved
- __docs__: ...
- object_content_type_model_name: abstractnode
- template: 'website/templates/emails/file_updated.html.mako'
- - name: node_addon_file_removed
- __docs__: ...
- object_content_type_model_name: abstractnode
- template: 'website/templates/emails/file_updated.html.mako'
- - name: node_wiki_updated
- __docs__: ...
- object_content_type_model_name: abstractnode
- template: 'website/templates/emails/file_updated.html.mako'
- name: node_institutional_access_request
__docs__: ...
object_content_type_model_name: abstractnode
@@ -308,10 +280,12 @@ notification_types:
object_content_type_model_name: abstractnode
template: 'website/templates/emails/project_affiliation_changed.html.mako'
- name: node_request_access_denied
+ subject: 'Your access request to an OSF project has been declined'
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/access_request_rejected.html.mako'
- name: node_access_request_submitted
+ subject: 'An OSF user has requested access to your ${node.project_or_component}'
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/access_request_submitted.html.mako'
@@ -446,3 +420,36 @@ notification_types:
__docs__: ...
object_content_type_model_name: draftregistration
template: 'website/templates/emails/contributor_added_draft_registration.html.mako'
+### Files
+ - name: file_updated
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/file_updated.html.mako'
+ - name: file_added
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/file_updated.html.mako'
+ - name: file_removed
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/file_updated.html.mako'
+ - name: addon_file_renamed
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/file_updated.html.mako'
+ - name: addon_file_copied
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/file_updated.html.mako'
+ - name: addon_file_moved
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/file_updated.html.mako'
+ - name: addon_file_removed
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/file_updated.html.mako'
+ - name: folder_created
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/file_updated.html.mako'
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 6cbb3f1d2df..55fe70883df 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -98,18 +98,20 @@ class Type(str, Enum):
NODE_PENDING_REGISTRATION_ADMIN = 'node_pending_registration_admin'
NODE_PENDING_EMBARGO_TERMINATION_NON_ADMIN = 'node_pending_embargo_termination_non_admin'
NODE_PENDING_EMBARGO_TERMINATION_ADMIN = 'node_pending_embargo_termination_admin'
- NODE_FILE_UPDATED = 'node_file_updated'
- NODE_FILE_ADDED = 'node_file_added'
- NODE_FILE_REMOVED = 'node_file_removed'
- NODE_ADDON_FILE_COPIED = 'node_addon_file_copied'
- NODE_ADDON_FILE_RENAMED = 'node_addon_file_renamed'
- NODE_ADDON_FILE_MOVED = 'node_addon_file_moved'
- NODE_ADDON_FILE_REMOVED = 'node_addon_file_removed'
NODE_SCHEMA_RESPONSE_REJECTED = 'node_schema_response_rejected'
NODE_SCHEMA_RESPONSE_APPROVED = 'node_schema_response_approved'
NODE_SCHEMA_RESPONSE_SUBMITTED = 'node_schema_response_submitted'
NODE_SCHEMA_RESPONSE_INITIATED = 'node_schema_response_initiated'
+ FILE_UPDATED = 'file_updated'
+ FILE_ADDED = 'file_added'
+ FILE_REMOVED = 'file_removed'
+ ADDON_FILE_COPIED = 'addon_file_copied'
+ ADDON_FILE_RENAMED = 'addon_file_renamed'
+ ADDON_FILE_MOVED = 'addon_file_moved'
+ ADDON_FILE_REMOVED = 'addon_file_removed'
+ FOLDER_CREATED = 'folder_created'
+
# Provider notifications
PROVIDER_NEW_PENDING_SUBMISSIONS = 'provider_new_pending_submissions'
PROVIDER_NEW_PENDING_WITHDRAW_REQUESTS = 'provider_new_pending_withdraw_requests'
diff --git a/tests/test_events.py b/tests/test_events.py
index bd79036b384..cef8987f113 100644
--- a/tests/test_events.py
+++ b/tests/test_events.py
@@ -11,7 +11,6 @@
FileAdded, FileRemoved, FolderCreated, FileUpdated,
AddonFileCopied, AddonFileMoved, AddonFileRenamed,
)
-from website.notifications.events import utils
from addons.base import signals
from framework.auth import Auth
from osf_tests import factories
@@ -58,9 +57,6 @@ def setUp(self):
]
}
- def test_list_of_files(self):
- assert ['e', 'f', 'c', 'd'] == utils.list_of_files(self.tree)
-
class TestEventExists(OsfTestCase):
# Add all possible called events here to ensure that the Event class can
@@ -112,21 +108,6 @@ def test_get_file_renamed(self):
assert isinstance(event, AddonFileRenamed)
-class TestSignalEvent(OsfTestCase):
- def setUp(self):
- super().setUp()
- self.user = factories.UserFactory()
- self.auth = Auth(user=self.user)
- self.node = factories.ProjectFactory(creator=self.user)
-
- @mock.patch('website.notifications.events.files.FileAdded.perform')
- def test_event_signal(self, mock_perform):
- signals.file_updated.send(
- user=self.user, target=self.node, event_type='file_added', payload=file_payload
- )
- assert mock_perform.called
-
-
class TestFileUpdated(OsfTestCase):
def setUp(self):
super().setUp()
@@ -138,7 +119,7 @@ def setUp(self):
self.sub = factories.NotificationSubscriptionFactory(
object_id=self.project.id,
content_type=ContentType.objects.get_for_model(self.project),
- notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_UPDATED)
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.FILE_UPDATED)
)
self.sub.save()
self.event = event_registry['file_updated'](self.user_2, self.project, 'file_updated', payload=file_payload)
@@ -148,11 +129,11 @@ def test_info_formed_correct(self):
assert f'updated file "{materialized.lstrip("/")}".' == self.event.html_message
assert f'updated file "{materialized.lstrip("/")}".' == self.event.text_message
- @mock.patch('website.notifications.emails.notify')
- def test_file_updated(self, mock_notify):
- self.event.perform()
- # notify('exd', 'file_updated', 'user', self.project, timezone.now())
- assert mock_notify.called
+ def test_file_updated(self):
+ with capture_notifications() as notifications:
+ self.event.perform()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.FILE_UPDATED
class TestFileAdded(OsfTestCase):
@@ -164,7 +145,7 @@ def setUp(self):
self.project_subscription = factories.NotificationSubscriptionFactory(
object_id=self.project.id,
content_type=ContentType.objects.get_for_model(self.project),
- notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_UPDATED)
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.FILE_UPDATED)
)
self.project_subscription.save()
self.user2 = factories.UserFactory()
@@ -175,11 +156,11 @@ def test_info_formed_correct(self):
assert f'added file "{materialized.lstrip("/")}".' == self.event.html_message
assert f'added file "{materialized.lstrip("/")}".' == self.event.text_message
- @mock.patch('website.notifications.emails.notify')
- def test_file_added(self, mock_notify):
- self.event.perform()
- # notify('exd', 'file_updated', 'user', self.project, timezone.now())
- assert mock_notify.called
+ def test_file_added(self):
+ with capture_notifications() as notification:
+ self.event.perform()
+ assert len(notification) == 1
+ assert notification[0]['type'] == NotificationType.Type.FILE_ADDED
class TestFileRemoved(OsfTestCase):
@@ -191,7 +172,7 @@ def setUp(self):
self.project_subscription = factories.NotificationSubscriptionFactory(
object_id=self.project.id,
content_type=ContentType.objects.get_for_model(self.project),
- notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_REMOVED)
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.FILE_REMOVED)
)
self.project_subscription.object_id = self.project.id
self.project_subscription.content_type = ContentType.objects.get_for_model(self.project)
@@ -202,21 +183,21 @@ def setUp(self):
)
def test_info_formed_correct_file(self):
- assert NotificationType.Type.NODE_FILE_UPDATED == self.event.event_type
+ assert NotificationType.Type.FILE_UPDATED == self.event.event_type
assert f'removed file "{materialized.lstrip("/")}".' == self.event.html_message
assert f'removed file "{materialized.lstrip("/")}".' == self.event.text_message
def test_info_formed_correct_folder(self):
- assert NotificationType.Type.NODE_FILE_UPDATED == self.event.event_type
+ assert NotificationType.Type.FILE_UPDATED == self.event.event_type
self.event.payload['metadata']['materialized'] += '/'
assert f'removed folder "{materialized.lstrip("/")}/".' == self.event.html_message
assert f'removed folder "{materialized.lstrip("/")}/".' == self.event.text_message
- @mock.patch('website.notifications.emails.notify')
- def test_file_removed(self, mock_notify):
- self.event.perform()
- # notify('exd', 'file_updated', 'user', self.project, timezone.now())
- assert mock_notify.called
+ def test_file_removed(self):
+ with capture_notifications() as notifications:
+ self.event.perform()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.FILE_REMOVED
class TestFolderCreated(OsfTestCase):
@@ -227,7 +208,7 @@ def setUp(self):
self.project = factories.ProjectFactory()
self.project_subscription = factories.NotificationSubscriptionFactory(
user=self.user,
- notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_UPDATED),
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.FILE_UPDATED),
)
self.project_subscription.save()
self.user2 = factories.UserFactory()
@@ -236,14 +217,15 @@ def setUp(self):
)
def test_info_formed_correct(self):
- assert NotificationType.Type.NODE_FILE_UPDATED == self.event.event_type
+ assert NotificationType.Type.FILE_UPDATED == self.event.event_type
assert 'created folder "Three/".' == self.event.html_message
assert 'created folder "Three/".' == self.event.text_message
- @mock.patch('website.notifications.emails.notify')
- def test_folder_added(self, mock_notify):
- self.event.perform()
- assert mock_notify.called
+ def test_folder_added(self):
+ with capture_notifications() as notifications:
+ self.event.perform()
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.FOLDER_CREATED
class TestFolderFileRenamed(OsfTestCase):
@@ -311,14 +293,14 @@ def setUp(self):
self.sub = factories.NotificationSubscriptionFactory(
object_id=self.project.id,
content_type=ContentType.objects.get_for_model(self.project),
- notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_UPDATED)
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.FILE_UPDATED)
)
self.sub.save()
# for private node
self.private_sub = factories.NotificationSubscriptionFactory(
object_id=self.private_node.id,
content_type=ContentType.objects.get_for_model(self.private_node),
- notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_UPDATED)
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.FILE_UPDATED)
)
self.private_sub.save()
# for file subscription
@@ -338,51 +320,53 @@ def test_info_formed_correct(self):
def test_user_performing_action_no_email(self):
# Move Event: Makes sure user who performed the action is not
# included in the notifications
- # self.sub.email_digest.add(self.user_2)
+ self.sub.user = self.user_2
self.sub.save()
with capture_notifications() as notifications:
self.event.perform()
- assert not notifications
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.ADDON_FILE_MOVED
+ assert notifications[0]['kwargs']['user'] == self.user_2
def test_perform_store_called_once(self):
- # self.sub.email_transactional.add(self.user_1)
+ self.sub.user = self.user_1
self.sub.save()
with capture_notifications() as notifications:
self.event.perform()
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_ADDON_FILE_MOVED
+ assert notifications[0]['type'] == NotificationType.Type.ADDON_FILE_MOVED
def test_perform_store_one_of_each(self):
# Move Event: Tests that store_emails is called 3 times, one in
# each category
- # self.sub.email_transactional.add(self.user_1)
+ self.sub.user = self.user_1
+ self.sub.save()
self.project.add_contributor(self.user_3, permissions=WRITE, auth=self.auth)
self.project.save()
self.private_node.add_contributor(self.user_3, permissions=WRITE, auth=self.auth)
self.private_node.save()
- # self.sub.email_digest.add(self.user_3)
+ self.sub.user = self.user_3
self.sub.save()
self.project.add_contributor(self.user_4, permissions=WRITE, auth=self.auth)
self.project.save()
- # self.file_sub.email_digest.add(self.user_4)
+ self.sub.user = self.user_4
+ self.sub.save()
self.file_sub.save()
with capture_notifications() as notifications:
self.event.perform()
- assert len(notifications) == 3
- assert notifications[0]['type'] == NotificationType.Type.NODE_FILE_UPDATED
- assert notifications[1]['type'] == NotificationType.Type.NODE_FILE_UPDATED
- assert notifications[2]['type'] == NotificationType.Type.NODE_FILE_UPDATED
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.ADDON_FILE_MOVED
def test_remove_user_sent_once(self):
# Move Event: Tests removed user is removed once. Regression
self.project.add_contributor(self.user_3, permissions=WRITE, auth=self.auth)
self.project.save()
- # self.file_sub.email_digest.add(self.user_3)
+ self.file_sub.user = self.user_3
self.file_sub.save()
with capture_notifications() as notifications:
self.event.perform()
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_ADDON_FILE_MOVED
+ assert notifications[0]['type'] == NotificationType.Type.ADDON_FILE_MOVED
class TestFileCopied(OsfTestCase):
@@ -407,14 +391,14 @@ def setUp(self):
self.sub = factories.NotificationSubscriptionFactory(
object_id=self.project.id,
content_type=ContentType.objects.get_for_model(self.project),
- notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_UPDATED)
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.FILE_UPDATED)
)
self.sub.save()
# for private node
self.private_sub = factories.NotificationSubscriptionFactory(
object_id=self.private_node.id,
content_type=ContentType.objects.get_for_model(self.private_node),
- notification_type=NotificationType.objects.get(name=NotificationType.Type.NODE_FILE_UPDATED)
+ notification_type=NotificationType.objects.get(name=NotificationType.Type.FILE_UPDATED)
)
self.private_sub.save()
# for file subscription
@@ -436,33 +420,34 @@ def test_info_correct(self):
' Storage in Consolidate.') == self.event.text_message
def test_copied_one_of_each(self):
- # Copy Event: Tests that store_emails is called 2 times, two with
+ # Copy Event: Tests that emit is called 2 times, two with
# permissions, one without
- # self.sub.email_transactional.add(self.user_1)
+ self.sub.user = self.user_1
+ self.sub.save()
self.project.add_contributor(self.user_3, permissions=WRITE, auth=self.auth)
self.project.save()
self.private_node.add_contributor(self.user_3, permissions=WRITE, auth=self.auth)
self.private_node.save()
- # self.sub.email_digest.add(self.user_3)
+ self.sub.user = self.user_3
self.sub.save()
self.project.add_contributor(self.user_4, permissions=WRITE, auth=self.auth)
self.project.save()
- # self.file_sub.email_digest.add(self.user_4)
+ self.file_sub.user = self.user_4
self.file_sub.save()
with capture_notifications() as notifications:
self.event.perform()
- assert len(notifications) == 2
- assert notifications[0]['type'] == NotificationType.Type.NODE_FILE_UPDATED
- assert notifications[1]['type'] == NotificationType.Type.NODE_FILE_UPDATED
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.ADDON_FILE_COPIED
def test_user_performing_action_no_email(self):
# Move Event: Makes sure user who performed the action is not
# included in the notifications
- # self.sub.email_digest.add(self.user_2)
+ self.sub.user = self.user_2
self.sub.save()
with capture_notifications() as notifications:
self.event.perform()
- assert not notifications
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.ADDON_FILE_COPIED
class TestSubscriptionManipulations(OsfTestCase):
@@ -495,33 +480,6 @@ def setUp(self):
self.dup_1_3 = {email_transactional: ['e1234', 'f1234'], 'none': ['h1234', 'g1234'],
'email_digest': ['a1234', 'c1234']}
- def test_subscription_user_difference(self):
- result = utils.subscriptions_users_difference(self.emails_1, self.emails_3)
- assert self.diff_1_3 == result
-
- def test_subscription_user_union(self):
- result = utils.subscriptions_users_union(self.emails_1, self.emails_2)
- assert set(self.union_1_2['email_transactional']) == set(result['email_transactional'])
- assert set(self.union_1_2['none']) == set(result['none'])
- assert set(self.union_1_2['email_digest']) == set(result['email_digest'])
-
- def test_remove_duplicates(self):
- result = utils.subscriptions_users_remove_duplicates(
- self.emails_1, self.emails_4, remove_same=False
- )
- assert set(self.dup_1_3['email_transactional']) == set(result['email_transactional'])
- assert set(self.dup_1_3['none']) == set(result['none'])
- assert set(self.dup_1_3['email_digest']) == set(result['email_digest'])
-
- def test_remove_duplicates_true(self):
- result = utils.subscriptions_users_remove_duplicates(
- self.emails_1, self.emails_1, remove_same=True
- )
-
- assert set(result['none']) == {'h1234', 'g1234', 'i1234'}
- assert result['email_digest'] == []
- assert result['email_transactional'] == []
-
wb_path = '5581cb50a24f710b0f4623f9'
materialized = '/One/Paper13.txt'
diff --git a/website/mails/mails.py b/website/mails/mails.py
index 033f23fc819..db684f7e84f 100644
--- a/website/mails/mails.py
+++ b/website/mails/mails.py
@@ -245,16 +245,6 @@ def get_english_article(word):
subject='Confirmation of your submission to ${provider_name}'
)
-ACCESS_REQUEST_SUBMITTED = Mail(
- 'access_request_submitted',
- subject='An OSF user has requested access to your ${node.project_or_component}'
-)
-
-ACCESS_REQUEST_DENIED = Mail(
- 'access_request_rejected',
- subject='Your access request to an OSF project has been declined'
-)
-
CROSSREF_ERROR = Mail(
'crossref_doi_error',
subject='There was an error creating a DOI for preprint(s). batch_id: ${batch_id}'
diff --git a/website/notifications/emails.py b/website/notifications/emails.py
index 7a22ba8954c..aee02dfc0e7 100644
--- a/website/notifications/emails.py
+++ b/website/notifications/emails.py
@@ -1,113 +1,11 @@
-from django.apps import apps
-
from babel import dates, core, Locale
from django.contrib.contenttypes.models import ContentType
-from osf.models import AbstractNode, NotificationSubscription, NotificationType
-from osf.models.notifications import NotificationDigest
+from osf.models import AbstractNode, NotificationSubscription
from osf.utils.permissions import READ
-from website import mails
from website.notifications import constants
-from website.notifications import utils
from website.util import web_url_for
-
-def notify(event, user, node, timestamp, **context):
- """Retrieve appropriate ***subscription*** and passe user list
-website/notifications/u
- :param event: event that triggered the notification
- :param user: user who triggered notification
- :param node: instance of Node
- :param timestamp: time event happened
- :param context: optional variables specific to templates
- target_user: used with comment_replies
- :return: List of user ids notifications were sent to
- """
- if event.endswith('_file_updated'):
- NotificationType.objects.get(
- name=NotificationType.Type.NODE_FILE_ADDED
- ).emit(
- user=user,
- subscribed_object=node,
- event_context=context
- )
-
-def store_emails(recipient_ids, notification_type, event, user, node, timestamp, abstract_provider=None, template=None, **context):
- """Store notification emails
-
- Emails are sent via celery beat as digests
- :param recipient_ids: List of user ids to send mail to.
- :param notification_type: from constants.Notification_types
- :param event: event that triggered notification
- :param user: user who triggered the notification
- :param node: instance of Node
- :param timestamp: time event happened
- :param context:
- :return: --
- """
- OSFUser = apps.get_model('osf', 'OSFUser')
-
- if notification_type == 'none':
- return
-
- # If `template` is not specified, default to using a template with name `event`
- template = f'{template or event}.html.mako'
-
- # user whose action triggered email sending
- context['user_fullname'] = user.fullname
- node_lineage_ids = get_node_lineage(node) if node else []
-
- for recipient_id in recipient_ids:
- if recipient_id == user._id:
- continue
- recipient = OSFUser.load(recipient_id)
- if recipient.is_disabled:
- continue
- context['localized_timestamp'] = localize_timestamp(timestamp, recipient)
- context['recipient_fullname'] = recipient.fullname
- message = mails.render_message(template, **context)
- digest = NotificationDigest(
- timestamp=timestamp,
- send_type=notification_type,
- event=event,
- user=recipient,
- message=message,
- node_lineage=node_lineage_ids,
- provider=abstract_provider
- )
- digest.save()
-
-
-def compile_subscriptions(node, event_type, event=None, level=0):
- """Recurse through node and parents for subscriptions.
-
- :param node: current node
- :param event_type: Generally node_subscriptions_available
- :param event: Particular event such a file_updated that has specific file subs
- :param level: How deep the recursion is
- :return: a dict of notification types with lists of users.
- """
- subscriptions = check_node(node, event_type)
- if event:
- subscriptions = check_node(node, event) # Gets particular event subscriptions
- parent_subscriptions = compile_subscriptions(node, event_type, level=level + 1) # get node and parent subs
- elif getattr(node, 'parent_id', False):
- parent_subscriptions = \
- compile_subscriptions(AbstractNode.load(node.parent_id), event_type, level=level + 1)
- else:
- parent_subscriptions = check_node(None, event_type)
- for notification_type in parent_subscriptions:
- p_sub_n = parent_subscriptions[notification_type]
- p_sub_n.extend(subscriptions[notification_type])
- for nt in subscriptions:
- if notification_type != nt:
- p_sub_n = list(set(p_sub_n).difference(set(subscriptions[nt])))
- if level == 0:
- p_sub_n, removed = utils.separate_users(node, p_sub_n)
- parent_subscriptions[notification_type] = p_sub_n
- return parent_subscriptions
-
-
def check_node(node, event):
"""Return subscription for a particular node and event."""
node_subscriptions = {key: [] for key in constants.NOTIFICATION_TYPES}
diff --git a/website/notifications/events/base.py b/website/notifications/events/base.py
index 7378c8ced43..9cf225ddffe 100644
--- a/website/notifications/events/base.py
+++ b/website/notifications/events/base.py
@@ -2,7 +2,7 @@
from django.utils import timezone
-from website.notifications import emails
+from osf.models import NotificationType
event_registry = {}
@@ -32,14 +32,15 @@ def __init__(self, user, node, action):
def perform(self):
"""Call emails.notify to notify users of an action"""
- emails.notify(
- event=self.event_type,
+ print(self.action)
+ NotificationType.objects.get(
+ name=self.action
+ ).emit(
user=self.user,
- node=self.node,
- timestamp=self.timestamp,
- message=self.html_message,
- profile_image_url=self.profile_image_url,
- url=self.url
+ event_context={
+ 'profile_image_url': self.profile_image_url,
+ 'action': self.action,
+ }
)
@property
diff --git a/website/notifications/events/files.py b/website/notifications/events/files.py
index aa8aca2f32b..95685b10b01 100644
--- a/website/notifications/events/files.py
+++ b/website/notifications/events/files.py
@@ -64,7 +64,7 @@ def text_message(self):
@property
def event_type(self):
"""Most basic event type."""
- return 'node_file_updated'
+ return 'file_updated'
@property
def waterbutler_id(self):
@@ -234,7 +234,7 @@ def perform(self):
return
NotificationType.objects.get(
- name=NotificationType.Type.NODE_ADDON_FILE_MOVED,
+ name=NotificationType.Type.ADDON_FILE_MOVED,
).emit(
user=self.user,
event_context={
@@ -260,7 +260,7 @@ def perform(self):
return
NotificationType.objects.get(
- name=NotificationType.Type.NODE_ADDON_FILE_MOVED,
+ name=NotificationType.Type.ADDON_FILE_MOVED,
).emit(
user=self.user,
event_context={
diff --git a/website/notifications/events/utils.py b/website/notifications/events/utils.py
deleted file mode 100644
index 83e4c79bce4..00000000000
--- a/website/notifications/events/utils.py
+++ /dev/null
@@ -1,141 +0,0 @@
-from itertools import product
-
-from website.notifications.emails import compile_subscriptions
-from website.notifications import utils, constants
-
-
-def get_file_subs_from_folder(addon, user, kind, path, name):
- """Find the file tree under a specified folder."""
- folder = dict(kind=kind, path=path, name=name)
- file_tree = addon._get_file_tree(filenode=folder, user=user, version='latest-published')
- return list_of_files(file_tree)
-
-
-def list_of_files(file_object):
- files = []
- if file_object['kind'] == 'file':
- return [file_object['path']]
- else:
- for child in file_object['children']:
- files.extend(list_of_files(child))
- return files
-
-
-def compile_user_lists(files, user, source_node, node):
- """Take multiple file ids and compiles them.
-
- :param files: List of WaterButler paths
- :param user: User who initiated action/event
- :param source_node: Node instance from
- :param node: Node instance to
- :return: move, warn, and remove dicts
- """
- # initialise subscription dictionaries
- move = {key: [] for key in constants.NOTIFICATION_TYPES}
- warn = {key: [] for key in constants.NOTIFICATION_TYPES}
- remove = {key: [] for key in constants.NOTIFICATION_TYPES}
- # get the node subscription
- if len(files) == 0:
- move, warn, remove = categorize_users(
- user, 'file_updated', source_node, 'file_updated', node
- )
- # iterate through file subscriptions
- for file_path in files:
- path = file_path.strip('/')
- t_move, t_warn, t_remove = categorize_users(
- user, path + '_file_updated', source_node,
- path + '_file_updated', node
- )
- # Add file subs to overall list of subscriptions
- for notification in constants.NOTIFICATION_TYPES:
- move[notification] = list(set(move[notification]).union(set(t_move[notification])))
- warn[notification] = list(set(warn[notification]).union(set(t_warn[notification])))
- remove[notification] = list(set(remove[notification]).union(set(t_remove[notification])))
- return move, warn, remove
-
-
-def categorize_users(user, source_event, source_node, event, node):
- """Categorize users from a file subscription into three categories.
-
- Puts users in one of three bins:
- - Moved: User has permissions on both nodes, subscribed to both
- - Warned: User has permissions on both, not subscribed to destination
- - Removed: Does not have permission on destination node
- :param user: User instance who started the event
- :param source_event: _event_name
- :param source_node: node from where the event happened
- :param event: new guid event name
- :param node: node where event ends up
- :return: Moved, to be warned, and removed users.
- """
- remove = utils.users_to_remove(source_event, source_node, node)
- source_node_subs = compile_subscriptions(source_node, utils.find_subscription_type(source_event))
- new_subs = compile_subscriptions(node, utils.find_subscription_type(source_event), event)
-
- # Moves users into the warn bucket or the move bucket
- move = subscriptions_users_union(source_node_subs, new_subs)
- warn = subscriptions_users_difference(source_node_subs, new_subs)
-
- # Removes users without permissions
- warn, remove = subscriptions_node_permissions(node, warn, remove)
-
- # Remove duplicates
- warn = subscriptions_users_remove_duplicates(warn, new_subs, remove_same=False)
- move = subscriptions_users_remove_duplicates(move, new_subs, remove_same=False)
-
- # Remove duplicates between move and warn; and move and remove
- move = subscriptions_users_remove_duplicates(move, warn, remove_same=True)
- move = subscriptions_users_remove_duplicates(move, remove, remove_same=True)
-
- for notifications in constants.NOTIFICATION_TYPES:
- # Remove the user who started this whole thing.
- user_id = user._id
- if user_id in warn[notifications]:
- warn[notifications].remove(user_id)
- if user_id in move[notifications]:
- move[notifications].remove(user_id)
- if user_id in remove[notifications]:
- remove[notifications].remove(user_id)
-
- return move, warn, remove
-
-
-def subscriptions_node_permissions(node, warn_subscription, remove_subscription):
- for notification in constants.NOTIFICATION_TYPES:
- subbed, removed = utils.separate_users(node, warn_subscription[notification])
- warn_subscription[notification] = subbed
- remove_subscription[notification].extend(removed)
- remove_subscription[notification] = list(set(remove_subscription[notification]))
- return warn_subscription, remove_subscription
-
-
-def subscriptions_users_union(emails_1, emails_2):
- return {
- notification:
- list(
- set(emails_1[notification]).union(set(emails_2[notification]))
- )
- for notification in constants.NOTIFICATION_TYPES.keys()
- }
-
-
-def subscriptions_users_difference(emails_1, emails_2):
- return {
- notification:
- list(
- set(emails_1[notification]).difference(set(emails_2[notification]))
- )
- for notification in constants.NOTIFICATION_TYPES.keys()
- }
-
-
-def subscriptions_users_remove_duplicates(emails_1, emails_2, remove_same=False):
- emails_list = dict(emails_1)
- product_list = product(constants.NOTIFICATION_TYPES, repeat=2)
- for notification_1, notification_2 in product_list:
- if notification_2 == notification_1 and not remove_same or notification_2 == 'none':
- continue
- emails_list[notification_1] = list(
- set(emails_list[notification_1]).difference(set(emails_2[notification_2]))
- )
- return emails_list
diff --git a/website/notifications/utils.py b/website/notifications/utils.py
index 7ccfcf88ede..fc565610777 100644
--- a/website/notifications/utils.py
+++ b/website/notifications/utils.py
@@ -41,7 +41,7 @@ def find_subscription_type(subscription):
"""
subs_available = constants.USER_SUBSCRIPTIONS_AVAILABLE
subs_available.extend(list({
- 'node_file_updated': 'Files updated'
+ 'file_updated': 'Files updated'
}.keys()))
for available in subs_available:
if available in subscription:
@@ -262,7 +262,7 @@ def format_data(user, nodes):
if can_read:
subscriptions = NotificationSubscription.objects.filter(
user=user,
- notification_type__name='node_file_updated',
+ notification_type__name='file_updated',
user__isnull=True,
object_id=node.id,
content_type=ContentType.objects.get_for_model(node)
@@ -331,7 +331,7 @@ def format_file_subscription(user, node_id, path, provider):
return serialize_event(user, node=node, event_description='file_updated')
-all_subs = ['node_file_updated']
+all_subs = ['file_updated']
all_subs += constants.USER_SUBSCRIPTIONS_AVAILABLE
def serialize_event(user, subscription=None, node=None, event_description=None):
@@ -429,7 +429,7 @@ def subscribe_user_to_notifications(node, user):
)
NotificationSubscription.objects.get_or_create(
user=user,
- notification_type__name=NotificationType.Type.NODE_FILE_UPDATED,
+ notification_type__name=NotificationType.Type.FILE_UPDATED,
object_id=node.id,
content_type=ContentType.objects.get_for_model(node)
)
diff --git a/website/project/views/comment.py b/website/project/views/comment.py
index 5e274052f18..968f8cb7c2e 100644
--- a/website/project/views/comment.py
+++ b/website/project/views/comment.py
@@ -14,7 +14,8 @@
@file_updated.connect
-def update_file_guid_referent(self, target, event_type, payload, user=None):
+def update_file_guid_referent(self, target, payload, user=None):
+ event_type = payload['action']
if event_type not in ('addon_file_moved', 'addon_file_renamed'):
return # Nothing to do
From 7c792223fec4a4f3f41c2a642716cf34d93b250f Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 29 Jul 2025 14:14:45 -0400
Subject: [PATCH 139/336] fix contributor email notifications with new throttle
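
The per-user JSON throttle records on OSFUser (contributor_added_email_records,
member_added_email_records, group_connected_email_records) are dropped by
migration 0035 and the contributor views stop bookkeeping them; throttling is
handled by the new notification-side throttle referenced in the subject, which is
not shown in full in this hunk. The adding-contributor tests now expect a single
contributor-added notification per request rather than one per affected node.

A sketch of the updated test expectation (assumes the capture_notifications
helper from tests.utils and a webtest-style app/url/payload fixture):

    from osf.models import NotificationType
    from tests.utils import capture_notifications

    with capture_notifications() as notifications:
        app.post(url, json=payload, auth=creator.auth)
    # One notification for the whole request, not one per node/component.
    assert len(notifications) == 1
    assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
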
---
...ontributor_added_email_records_and_more.py | 25 +++++++++
osf/models/user.py | 14 -----
osf_tests/test_merging_users.py | 1 -
tests/test_adding_contributor_views.py | 37 ++++++-------
website/notifications/events/base.py | 1 -
website/project/views/contributor.py | 53 ++++---------------
6 files changed, 51 insertions(+), 80 deletions(-)
create mode 100644 osf/migrations/0035_remove_osfuser_contributor_added_email_records_and_more.py
diff --git a/osf/migrations/0035_remove_osfuser_contributor_added_email_records_and_more.py b/osf/migrations/0035_remove_osfuser_contributor_added_email_records_and_more.py
new file mode 100644
index 00000000000..48fd5f258da
--- /dev/null
+++ b/osf/migrations/0035_remove_osfuser_contributor_added_email_records_and_more.py
@@ -0,0 +1,25 @@
+# Generated by Django 4.2.13 on 2025-07-29 17:41
+
+from django.db import migrations
+
+
+class Migration(migrations.Migration):
+
+ dependencies = [
+ ('osf', '0034_remove_abstractnode_child_node_subscriptions'),
+ ]
+
+ operations = [
+ migrations.RemoveField(
+ model_name='osfuser',
+ name='contributor_added_email_records',
+ ),
+ migrations.RemoveField(
+ model_name='osfuser',
+ name='group_connected_email_records',
+ ),
+ migrations.RemoveField(
+ model_name='osfuser',
+ name='member_added_email_records',
+ ),
+ ]
diff --git a/osf/models/user.py b/osf/models/user.py
index fc3526d71f1..8dfaff58a44 100644
--- a/osf/models/user.py
+++ b/osf/models/user.py
@@ -226,20 +226,6 @@ class OSFUser(DirtyFieldsMixin, GuidMixin, BaseModel, AbstractBaseUser, Permissi
# ...
# }
- # Time of last sent notification email to newly added contributors
- # Format : {
- # <node._id>: {
- # 'last_sent': time.time()
- # }
- # ...
- # }
- contributor_added_email_records = DateTimeAwareJSONField(default=dict, blank=True)
-
- # Tracks last email sent where user was added to an OSF Group
- member_added_email_records = DateTimeAwareJSONField(default=dict, blank=True)
- # Tracks last email sent where an OSF Group was connected to a node
- group_connected_email_records = DateTimeAwareJSONField(default=dict, blank=True)
-
# The user into which this account was merged
merged_by = models.ForeignKey('self', null=True, blank=True, related_name='merger', on_delete=models.CASCADE)
diff --git a/osf_tests/test_merging_users.py b/osf_tests/test_merging_users.py
index e51e922ec62..9317260fb1b 100644
--- a/osf_tests/test_merging_users.py
+++ b/osf_tests/test_merging_users.py
@@ -138,7 +138,6 @@ def is_mrm_field(value):
'username',
'verification_key',
'verification_key_v2',
- 'contributor_added_email_records',
'requested_deactivation',
]
diff --git a/tests/test_adding_contributor_views.py b/tests/test_adding_contributor_views.py
index 62e84e916fc..bb59a2eeef5 100644
--- a/tests/test_adding_contributor_views.py
+++ b/tests/test_adding_contributor_views.py
@@ -3,7 +3,6 @@
import pytest
from django.core.exceptions import ValidationError
-from pytest import approx
from rest_framework import status as http_status
from framework import auth
@@ -28,6 +27,7 @@
)
from tests.utils import capture_notifications
from website.profile.utils import add_contributor_json, serialize_unregistered
+from website import settings
from website.project.views.contributor import (
deserialize_contributors,
notify_added_contributor,
@@ -189,10 +189,8 @@ def test_add_contributors_post_only_sends_one_email_to_unreg_user(self, mock_sen
assert self.project.can_edit(user=self.creator)
with capture_notifications() as noitification:
self.app.post(url, json=payload, auth=self.creator.auth)
- assert len(noitification) == 3
+ assert len(noitification) == 1
assert noitification[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
- assert noitification[1]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
- assert noitification[2]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
def test_add_contributors_post_only_sends_one_email_to_registered_user(self):
# Project has components
@@ -218,10 +216,8 @@ def test_add_contributors_post_only_sends_one_email_to_registered_user(self):
assert self.project.can_edit(user=self.creator)
with capture_notifications() as notifications:
self.app.post(url, json=payload, auth=self.creator.auth)
- assert len(notifications) == 3
+ assert len(notifications) == 1
assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
- assert notifications[1]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
- assert notifications[2]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
def test_add_contributors_post_sends_email_if_user_not_contributor_on_parent_node(self):
@@ -250,9 +246,8 @@ def test_add_contributors_post_sends_email_if_user_not_contributor_on_parent_nod
self.app.post(url, json=payload, auth=self.creator.auth)
# send_mail is called for both the project and the sub-component
- assert len(notifications) == 2
+ assert len(notifications) == 1
assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
- assert notifications[1]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
@mock.patch('website.project.views.contributor.send_claim_email')
@@ -288,8 +283,6 @@ def test_email_sent_when_reg_user_is_added(self):
project.save()
assert len(notifications) == 1
assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
- contributor.refresh_from_db()
- assert contributor.contributor_added_email_records[project._id]['last_sent'] == approx(int(time.time()), rel=1)
def test_contributor_added_email_sent_to_unreg_user(self):
unreg_user = UnregUserFactory()
@@ -345,17 +338,17 @@ def test_notify_contributor_email_sends_after_throttle_expires(self):
contributor = UserFactory()
project = ProjectFactory()
auth = Auth(project.creator)
- with capture_notifications() as notifications:
- notify_added_contributor(project, contributor, NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT, auth, throttle=throttle)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
-
- time.sleep(1) # throttle period expires
- with capture_notifications() as notifications:
- notify_added_contributor(project, contributor, NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT, auth, throttle=throttle)
- assert len(notifications) == 2
- assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
- assert notifications[1]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ with mock.patch.object(settings, 'CONTRIBUTOR_ADDED_EMAIL_THROTTLE', 1):
+ with capture_notifications() as notifications:
+ notify_added_contributor(project, contributor, NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT, auth, throttle=throttle)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+
+ time.sleep(settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE) # throttle period expires
+ with capture_notifications() as notifications:
+ notify_added_contributor(project, contributor, NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT, auth, throttle=throttle)
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
def test_add_contributor_to_fork_sends_email(self):
contributor = UserFactory()
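
The throttle test above shrinks the window with mock.patch.object so the suite does not sleep for a day. A standalone sketch of that pattern, assuming a stand-in settings object rather than website.settings:

import time
import types
from unittest import mock

settings = types.SimpleNamespace(CONTRIBUTOR_ADDED_EMAIL_THROTTLE=24 * 3600)  # stand-in module

def throttled(last_sent):
    # True while the last send is still inside the configured window.
    return (time.time() - last_sent) < settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE

with mock.patch.object(settings, 'CONTRIBUTOR_ADDED_EMAIL_THROTTLE', 1):
    sent_at = time.time()
    assert throttled(sent_at)        # inside the shortened 1 s window: suppressed
    time.sleep(1.1)                  # let the window expire
    assert not throttled(sent_at)    # window expired: a new notification may go out
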
diff --git a/website/notifications/events/base.py b/website/notifications/events/base.py
index 9cf225ddffe..00e93b46ed1 100644
--- a/website/notifications/events/base.py
+++ b/website/notifications/events/base.py
@@ -32,7 +32,6 @@ def __init__(self, user, node, action):
def perform(self):
"""Call emails.notify to notify users of an action"""
- print(self.action)
NotificationType.objects.get(
name=self.action
).emit(
diff --git a/website/project/views/contributor.py b/website/project/views/contributor.py
index 715044063e8..d0a217d1cc1 100644
--- a/website/project/views/contributor.py
+++ b/website/project/views/contributor.py
@@ -438,9 +438,6 @@ def send_claim_registered_email(claimer, unclaimed_user, node, throttle=24 * 360
'osf_contact_email': settings.OSF_CONTACT_EMAIL,
}
)
- referrer.contributor_added_email_records = {node._id: {'last_sent': get_timestamp()}}
- referrer.save()
-
# Send mail to claimer, telling them to wait for referrer
NotificationType.objects.get(
name=NotificationType.Type.USER_PENDING_VERIFICATION_REGISTERED
@@ -457,19 +454,6 @@ def send_claim_registered_email(claimer, unclaimed_user, node, throttle=24 * 360
}
)
-def check_email_throttle_claim_email(node, contributor):
- contributor_record = contributor.contributor_added_email_records.get(node._id, {})
- if contributor_record:
- timestamp = contributor_record.get('last_sent', None)
- if timestamp:
- if not throttle_period_expired(
- timestamp,
- settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE
- ):
- return True
- else:
- contributor.contributor_added_email_records[node._id] = {}
-
def send_claim_email(
email,
unclaimed_user,
@@ -568,7 +552,7 @@ def send_claim_email(
)
-def check_email_throttle(node, contributor, throttle=None):
+def check_email_throttle(node, contributor, notification_type):
"""
Check whether a 'contributor added' notification was sent recently
(within the throttle period) for the given node and contributor.
@@ -576,36 +560,22 @@ def check_email_throttle(node, contributor, throttle=None):
Args:
node (AbstractNode): The node to check.
contributor (OSFUser): The contributor being notified.
- throttle (int, optional): Throttle period in seconds (defaults to CONTRIBUTOR_ADDED_EMAIL_THROTTLE setting).
+ notification_type (str): The notification type to check for.
Returns:
bool: True if throttled (email was sent recently), False otherwise.
"""
- from osf.models import Notification, NotificationType, NotificationSubscription
+ from osf.models import Notification, NotificationSubscription
from website import settings
- throttle = throttle or settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE
-
- try:
- notification_type = NotificationType.objects.get(
- name=NotificationType.Type.NODE_COMMENT.value
- )
- except NotificationType.DoesNotExist:
- return False # Fail-safe: if the notification type isn't set up, don't throttle
- from django.contrib.contenttypes.models import ContentType
from datetime import timedelta
-
# Check for an active subscription for this contributor and this node
- subscription = NotificationSubscription.objects.filter(
+ subscription, create = NotificationSubscription.objects.get_or_create(
user=contributor,
- notification_type=notification_type,
- content_type=ContentType.objects.get_for_model(node),
- object_id=node.id
- ).first()
-
- if not subscription:
+ notification_type__name=notification_type,
+ )
+ if create:
return False # No subscription means no previous notifications, so no throttling
-
# Check the most recent Notification for this subscription
last_notification = Notification.objects.filter(
subscription=subscription,
@@ -613,7 +583,7 @@ def check_email_throttle(node, contributor, throttle=None):
).order_by('-sent').first()
if last_notification and last_notification.sent:
- cutoff_time = timezone.now() - timedelta(seconds=throttle)
+ cutoff_time = timezone.now() - timedelta(seconds=settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE)
return last_notification.sent > cutoff_time
return False # No previous sent notification, not throttled
@@ -633,16 +603,12 @@ def notify_added_contributor(node, contributor, notification_type, auth=None, *a
auth (Auth, optional): Authorization context.
notification_type (str, optional): Template identifier.
"""
- if check_email_throttle_claim_email(node, contributor):
- return
if not notification_type:
return
- # Default values
notification_type = notification_type or NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
logo = settings.OSF_LOGO
- # Use match for notification type/logic
if notification_type == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT:
pass
elif notification_type == NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT:
@@ -659,6 +625,9 @@ def notify_added_contributor(node, contributor, notification_type, auth=None, *a
else:
raise NotImplementedError(f'notification_type: {notification_type} not implemented.')
+ if check_email_throttle(node, contributor, notification_type):
+ return
+
NotificationType.objects.get(name=notification_type).emit(
user=contributor,
event_context={
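
In plain terms, the rewritten check_email_throttle above asks two questions: does a subscription already exist for this contributor and notification type, and if so, was its most recent notification sent inside the throttle window? A model-free sketch of that decision, with a hypothetical timestamp standing in for the Notification queryset:

from datetime import datetime, timedelta, timezone

CONTRIBUTOR_ADDED_EMAIL_THROTTLE = 24 * 3600  # seconds; assumed to mirror the setting

def is_throttled(last_sent, now=None):
    """Return True when the most recent notification falls inside the throttle window."""
    if last_sent is None:
        return False                                   # nothing sent yet: never throttle
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(seconds=CONTRIBUTOR_ADDED_EMAIL_THROTTLE)
    return last_sent > cutoff

now = datetime.now(timezone.utc)
assert is_throttled(now - timedelta(hours=1), now=now)      # sent an hour ago: throttled
assert not is_throttled(now - timedelta(days=2), now=now)   # sent two days ago: allowed
assert not is_throttled(None, now=now)                      # brand-new subscription: allowed
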
From e04d901fd60e01c33348291bfc528e0eafd5f9e7 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 29 Jul 2025 14:50:32 -0400
Subject: [PATCH 140/336] remove old notification views and routes
---
addons/base/views.py | 3 +-
.../0036_delete_notificationdigest.py | 16 +
osf/models/__init__.py | 2 +-
osf/models/mixins.py | 2 +-
osf/models/notification_subscription.py | 5 -
osf/models/notifications.py | 71 +---
osf/models/validators.py | 4 +-
osf_tests/factories.py | 15 +-
osf_tests/test_comment.py | 38 +-
tests/test_events.py | 3 -
website/notifications/constants.py | 26 --
website/notifications/emails.py | 98 -----
website/notifications/events/base.py | 4 -
website/notifications/events/files.py | 3 +-
website/notifications/exceptions.py | 7 -
website/notifications/listeners.py | 30 +-
website/notifications/utils.py | 398 +-----------------
website/notifications/views.py | 106 -----
website/routes.py | 26 --
19 files changed, 69 insertions(+), 788 deletions(-)
create mode 100644 osf/migrations/0036_delete_notificationdigest.py
delete mode 100644 website/notifications/emails.py
delete mode 100644 website/notifications/exceptions.py
delete mode 100644 website/notifications/views.py
diff --git a/addons/base/views.py b/addons/base/views.py
index 4c3d01bacdf..f91ae0ce2ce 100644
--- a/addons/base/views.py
+++ b/addons/base/views.py
@@ -56,7 +56,6 @@
from osf.metrics import PreprintView, PreprintDownload
from osf.utils import permissions
from osf.external.gravy_valet import request_helpers
-from website.notifications.emails import localize_timestamp
from website.profile.utils import get_profile_image_url
from website.project import decorators
from website.project.decorators import must_be_contributor_or_public, must_be_valid_project, check_contributor_auth
@@ -636,7 +635,7 @@ def create_waterbutler_log(payload, **kwargs):
user=user,
event_context={
'profile_image_url': user.profile_image_url(),
- 'localized_timestamp': localize_timestamp(timezone.now(), user),
+ 'localized_timestamp': timezone.now(),
'user_fullname': user.fullname,
'url': node.absolute_url,
}
diff --git a/osf/migrations/0036_delete_notificationdigest.py b/osf/migrations/0036_delete_notificationdigest.py
new file mode 100644
index 00000000000..8ab718d12d6
--- /dev/null
+++ b/osf/migrations/0036_delete_notificationdigest.py
@@ -0,0 +1,16 @@
+# Generated by Django 4.2.13 on 2025-07-29 18:25
+
+from django.db import migrations
+
+
+class Migration(migrations.Migration):
+
+ dependencies = [
+ ('osf', '0035_remove_osfuser_contributor_added_email_records_and_more'),
+ ]
+
+ operations = [
+ migrations.DeleteModel(
+ name='NotificationDigest',
+ ),
+ ]
diff --git a/osf/models/__init__.py b/osf/models/__init__.py
index 669059d9c4c..7e02185c4ff 100644
--- a/osf/models/__init__.py
+++ b/osf/models/__init__.py
@@ -62,7 +62,7 @@
from .node_relation import NodeRelation
from .nodelog import NodeLog
from .notable_domain import NotableDomain, DomainReference
-from .notifications import NotificationDigest, NotificationSubscriptionLegacy
+from .notifications import NotificationSubscriptionLegacy
from .notification_subscription import NotificationSubscription
from .notification_type import NotificationType
from .notification import Notification
diff --git a/osf/models/mixins.py b/osf/models/mixins.py
index 0dc9f1c1361..0bcf35330b3 100644
--- a/osf/models/mixins.py
+++ b/osf/models/mixins.py
@@ -1110,7 +1110,7 @@ def remove_user_from_subscription(self, user, subscription):
user=user
)
if subscriptions:
- subscriptions.get().remove_user_from_subscription()
+ subscriptions.get().delete()
class TaxonomizableMixin(models.Model):
diff --git a/osf/models/notification_subscription.py b/osf/models/notification_subscription.py
index 7dc79047a13..41b88ba9ea2 100644
--- a/osf/models/notification_subscription.py
+++ b/osf/models/notification_subscription.py
@@ -99,8 +99,3 @@ def _id(self):
return f'{self.user._id}_global'
case _:
raise NotImplementedError()
-
- def remove_user_from_subscription(self):
- """
- """
- self.delete()
diff --git a/osf/models/notifications.py b/osf/models/notifications.py
index 80703f1620f..be89d26248f 100644
--- a/osf/models/notifications.py
+++ b/osf/models/notifications.py
@@ -1,13 +1,6 @@
-from django.contrib.postgres.fields import ArrayField
from django.db import models
-from website.notifications.constants import NOTIFICATION_TYPES
-from .node import Node
-from .user import OSFUser
-from .base import BaseModel, ObjectIDMixin
-from .validators import validate_subscription_type
-from osf.utils.fields import NonNaiveDateTimeField
-from website.util import api_v2_url
+from .base import BaseModel
class NotificationSubscriptionLegacy(BaseModel):
@@ -31,65 +24,3 @@ class Meta:
# Both PreprintProvider and RegistrationProvider default instances use "osf" as their `_id`
unique_together = ('_id', 'provider')
db_table = 'osf_notificationsubscription_legacy'
-
- @classmethod
- def load(cls, q):
- # modm doesn't throw exceptions when loading things that don't exist
- try:
- return cls.objects.get(_id=q)
- except cls.DoesNotExist:
- return None
-
- @property
- def owner(self):
- # ~100k have owner==user
- if self.user is not None:
- return self.user
- # ~8k have owner=Node
- elif self.node is not None:
- return self.node
-
- @owner.setter
- def owner(self, value):
- if isinstance(value, OSFUser):
- self.user = value
- elif isinstance(value, Node):
- self.node = value
-
- @property
- def absolute_api_v2_url(self):
- path = f'/subscriptions/{self._id}/'
- return api_v2_url(path)
-
- def add_user_to_subscription(self, user, notification_type, save=True):
- for nt in NOTIFICATION_TYPES:
- if getattr(self, nt).filter(id=user.id).exists():
- if nt != notification_type:
- getattr(self, nt).remove(user)
- else:
- if nt == notification_type:
- getattr(self, nt).add(user)
-
- if save:
- # Do not clean legacy objects
- self.save(clean=False)
-
- def remove_user_from_subscription(self, user, save=True):
- for notification_type in NOTIFICATION_TYPES:
- try:
- getattr(self, notification_type, []).remove(user)
- except ValueError:
- pass
-
- if save:
- self.save()
-
-class NotificationDigest(ObjectIDMixin, BaseModel):
- user = models.ForeignKey('OSFUser', null=True, blank=True, on_delete=models.CASCADE)
- provider = models.ForeignKey('AbstractProvider', null=True, blank=True, on_delete=models.CASCADE)
- timestamp = NonNaiveDateTimeField()
- send_type = models.CharField(max_length=50, db_index=True, validators=[validate_subscription_type, ])
- event = models.CharField(max_length=50)
- message = models.TextField()
- # TODO: Could this be a m2m with or without an order field?
- node_lineage = ArrayField(models.CharField(max_length=31))
diff --git a/osf/models/validators.py b/osf/models/validators.py
index 87f00f826a6..29ee184b66e 100644
--- a/osf/models/validators.py
+++ b/osf/models/validators.py
@@ -8,8 +8,6 @@
from django.utils.deconstruct import deconstructible
from rest_framework import exceptions
-from website.notifications.constants import NOTIFICATION_TYPES
-
from osf.utils.registrations import FILE_VIEW_URL_REGEX
from osf.utils.sanitize import strip_html
from osf.exceptions import ValidationError, ValidationValueError, reraise_django_validation_errors, BlockedEmailError
@@ -54,7 +52,7 @@ def string_required(value):
def validate_subscription_type(value):
- if value not in NOTIFICATION_TYPES:
+ if value not in ['email_transactional', 'email_digest', 'none']:
raise ValidationValueError
diff --git a/osf_tests/factories.py b/osf_tests/factories.py
index d1c7e640250..cced02e978d 100644
--- a/osf_tests/factories.py
+++ b/osf_tests/factories.py
@@ -6,7 +6,7 @@
from unittest import mock
from factory import SubFactory
-from factory.fuzzy import FuzzyDateTime, FuzzyAttribute, FuzzyChoice
+from factory.fuzzy import FuzzyDateTime, FuzzyChoice
from unittest.mock import patch, Mock
import pytz
@@ -20,7 +20,6 @@
from django.db.utils import IntegrityError
from faker import Factory, Faker
from waffle.models import Flag, Sample, Switch
-from website.notifications.constants import NOTIFICATION_TYPES
from osf.utils import permissions
from website.archiver import ARCHIVER_SUCCESS
from website.settings import FAKE_EMAIL_NAME, FAKE_EMAIL_DOMAIN
@@ -1064,18 +1063,6 @@ def make_node_lineage():
return [node1._id, node2._id, node3._id, node4._id]
-
-class NotificationDigestFactory(DjangoModelFactory):
- timestamp = FuzzyDateTime(datetime.datetime(1970, 1, 1, tzinfo=pytz.UTC))
- node_lineage = FuzzyAttribute(fuzzer=make_node_lineage)
- user = factory.SubFactory(UserFactory)
- send_type = FuzzyChoice(choices=NOTIFICATION_TYPES.keys())
- message = fake.text(max_nb_chars=2048)
- event = fake.text(max_nb_chars=50)
- class Meta:
- model = models.NotificationDigest
-
-
class ConferenceFactory(DjangoModelFactory):
class Meta:
model = models.Conference
diff --git a/osf_tests/test_comment.py b/osf_tests/test_comment.py
index bb11d34591c..62a295367fd 100644
--- a/osf_tests/test_comment.py
+++ b/osf_tests/test_comment.py
@@ -500,7 +500,7 @@ def test_comments_move_on_file_rename(self, project, user):
}
self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_renamed', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
assert self.guid._id == file_node.get_guid()._id
@@ -521,7 +521,7 @@ def test_comments_move_on_folder_rename(self, project, user):
file_name = 'file.txt'
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_renamed', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
@@ -543,7 +543,7 @@ def test_comments_move_on_subfolder_file_when_parent_folder_is_renamed(self, pro
file_path = 'sub-subfolder/file.txt'
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_path), user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_renamed', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_path), file_id=self.file._id))
@@ -564,7 +564,7 @@ def test_comments_move_when_file_moved_to_subfolder(self, project, user):
}
self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
@@ -585,7 +585,7 @@ def test_comments_move_when_file_moved_from_subfolder_to_root(self, project, use
}
self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
@@ -606,7 +606,7 @@ def test_comments_move_when_file_moved_from_project_to_component(self, project,
}
self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
@@ -628,7 +628,7 @@ def test_comments_move_when_file_moved_from_component_to_project(self, project,
}
self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
@@ -651,7 +651,7 @@ def test_comments_move_when_folder_moved_to_subfolder(self, user, project):
file_name = 'file.txt'
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
@@ -673,7 +673,7 @@ def test_comments_move_when_folder_moved_from_subfolder_to_root(self, project, u
file_name = 'file.txt'
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
@@ -695,7 +695,7 @@ def test_comments_move_when_folder_moved_from_project_to_component(self, project
file_name = 'file.txt'
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
@@ -717,7 +717,7 @@ def test_comments_move_when_folder_moved_from_component_to_project(self, project
file_name = 'file.txt'
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
@@ -751,7 +751,7 @@ def test_comments_move_when_file_moved_to_osfstorage(self, project, user):
}
self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
payload = self._create_payload('move', user, source, destination, self.file._id, destination_file_id=destination['path'].strip('/'))
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class('osfstorage', BaseFileNode.FILE).get_or_create(destination['node'], destination['path'])
@@ -792,7 +792,7 @@ def test_comments_move_when_folder_moved_to_osfstorage(self, project, user):
file_name = 'file.txt'
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
payload = self._create_payload('move', user, source, destination, self.file._id, destination_file_id=osf_file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class('osfstorage', BaseFileNode.FILE).get_or_create(destination['node'], osf_file._id)
@@ -827,7 +827,7 @@ def test_comments_move_when_file_moved_to_different_provider(self, destination_p
}
self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(destination_provider, BaseFileNode.FILE).get_or_create(destination['node'], destination['path'])
@@ -868,7 +868,7 @@ def test_comments_move_when_folder_moved_to_different_provider(self, destination
file_name = 'file.txt'
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(destination_provider, BaseFileNode.FILE).get_or_create(destination['node'], destination_path)
@@ -919,7 +919,7 @@ def test_comments_move_when_file_moved_from_project_to_component(self, project,
self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
self.file.move_under(destination['node'].get_addon(self.provider).get_root())
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
@@ -942,7 +942,7 @@ def test_comments_move_when_file_moved_from_component_to_project(self, project,
self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
self.file.move_under(destination['node'].get_addon(self.provider).get_root())
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
@@ -966,7 +966,7 @@ def test_comments_move_when_folder_moved_from_project_to_component(self, project
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
self.file.move_under(destination['node'].get_addon(self.provider).get_root())
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
@@ -989,7 +989,7 @@ def test_comments_move_when_folder_moved_from_component_to_project(self, project
self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
self.file.move_under(destination['node'].get_addon(self.provider).get_root())
payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], event_type='addon_file_moved', payload=payload)
+ update_file_guid_referent(self=None, target=destination['node'], payload=payload)
self.guid.reload()
file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
diff --git a/tests/test_events.py b/tests/test_events.py
index cef8987f113..e2f81f81a21 100644
--- a/tests/test_events.py
+++ b/tests/test_events.py
@@ -1,7 +1,5 @@
from collections import OrderedDict
-from unittest import mock
-
from django.contrib.contenttypes.models import ContentType
from osf.models import NotificationType
@@ -11,7 +9,6 @@
FileAdded, FileRemoved, FolderCreated, FileUpdated,
AddonFileCopied, AddonFileMoved, AddonFileRenamed,
)
-from addons.base import signals
from framework.auth import Auth
from osf_tests import factories
from osf.utils.permissions import WRITE
diff --git a/website/notifications/constants.py b/website/notifications/constants.py
index 3b0b81d6823..6e05855582b 100644
--- a/website/notifications/constants.py
+++ b/website/notifications/constants.py
@@ -1,32 +1,6 @@
-USER_SUBSCRIPTIONS_AVAILABLE = [
- 'user_file_updated',
- 'user_reviews'
-]
-
-PROVIDER_SUBSCRIPTIONS_AVAILABLE = {
- 'provider_new_pending_submissions': 'New preprint submissions for moderators to review.'
-}
-
# Note: the python value None mean inherit from parent
NOTIFICATION_TYPES = {
'email_transactional': 'Email when a change occurs',
'email_digest': 'Daily email digest of all changes to this project',
'none': 'None'
}
-
-# Formatted file provider names for notification emails
-PROVIDERS = {
- 'osfstorage': 'OSF Storage',
- 'boa': 'Boa',
- 'box': 'Box',
- 'dataverse': 'Dataverse',
- 'dropbox': 'Dropbox',
- 'figshare': 'figshare',
- 'github': 'GitHub',
- 'gitlab': 'GitLab',
- 'bitbucket': 'Bitbucket',
- 'googledrive': 'Google Drive',
- 'owncloud': 'ownCloud',
- 'onedrive': 'Microsoft OneDrive',
- 's3': 'Amazon S3'
-}
diff --git a/website/notifications/emails.py b/website/notifications/emails.py
deleted file mode 100644
index aee02dfc0e7..00000000000
--- a/website/notifications/emails.py
+++ /dev/null
@@ -1,98 +0,0 @@
-from babel import dates, core, Locale
-from django.contrib.contenttypes.models import ContentType
-
-from osf.models import AbstractNode, NotificationSubscription
-from osf.utils.permissions import READ
-from website.notifications import constants
-from website.util import web_url_for
-
-def check_node(node, event):
- """Return subscription for a particular node and event."""
- node_subscriptions = {key: [] for key in constants.NOTIFICATION_TYPES}
- if node:
- subscription = NotificationSubscription.objects.filter(
- object_id=node.id,
- content_type=ContentType.objects.get_for_model(node),
- notification_type__name=event
- )
- for notification_type in node_subscriptions:
- users = getattr(subscription, notification_type, [])
- if users:
- for user in users.exclude(date_disabled__isnull=False):
- if node.has_permission(user, READ):
- node_subscriptions[notification_type].append(user._id)
- return node_subscriptions
-
-
-def get_user_subscriptions(user, event):
- if user.is_disabled:
- return {}
- user_subscription, _ = NotificationSubscription.objects.get_or_create(
- user=user,
- notification_type__name=event
- )
- return user_subscription
-
-
-def get_node_lineage(node):
- """ Get a list of node ids in order from the node to top most project
- e.g. [parent._id, node._id]
- """
- from osf.models import Preprint
- lineage = [node._id]
- if isinstance(node, Preprint):
- return lineage
-
- while node.parent_id:
- node = node.parent_node
- lineage = [node._id] + lineage
-
- return lineage
-
-
-def get_settings_url(uid, user):
- if uid == user._id:
- return web_url_for('user_notifications', _absolute=True)
-
- node = AbstractNode.load(uid)
- assert node, 'get_settings_url recieved an invalid Node id'
- return node.web_url_for('node_setting', _guid=True, _absolute=True)
-
-def fix_locale(locale):
- """Atempt to fix a locale to have the correct casing, e.g. de_de -> de_DE
-
- This is NOT guaranteed to return a valid locale identifier.
- """
- try:
- language, territory = locale.split('_', 1)
- except ValueError:
- return locale
- else:
- return '_'.join([language, territory.upper()])
-
-def localize_timestamp(timestamp, user):
- try:
- user_timezone = dates.get_timezone(user.timezone)
- except LookupError:
- user_timezone = dates.get_timezone('Etc/UTC')
-
- try:
- user_locale = Locale(user.locale)
- except core.UnknownLocaleError:
- user_locale = Locale('en')
-
- # Do our best to find a valid locale
- try:
- user_locale.date_formats
- except OSError: # An IOError will be raised if locale's casing is incorrect, e.g. de_de vs. de_DE
- # Attempt to fix the locale, e.g. de_de -> de_DE
- try:
- user_locale = Locale(fix_locale(user.locale))
- user_locale.date_formats
- except (core.UnknownLocaleError, OSError):
- user_locale = Locale('en')
-
- formatted_date = dates.format_date(timestamp, format='full', locale=user_locale)
- formatted_time = dates.format_time(timestamp, format='short', tzinfo=user_timezone, locale=user_locale)
-
- return f'{formatted_time} on {formatted_date}'
diff --git a/website/notifications/events/base.py b/website/notifications/events/base.py
index 00e93b46ed1..2d36e74ba15 100644
--- a/website/notifications/events/base.py
+++ b/website/notifications/events/base.py
@@ -64,7 +64,3 @@ def event_type(self):
Examples:
_file_updated"""
raise NotImplementedError
-
-
-class RegistryError(TypeError):
- pass
diff --git a/website/notifications/events/files.py b/website/notifications/events/files.py
index 95685b10b01..d88cf3441e4 100644
--- a/website/notifications/events/files.py
+++ b/website/notifications/events/files.py
@@ -13,7 +13,6 @@
register,
Event,
event_registry,
- RegistryError,
)
from osf.models import AbstractNode, NodeLog, Preprint, NotificationType
from addons.base.signals import file_updated as signal
@@ -24,7 +23,7 @@ def file_updated(self, target=None, user=None, event_type=None, payload=None):
if isinstance(target, Preprint):
return
if event_type not in event_registry:
- raise RegistryError
+ raise NotImplementedError(f'{event_type} not in {event_registry}')
event = event_registry[event_type](user, target, event_type, payload=payload)
event.perform()
diff --git a/website/notifications/exceptions.py b/website/notifications/exceptions.py
deleted file mode 100644
index 573a58164d3..00000000000
--- a/website/notifications/exceptions.py
+++ /dev/null
@@ -1,7 +0,0 @@
-from osf.exceptions import OSFError
-
-class InvalidSubscriptionError(OSFError):
- """Raised if an invalid subscription is attempted. e.g. attempt to
- subscribe to an invalid target: institution, bookmark, deleted project etc.
- """
- pass
diff --git a/website/notifications/listeners.py b/website/notifications/listeners.py
index ca9fdcd6807..2ed837308bb 100644
--- a/website/notifications/listeners.py
+++ b/website/notifications/listeners.py
@@ -1,7 +1,9 @@
import logging
from django.apps import apps
+from django.contrib.contenttypes.models import ContentType
+from osf.models import NotificationSubscription, NotificationType
from website.project.signals import contributor_added, project_created
from framework.auth.signals import user_confirmed
@@ -11,18 +13,36 @@
def subscribe_creator(resource):
if resource.is_collection or resource.is_deleted:
return None
- from website.notifications.utils import subscribe_user_to_notifications
- subscribe_user_to_notifications(resource, resource.creator)
+ user = resource.creator
+ if user.is_registered:
+ NotificationSubscription.objects.get_or_create(
+ user=user,
+ notification_type__name=NotificationType.Type.USER_FILE_UPDATED,
+ )
+ NotificationSubscription.objects.get_or_create(
+ user=user,
+ notification_type__name=NotificationType.Type.FILE_UPDATED,
+ object_id=resource.id,
+ content_type=ContentType.objects.get_for_model(resource)
+ )
@contributor_added.connect
def subscribe_contributor(resource, contributor, auth=None, *args, **kwargs):
- from website.notifications.utils import subscribe_user_to_notifications
from osf.models import Node
-
if isinstance(resource, Node):
if resource.is_collection or resource.is_deleted:
return None
- subscribe_user_to_notifications(resource, contributor)
+ if contributor.is_registered:
+ NotificationSubscription.objects.get_or_create(
+ user=contributor,
+ notification_type__name=NotificationType.Type.USER_FILE_UPDATED,
+ )
+ NotificationSubscription.objects.get_or_create(
+ user=contributor,
+ notification_type__name=NotificationType.Type.FILE_UPDATED,
+ object_id=resource.id,
+ content_type=ContentType.objects.get_for_model(resource)
+ )
@user_confirmed.connect
def subscribe_confirmed_user(user):
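
The reworked listeners above inline what subscribe_user_to_notifications used to do: for a registered user, ensure a user-level USER_FILE_UPDATED subscription and a node-level FILE_UPDATED subscription exist, creating them only if missing. A storage-free sketch of that idempotent bookkeeping, with plain tuples standing in for NotificationSubscription rows:

subscriptions = set()  # stand-in for NotificationSubscription rows

def ensure_subscriptions(user_id, node_id, user_is_registered=True):
    if not user_is_registered:
        return
    subscriptions.add((user_id, 'user_file_updated', None))   # user-level subscription
    subscriptions.add((user_id, 'file_updated', node_id))     # node-scoped subscription

ensure_subscriptions('alice', 'node-1')
ensure_subscriptions('alice', 'node-1')  # a second call adds nothing, mirroring get_or_create
assert len(subscriptions) == 2
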
diff --git a/website/notifications/utils.py b/website/notifications/utils.py
index fc565610777..1cb8f485866 100644
--- a/website/notifications/utils.py
+++ b/website/notifications/utils.py
@@ -1,66 +1,13 @@
-import collections
-
from django.apps import apps
from django.contrib.contenttypes.models import ContentType
from framework.postcommit_tasks.handlers import run_postcommit
-from osf.models import NotificationSubscription, NotificationType
-from osf.utils.permissions import READ
-from website.notifications import constants
-from website.notifications.exceptions import InvalidSubscriptionError
+from osf.models import NotificationSubscription
from website.project import signals
from framework.celery_tasks import app
-class NotificationsDict(dict):
- def __init__(self):
- super().__init__()
- self.update(messages=[], children=collections.defaultdict(NotificationsDict))
-
- def add_message(self, keys, messages):
- """
- :param keys: ordered list of project ids from parent to node (e.g. ['parent._id', 'node._id'])
- :param messages: built email message for an event that occurred on the node
- :return: nested dict with project/component ids as the keys with the message at the appropriate level
- """
- d_to_use = self
-
- for key in keys:
- d_to_use = d_to_use['children'][key]
-
- if not isinstance(messages, list):
- messages = [messages]
-
- d_to_use['messages'].extend(messages)
-
-
-def find_subscription_type(subscription):
- """Find subscription type string within specific subscription.
- Essentially removes extraneous parts of the string to get the type.
- """
- subs_available = constants.USER_SUBSCRIPTIONS_AVAILABLE
- subs_available.extend(list({
- 'file_updated': 'Files updated'
- }.keys()))
- for available in subs_available:
- if available in subscription:
- return available
-
-
-def to_subscription_key(uid, event):
- """Build the Subscription primary key for the given guid and event"""
- return f'{uid}_{event}'
-
-
-def from_subscription_key(key):
- parsed_key = key.split('_', 1)
- return {
- 'uid': parsed_key[0],
- 'event': parsed_key[1]
- }
-
-
@signals.contributor_removed.connect
def remove_contributor_from_subscriptions(node, user):
""" Remove contributor from node subscriptions unless the user is an
@@ -85,7 +32,7 @@ def remove_contributor_from_subscriptions(node, user):
)
for subscription in node_subscriptions:
- subscription.remove_user_from_subscription()
+ subscription.delete()
@signals.node_deleted.connect
@@ -118,344 +65,3 @@ def remove_supplemental_node_from_preprints(node_id):
if preprint.node is not None:
preprint.node = None
preprint.save()
-
-
-def separate_users(node, user_ids):
- """Separates users into ones with permissions and ones without given a list.
- :param node: Node to separate based on permissions
- :param user_ids: List of ids, will also take and return User instances
- :return: list of subbed, list of removed user ids
- """
- OSFUser = apps.get_model('osf.OSFUser')
- removed = []
- subbed = []
- for user_id in user_ids:
- try:
- user = OSFUser.load(user_id)
- except TypeError:
- user = user_id
- if node.has_permission(user, READ):
- subbed.append(user_id)
- else:
- removed.append(user_id)
- return subbed, removed
-
-
-def users_to_remove(source_event, source_node, new_node):
- """Find users that do not have permissions on new_node.
- :param source_event: such as _file_updated
- :param source_node: Node instance where a subscription currently resides
- :param new_node: Node instance where a sub or new sub will be.
- :return: Dict of notification type lists with user_ids
- """
- removed_users = {key: [] for key in constants.NOTIFICATION_TYPES}
- if source_node == new_node:
- return removed_users
- sub = NotificationSubscription.objects.get(
- object_id=source_node.id,
- content_type=ContentType.objects.get_for_model(source_node),
- notification_type__name=source_event
- )
- for notification_type in constants.NOTIFICATION_TYPES:
- users = []
- if hasattr(sub, notification_type):
- users += list(getattr(sub, notification_type).values_list('guids___id', flat=True))
- return removed_users
-
-
-def move_subscription(remove_users, source_event, source_node, new_event, new_node):
- """Moves subscription from old_node to new_node
- :param remove_users: dictionary of lists of users to remove from the subscription
- :param source_event: A specific guid event _file_updated
- :param source_node: Instance of Node
- :param new_event: A specific guid event
- :param new_node: Instance of Node
- :return: Returns a NOTIFICATION_TYPES list of removed users without permissions
- """
- NotificationSubscription = apps.get_model('osf.NotificationSubscription')
- OSFUser = apps.get_model('osf.OSFUser')
- if source_node == new_node:
- return
- old_sub = NotificationSubscription.load(to_subscription_key(source_node._id, source_event))
- if not old_sub:
- return
- elif old_sub:
- old_sub._id = to_subscription_key(new_node._id, new_event)
- old_sub.event_name = new_event
- old_sub.owner = new_node
- new_sub = old_sub
- new_sub.save()
- # Remove users that don't have permission on the new node.
- for notification_type in constants.NOTIFICATION_TYPES:
- if new_sub:
- for user_id in remove_users[notification_type]:
- related_manager = getattr(new_sub, notification_type, None)
- subscriptions = related_manager.all() if related_manager else []
- if user_id in subscriptions:
- user = OSFUser.load(user_id)
- new_sub.remove_user_from_subscription(user)
-
-
-def get_configured_projects(user):
- """Filter all user subscriptions for ones that are on parent projects
- and return the node objects.
- :param user: OSFUser object
- :return: list of node objects for projects with no parent
- """
- configured_projects = set()
- user_subscriptions = NotificationSubscription.objects.filter(
- user=user
- )
-
- for subscription in user_subscriptions:
- # If the user has opted out of emails skip
- node = subscription.owner
-
- if (
- (subscription.none.filter(id=user.id).exists() and not node.parent_id) or
- node._id not in user.notifications_configured
- ):
- continue
-
- root = node.root
-
- if not root.is_deleted:
- configured_projects.add(root)
-
- return sorted(configured_projects, key=lambda n: n.title.lower())
-
-def check_project_subscriptions_are_all_none(user, node):
- node_subscriptions = NotificationSubscription.objects.filter(
- user=user,
- user__isnull=True,
- object_id=node.id,
- content_type=ContentType.objects.get_for_model(node)
- )
-
- for s in node_subscriptions:
- if not s.none.filter(id=user.id).exists():
- return False
- return True
-
-def format_data(user, nodes):
- """ Format subscriptions data for project settings page
- :param user: OSFUser object
- :param nodes: list of parent project node objects
- :return: treebeard-formatted data
- """
- items = []
-
- for node in nodes:
- assert node, f'{node._id} is not a valid Node.'
-
- can_read = node.has_permission(user, READ)
- can_read_children = node.has_permission_on_children(user, READ)
-
- if not can_read and not can_read_children:
- continue
-
- children = node.get_nodes(**{'is_deleted': False, 'is_node_link': False})
- children_tree = []
- # List project/node if user has at least READ permissions (contributor or admin viewer) or if
- # user is contributor on a component of the project/node
-
- if can_read:
- subscriptions = NotificationSubscription.objects.filter(
- user=user,
- notification_type__name='file_updated',
- user__isnull=True,
- object_id=node.id,
- content_type=ContentType.objects.get_for_model(node)
- )
-
- for subscription in subscriptions:
- children_tree.append(
- serialize_event(user, subscription=subscription, node=node)
- )
- for node_sub in subscriptions:
- children_tree.append(serialize_event(user, node=node, event_description=node_sub))
- children_tree.sort(key=lambda s: s['event']['title'])
-
- children_tree.extend(format_data(user, children))
-
- item = {
- 'node': {
- 'id': node._id,
- 'url': node.url if can_read else '',
- 'title': node.title if can_read else 'Private Project',
- },
- 'children': children_tree,
- 'kind': 'folder' if not node.parent_node or not node.parent_node.has_permission(user, READ) else 'node',
- 'nodeType': node.project_or_component,
- 'category': node.category,
- 'permissions': {
- 'view': can_read,
- },
- }
-
- items.append(item)
-
- return items
-
-
-def format_user_subscriptions(user):
- """ Format user-level subscriptions (e.g. comment replies across the OSF) for user settings page"""
- user_subs_available = constants.USER_SUBSCRIPTIONS_AVAILABLE
- subscriptions = [
- serialize_event(
- user, subscription,
- event_description=user_subs_available.pop(user_subs_available.index(getattr(subscription, 'event_name')))
- )
- for subscription in NotificationSubscription.objects.get(user=user)
- if subscription is not None and getattr(subscription, 'event_name') in user_subs_available
- ]
- subscriptions.extend([serialize_event(user, event_description=sub) for sub in user_subs_available])
- return subscriptions
-
-
-def format_file_subscription(user, node_id, path, provider):
- """Format a single file event"""
- AbstractNode = apps.get_model('osf.AbstractNode')
- node = AbstractNode.load(node_id)
- wb_path = path.lstrip('/')
- subscriptions = NotificationSubscription.objects.filter(
- user=user,
- user__isnull=True,
- object_id=node.id,
- content_type=ContentType.objects.get_for_model(node)
- )
-
- for subscription in subscriptions:
- if wb_path in getattr(subscription, 'event_name'):
- return serialize_event(user, subscription, node)
- return serialize_event(user, node=node, event_description='file_updated')
-
-
-all_subs = ['file_updated']
-all_subs += constants.USER_SUBSCRIPTIONS_AVAILABLE
-
-def serialize_event(user, subscription=None, node=None, event_description=None):
- """
- :param user: OSFUser object
- :param subscription: Subscription object, use if parsing particular subscription
- :param node: Node object, use if node is known
- :param event_description: use if specific subscription is known
- :return: treebeard-formatted subscription event
- """
- if not event_description:
- event_description = getattr(subscription, 'event_name')
- # Looks at only the types available. Deals with pre-pending file names.
- for sub_type in all_subs:
- if sub_type in event_description:
- event_type = sub_type
- else:
- event_type = event_description
- if node and node.parent_node:
- notification_type = 'adopt_parent'
- elif event_type.startswith('global_'):
- notification_type = 'email_transactional'
- else:
- notification_type = 'none'
- if subscription:
- for n_type in constants.NOTIFICATION_TYPES:
- if getattr(subscription, n_type).filter(id=user.id).exists():
- notification_type = n_type
- return {
- 'event': {
- 'title': event_description,
- 'description': all_subs[event_type],
- 'notificationType': notification_type,
- 'parent_notification_type': get_parent_notification_type(node, event_type, user)
- },
- 'kind': 'event',
- 'children': []
- }
-
-
-def get_parent_notification_type(node, event, user):
- """
- Given an event on a node (e.g. comment on node 'xyz'), find the user's notification
- type on the parent project for the same event.
- :param obj node: event owner (Node or User object)
- :param str event: notification event (e.g. 'comment_replies')
- :param obj user: OSFUser object
- :return: str notification type (e.g. 'email_transactional')
- """
- AbstractNode = apps.get_model('osf.AbstractNode')
-
- if node and isinstance(node, AbstractNode) and node.parent_node and node.parent_node.has_permission(user, READ):
- parent = node.parent_node
- key = to_subscription_key(parent._id, event)
- try:
- subscription = NotificationSubscription.objects.get(_id=key)
- except NotificationSubscription.DoesNotExist:
- return get_parent_notification_type(parent, event, user)
-
- for notification_type in constants.NOTIFICATION_TYPES:
- if getattr(subscription, notification_type).filter(id=user.id).exists():
- return notification_type
- else:
- return get_parent_notification_type(parent, event, user)
- else:
- return None
-
-
-def get_global_notification_type(global_subscription, user):
- """
- Given a global subscription (e.g. NotificationSubscription object with event_type
- 'global_file_updated'), find the user's notification type.
- :param obj global_subscription: NotificationSubscription object
- :param obj user: OSFUser object
- :return: str notification type (e.g. 'email_transactional')
- """
- for notification_type in constants.NOTIFICATION_TYPES:
- # TODO Optimize me
- if getattr(global_subscription, notification_type).filter(id=user.id).exists():
- return notification_type
-
-
-def subscribe_user_to_notifications(node, user):
- """ Update the notification settings for the creator or contributors
- :param user: User to subscribe to notifications
- """
-
- if getattr(node, 'is_registration', False):
- raise InvalidSubscriptionError('Registrations are invalid targets for subscriptions')
-
- if user.is_registered:
- NotificationSubscription.objects.get_or_create(
- user=user,
- notification_type__name=NotificationType.Type.USER_FILE_UPDATED,
- )
- NotificationSubscription.objects.get_or_create(
- user=user,
- notification_type__name=NotificationType.Type.FILE_UPDATED,
- object_id=node.id,
- content_type=ContentType.objects.get_for_model(node)
- )
-
-
-def format_user_and_project_subscriptions(user):
- """ Format subscriptions data for user settings page. """
- return [
- {
- 'node': {
- 'id': user._id,
- 'title': 'Default Notification Settings',
- 'help': 'These are default settings for new projects you create ' +
- 'or are added to. Modifying these settings will not ' +
- 'modify settings on existing projects.'
- },
- 'kind': 'heading',
- 'children': format_user_subscriptions(user)
- },
- {
- 'node': {
- 'id': '',
- 'title': 'Project Notifications',
- 'help': 'These are settings for each of your projects. Modifying ' +
- 'these settings will only modify the settings for the selected project.'
- },
- 'kind': 'heading',
- 'children': format_data(user, get_configured_projects(user))
- }]
diff --git a/website/notifications/views.py b/website/notifications/views.py
deleted file mode 100644
index 09fb59a1260..00000000000
--- a/website/notifications/views.py
+++ /dev/null
@@ -1,106 +0,0 @@
-from rest_framework import status as http_status
-
-from flask import request
-
-from framework import sentry
-from framework.auth.decorators import must_be_logged_in
-from framework.exceptions import HTTPError
-
-from osf.models import AbstractNode, Registration, NotificationSubscription
-from osf.utils.permissions import READ
-from website.notifications import utils
-from website.notifications.constants import NOTIFICATION_TYPES
-from website.project.decorators import must_be_valid_project
-
-
-@must_be_logged_in
-def get_subscriptions(auth):
- return utils.format_user_and_project_subscriptions(auth.user)
-
-
-@must_be_logged_in
-@must_be_valid_project
-def get_node_subscriptions(auth, **kwargs):
- node = kwargs.get('node') or kwargs['project']
- return utils.format_data(auth.user, [node])
-
-
-@must_be_logged_in
-def get_file_subscriptions(auth, **kwargs):
- node_id = request.args.get('node_id')
- path = request.args.get('path')
- provider = request.args.get('provider')
- return utils.format_file_subscription(auth.user, node_id, path, provider)
-
-
-@must_be_logged_in
-def configure_subscription(auth):
- user = auth.user
- json_data = request.get_json()
- target_id = json_data.get('id')
- event = json_data.get('event')
- notification_type = json_data.get('notification_type')
- path = json_data.get('path')
- provider = json_data.get('provider')
-
- if not event or (notification_type not in NOTIFICATION_TYPES and notification_type != 'adopt_parent'):
- raise HTTPError(http_status.HTTP_400_BAD_REQUEST, data=dict(
- message_long='Must provide an event and notification type for subscription.')
- )
-
- node = AbstractNode.load(target_id)
- if 'file_updated' in event and path is not None and provider is not None:
- wb_path = path.lstrip('/')
- event = wb_path + '_file_updated'
- event_id = utils.to_subscription_key(target_id, event)
-
- if not node:
- # if target_id is not a node it currently must be the current user
- if not target_id == user._id:
- sentry.log_message(
- '{!r} attempted to subscribe to either a bad '
- 'id or non-node non-self id, {}'.format(user, target_id)
- )
- raise HTTPError(http_status.HTTP_404_NOT_FOUND)
-
- if notification_type == 'adopt_parent':
- sentry.log_message(
- f'{user!r} attempted to adopt_parent of a none node id, {target_id}'
- )
- raise HTTPError(http_status.HTTP_400_BAD_REQUEST)
- else:
- if not node.has_permission(user, READ):
- sentry.log_message(f'{user!r} attempted to subscribe to private node, {target_id}')
- raise HTTPError(http_status.HTTP_403_FORBIDDEN)
-
- if isinstance(node, Registration):
- sentry.log_message(
- f'{user!r} attempted to subscribe to registration, {target_id}'
- )
- raise HTTPError(http_status.HTTP_400_BAD_REQUEST)
-
- if 'file_updated' in event and len(event) > len('file_updated'):
- pass
- else:
- parent = node.parent_node
- if not parent:
- sentry.log_message(
- '{!r} attempted to adopt_parent of '
- 'the parentless project, {!r}'.format(user, node)
- )
- raise HTTPError(http_status.HTTP_400_BAD_REQUEST)
-
- subscription, _ = NotificationSubscription.objects.get_or_create(
- user=user,
- subscribed_object=node,
- notification_type__name=event
- )
- subscription.save()
-
- if node and node._id not in user.notifications_configured:
- user.notifications_configured[node._id] = True
- user.save()
-
- subscription.save()
-
- return {'message': f'Successfully subscribed to {notification_type} list on {event_id}'}
diff --git a/website/routes.py b/website/routes.py
index 1d03f538c31..7d728ea866c 100644
--- a/website/routes.py
+++ b/website/routes.py
@@ -56,7 +56,6 @@
from website.registries import views as registries_views
from website.reviews import views as reviews_views
from website.institutions import views as institution_views
-from website.notifications import views as notification_views
from website.ember_osf_web import views as ember_osf_web_views
from website.closed_challenges import views as closed_challenges_views
from website.identifiers import views as identifier_views
@@ -1707,23 +1706,6 @@ def make_url_map(app):
json_renderer,
),
- Rule(
- '/subscriptions/',
- 'get',
- notification_views.get_subscriptions,
- json_renderer,
- ),
-
- Rule(
- [
-            '/project/<pid>/subscriptions/',
-            '/project/<pid>/node/<nid>/subscriptions/'
- ],
- 'get',
- notification_views.get_node_subscriptions,
- json_renderer,
- ),
-
Rule(
[
             '/project/<pid>/tree/',
@@ -1733,14 +1715,6 @@ def make_url_map(app):
project_views.node.get_node_tree,
json_renderer,
),
-
- Rule(
- '/subscriptions/',
- 'post',
- notification_views.configure_subscription,
- json_renderer,
- ),
-
Rule(
[
             '/project/<pid>/settings/addons/',
From b0d052117bbd8695d50ca28df9fc0c369a76ba06 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Tue, 29 Jul 2025 16:12:21 -0400
Subject: [PATCH 141/336] fix addons logs
---
addons/base/views.py | 2 +-
website/notifications/events/files.py | 16 ++++++++++++----
2 files changed, 13 insertions(+), 5 deletions(-)
diff --git a/addons/base/views.py b/addons/base/views.py
index f91ae0ce2ce..af9fc52eeef 100644
--- a/addons/base/views.py
+++ b/addons/base/views.py
@@ -635,7 +635,7 @@ def create_waterbutler_log(payload, **kwargs):
user=user,
event_context={
'profile_image_url': user.profile_image_url(),
- 'localized_timestamp': timezone.now(),
+ 'localized_timestamp': str(timezone.now()),
'user_fullname': user.fullname,
'url': node.absolute_url,
}
diff --git a/website/notifications/events/files.py b/website/notifications/events/files.py
index d88cf3441e4..c067865c0d2 100644
--- a/website/notifications/events/files.py
+++ b/website/notifications/events/files.py
@@ -19,12 +19,20 @@
@signal.connect
-def file_updated(self, target=None, user=None, event_type=None, payload=None):
+def file_updated(self, target=None, user=None, payload=None):
+ notification_type = {
+ 'rename': NotificationType.Type.ADDON_FILE_RENAMED,
+ 'copy': NotificationType.Type.ADDON_FILE_COPIED,
+ 'create': NotificationType.Type.FILE_UPDATED,
+ 'move': NotificationType.Type.ADDON_FILE_MOVED,
+ 'delete': NotificationType.Type.FILE_REMOVED,
+ }[payload.get('action')]
if isinstance(target, Preprint):
return
- if event_type not in event_registry:
- raise NotImplementedError(f' {event_type} not in {event_registry}')
- event = event_registry[event_type](user, target, event_type, payload=payload)
+
+ if notification_type not in event_registry:
+ raise NotImplementedError(f' {notification_type} not in {event_registry}')
+ event = event_registry[notification_type](user, target, notification_type, payload=payload)
event.perform()
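The rewritten handler above drops the explicit event_type argument and instead keys off the payload's 'action' field. A self-contained sketch of that dispatch pattern, using stub names in place of the real NotificationType model (the payload shape is assumed, not taken from the patch):

    from enum import Enum

    class NotificationTypeStub(str, Enum):
        # Stand-ins for the NotificationType.Type members referenced in the patch.
        ADDON_FILE_RENAMED = 'addon_file_renamed'
        FILE_UPDATED = 'file_updated'
        FILE_REMOVED = 'file_removed'

    ACTION_TO_TYPE = {
        'rename': NotificationTypeStub.ADDON_FILE_RENAMED,
        'create': NotificationTypeStub.FILE_UPDATED,
        'delete': NotificationTypeStub.FILE_REMOVED,
    }

    def dispatch(payload):
        # Mirrors the patched handler: a plain dict lookup, so an unrecognised
        # action raises KeyError instead of being silently ignored.
        return ACTION_TO_TYPE[payload.get('action')]

    assert dispatch({'action': 'rename'}) is NotificationTypeStub.ADDON_FILE_RENAMED

A plain dict lookup means an unrecognised action fails loudly with a KeyError; the next patch in this series extends the same mapping with an 'update' action.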
From d77eed27875e0852347138cf3df69fa9b350a204 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 11:16:42 -0400
Subject: [PATCH 142/336] clean up claim new user email throttle
---
osf/models/notification_subscription.py | 23 +++++++++-----
osf/models/notification_type.py | 34 ++++++--------------
tests/test_addons.py | 14 +++------
tests/test_claim_views.py | 16 ++++++----
website/notifications/events/files.py | 14 ++++++---
website/project/views/contributor.py | 42 +++++++++++++++----------
6 files changed, 75 insertions(+), 68 deletions(-)
diff --git a/osf/models/notification_subscription.py b/osf/models/notification_subscription.py
index 41b88ba9ea2..12e427b9e30 100644
--- a/osf/models/notification_subscription.py
+++ b/osf/models/notification_subscription.py
@@ -24,7 +24,6 @@ class NotificationSubscription(BaseModel):
max_length=500,
null=True
)
-
content_type = models.ForeignKey(ContentType, null=True, blank=True, on_delete=models.CASCADE)
object_id = models.CharField(max_length=255, null=True, blank=True)
subscribed_object = GenericForeignKey('content_type', 'object_id')
@@ -52,19 +51,29 @@ class Meta:
verbose_name = 'Notification Subscription'
verbose_name_plural = 'Notification Subscriptions'
- def emit(self, event_context=None):
+ def emit(
+ self,
+ event_context=None,
+ destination_address=None,
+ email_context=None,
+ ):
"""Emit a notification to a user by creating Notification and NotificationSubscription objects.
Args:
- user (OSFUser): The recipient of the notification.
- subscribed_object (optional): The object the subscription is related to.
- event_context (dict, optional): Context for rendering the notification template.
+            event_context (dict, optional): The info used as context for rendering the template.
+            destination_address (str, optional): Overrides the user's email address for the notification.
+                Useful for sending to a test address or to OSF desk support.
+            email_context (dict, optional): Context for sending the email (bcc, reply_to headers, etc.).
"""
if self.message_frequency == 'instantly':
- Notification.objects.create(
+ notification = Notification.objects.create(
subscription=self,
event_context=event_context
- ).send()
+ )
+ notification.send(
+ destination_address=destination_address,
+ email_context=email_context,
+ )
else:
Notification.objects.create(
subscription=self,
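A hedged sketch of how the widened emit() signature might be called; 'subscription' stands for a saved NotificationSubscription whose message_frequency is 'instantly', and the address and context values are invented for illustration:

    # Hypothetical call site; 'subscription' is assumed to be an existing
    # NotificationSubscription instance with message_frequency == 'instantly'.
    subscription.emit(
        event_context={'user_fullname': 'Ada Lovelace', 'node_title': 'Example Project'},
        destination_address='desk@example.org',      # overrides the subscriber's address
        email_context={'reply_to': 'noreply@example.org'},
    )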
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 55fe70883df..6d9fe407d93 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -2,21 +2,8 @@
from django.contrib.postgres.fields import ArrayField
from django.contrib.contenttypes.models import ContentType
-from osf.models.notification import Notification
from enum import Enum
-
-class FrequencyChoices(Enum):
- NONE = 'none'
- INSTANTLY = 'instantly'
- DAILY = 'daily'
- WEEKLY = 'weekly'
- MONTHLY = 'monthly'
-
- @classmethod
- def choices(cls):
- return [(key.value, key.name.capitalize()) for key in cls]
-
def get_default_frequency_choices():
DEFAULT_FREQUENCY_CHOICES = ['none', 'instantly', 'daily', 'weekly', 'monthly']
return DEFAULT_FREQUENCY_CHOICES.copy()
@@ -225,18 +212,17 @@ def emit(
subscription, created = NotificationSubscription.objects.get_or_create(
notification_type=self,
user=user,
- content_type=ContentType.objects.get_for_model(subscribed_object) if subscribed_object else None,
- object_id=subscribed_object.pk if subscribed_object else None,
- defaults={'message_frequency': message_frequency},
+ defaults={
+ 'object_id': subscribed_object.pk if subscribed_object else None,
+ 'message_frequency': message_frequency,
+ 'content_type': ContentType.objects.get_for_model(subscribed_object) if subscribed_object else None,
+ },
+ )
+ subscription.emit(
+ destination_address=destination_address,
+ event_context=event_context,
+ email_context=email_context,
)
- if subscription.message_frequency == 'instantly':
- Notification.objects.create(
- subscription=subscription,
- event_context=event_context
- ).send(
- destination_address=destination_address,
- email_context=email_context
- )
def add_user_to_subscription(self, user, *args, **kwargs):
"""
diff --git a/tests/test_addons.py b/tests/test_addons.py
index f8421f2bd74..aaf4de9cc6c 100644
--- a/tests/test_addons.py
+++ b/tests/test_addons.py
@@ -350,18 +350,13 @@ def build_payload_with_dest(self, destination, **kwargs):
'signature': signature,
}
- @mock.patch('website.notifications.events.files.FileAdded.perform')
- def test_add_log(self, mock_perform):
- path = 'pizza'
+ def test_add_log(self):
url = self.node.api_url_for('create_waterbutler_log')
- payload = self.build_payload(metadata={'nid': self.node._id, 'path': path})
+ payload = self.build_payload(metadata={'nid': self.node._id, 'path': 'pizza'})
nlogs = self.node.logs.count()
self.app.put(url, json=payload)
self.node.reload()
assert self.node.logs.count() == nlogs + 1
- # # Mocking form_message and perform so that the payload need not be exact.
- # assert mock_form_message.called, "form_message not called"
- assert mock_perform.called, 'perform not called'
def test_add_log_missing_args(self):
path = 'pizza'
@@ -1542,13 +1537,14 @@ def test_resolve_folder_raise(self):
def test_delete_action_creates_trashed_file_node(self):
file_node = self.get_test_file()
payload = {
+ 'action': 'file_removed',
'provider': file_node.provider,
'metadata': {
'path': '/test/Test',
'materialized': '/test/Test'
}
}
- views.addon_delete_file_node(self=None, target=self.project, user=self.user, event_type='file_removed', payload=payload)
+ views.addon_delete_file_node(self=None, target=self.project, user=self.user, payload=payload)
assert not GithubFileNode.load(file_node._id)
assert TrashedFileNode.load(file_node._id)
@@ -1568,7 +1564,7 @@ def test_delete_action_for_folder_deletes_subfolders_and_creates_trashed_file_no
'materialized': '/test/'
}
}
- views.addon_delete_file_node(self=None, target=self.project, user=self.user, event_type='file_removed', payload=payload)
+ views.addon_delete_file_node(self=None, target=self.project, user=self.user, payload=payload)
assert not GithubFileNode.load(subfolder._id)
assert TrashedFileNode.load(file_node._id)
diff --git a/tests/test_claim_views.py b/tests/test_claim_views.py
index 025aa1a53eb..8d8986bbd10 100644
--- a/tests/test_claim_views.py
+++ b/tests/test_claim_views.py
@@ -228,13 +228,17 @@ def test_send_claim_registered_email_before_throttle_expires(self):
unclaimed_user=self.user,
node=self.project,
)
- # second call raises error because it was called before throttle period
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.USER_FORWARD_INVITE_REGISTERED
+ assert notifications[1]['type'] == NotificationType.Type.USER_PENDING_VERIFICATION_REGISTERED
+ # second call raises error because it was called before throttle period
+ with capture_notifications() as notifications:
with pytest.raises(HTTPError):
- send_claim_registered_email(
- claimer=reg_user,
- unclaimed_user=self.user,
- node=self.project,
- )
+ send_claim_registered_email(
+ claimer=reg_user,
+ unclaimed_user=self.user,
+ node=self.project,
+ )
assert not notifications
@mock.patch('website.project.views.contributor.send_claim_registered_email')
diff --git a/website/notifications/events/files.py b/website/notifications/events/files.py
index c067865c0d2..869a3d9c53d 100644
--- a/website/notifications/events/files.py
+++ b/website/notifications/events/files.py
@@ -20,20 +20,24 @@
@signal.connect
def file_updated(self, target=None, user=None, payload=None):
+ if isinstance(target, Preprint):
+ return
notification_type = {
'rename': NotificationType.Type.ADDON_FILE_RENAMED,
'copy': NotificationType.Type.ADDON_FILE_COPIED,
'create': NotificationType.Type.FILE_UPDATED,
'move': NotificationType.Type.ADDON_FILE_MOVED,
'delete': NotificationType.Type.FILE_REMOVED,
+ 'update': NotificationType.Type.FILE_UPDATED,
}[payload.get('action')]
- if isinstance(target, Preprint):
- return
-
if notification_type not in event_registry:
raise NotImplementedError(f' {notification_type} not in {event_registry}')
- event = event_registry[notification_type](user, target, notification_type, payload=payload)
- event.perform()
+ event_registry[notification_type](
+ user,
+ target,
+ notification_type,
+ payload=payload
+ ).perform()
class FileEvent(Event):
diff --git a/website/project/views/contributor.py b/website/project/views/contributor.py
index d0a217d1cc1..76b0dc938fb 100644
--- a/website/project/views/contributor.py
+++ b/website/project/views/contributor.py
@@ -210,14 +210,20 @@ def deserialize_contributors(node, user_dicts, auth, validate=False):
@unreg_contributor_added.connect
-def finalize_invitation(node, contributor, auth, email_template='default'):
+def finalize_invitation(node, contributor, auth, notification_type='default'):
try:
record = contributor.get_unclaimed_record(node._primary_key)
except ValueError:
pass
else:
if record['email']:
- send_claim_email(record['email'], contributor, node, notify=True, email_template=email_template)
+ send_claim_email(
+ record['email'],
+ contributor,
+ node,
+ notify=True,
+ notification_type=notification_type
+ )
@must_be_valid_project
@@ -404,8 +410,11 @@ def send_claim_registered_email(claimer, unclaimed_user, node, throttle=24 * 360
unclaimed_record = unclaimed_user.get_unclaimed_record(node._primary_key)
# check throttle
- timestamp = unclaimed_record.get('last_sent')
- if not throttle_period_expired(timestamp, throttle):
+ if check_email_throttle(
+ contributor=claimer,
+ notification_type=NotificationType.Type.USER_FORWARD_INVITE_REGISTERED,
+ throttle=throttle
+ ):
raise HTTPError(http_status.HTTP_400_BAD_REQUEST, data=dict(
message_long='User account can only be claimed with an existing user once every 24 hours'
))
@@ -552,13 +561,16 @@ def send_claim_email(
)
-def check_email_throttle(node, contributor, notification_type):
+def check_email_throttle(
+ contributor,
+ notification_type,
+ throttle=settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE
+):
"""
Check whether a 'contributor added' notification was sent recently
(within the throttle period) for the given node and contributor.
Args:
- node (AbstractNode): The node to check.
contributor (OSFUser): The contributor being notified.
notification_type (str, optional): What type of notification to check for.
@@ -566,25 +578,21 @@ def check_email_throttle(node, contributor, notification_type):
bool: True if throttled (email was sent recently), False otherwise.
"""
from osf.models import Notification, NotificationSubscription
- from website import settings
from datetime import timedelta
# Check for an active subscription for this contributor and this node
subscription, create = NotificationSubscription.objects.get_or_create(
user=contributor,
- notification_type__name=notification_type,
+ notification_type=NotificationType.objects.get(name=notification_type),
)
if create:
return False # No subscription means no previous notifications, so no throttling
# Check the most recent Notification for this subscription
- last_notification = Notification.objects.filter(
- subscription=subscription,
- sent__isnull=False
- ).order_by('-sent').first()
-
- if last_notification and last_notification.sent:
- cutoff_time = timezone.now() - timedelta(seconds=settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE)
- return last_notification.sent > cutoff_time
+ last_notification = Notification.objects.filter(subscription=subscription).last()
+ if last_notification:
+ cutoff_time = timezone.now() - timedelta(seconds=throttle)
+ if last_notification.sent:
+ return last_notification.sent > cutoff_time
return False # No previous sent notification, not throttled
@@ -625,7 +633,7 @@ def notify_added_contributor(node, contributor, notification_type, auth=None, *a
else:
raise NotImplementedError(f'notification_type: {notification_type} not implemented.')
- if check_email_throttle(node, contributor, notification_type):
+ if check_email_throttle(contributor, notification_type):
return
NotificationType.objects.get(name=notification_type).emit(
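The throttle check introduced here reduces to a timestamp comparison against a cutoff. A self-contained sketch of that comparison with the model lookup stripped away (names simplified, not the actual OSF code):

    from datetime import datetime, timedelta, timezone

    def is_throttled(last_sent, throttle_seconds, now=None):
        # Mirrors the cutoff logic in check_email_throttle: a notification
        # sent after (now - throttle) means we are still inside the window.
        if last_sent is None:
            return False
        now = now or datetime.now(timezone.utc)
        cutoff = now - timedelta(seconds=throttle_seconds)
        return last_sent > cutoff

    now = datetime.now(timezone.utc)
    assert is_throttled(now - timedelta(hours=1), 24 * 3600, now=now)       # inside window -> throttled
    assert not is_throttled(now - timedelta(days=2), 24 * 3600, now=now)    # window expired -> send again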
From 74f42bf0a9b76c4e5e29f7e41dabd5d72af43d22 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 13:46:01 -0400
Subject: [PATCH 143/336] clean-up contributor throttle tests
---
tests/test_adding_contributor_views.py | 46 ++++++++++++++++++--------
website/project/views/contributor.py | 24 +++-----------
2 files changed, 38 insertions(+), 32 deletions(-)
diff --git a/tests/test_adding_contributor_views.py b/tests/test_adding_contributor_views.py
index bb59a2eeef5..a2f6d31c33e 100644
--- a/tests/test_adding_contributor_views.py
+++ b/tests/test_adding_contributor_views.py
@@ -324,12 +324,22 @@ def test_notify_contributor_email_does_not_send_before_throttle_expires(self):
project = ProjectFactory()
auth = Auth(project.creator)
with capture_notifications() as notifications:
- notify_added_contributor(project, contributor, NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT, auth)
+ notify_added_contributor(
+ project,
+ contributor,
+ notification_type=NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
+ auth=auth
+ )
assert len(notifications) == 1
# 2nd call does not send email because throttle period has not expired
with capture_notifications() as notifications:
- notify_added_contributor(project, contributor, NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT, auth)
+ notify_added_contributor(
+ project,
+ contributor,
+ notification_type=NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
+ auth=auth
+ )
assert not notifications
def test_notify_contributor_email_sends_after_throttle_expires(self):
@@ -338,17 +348,27 @@ def test_notify_contributor_email_sends_after_throttle_expires(self):
contributor = UserFactory()
project = ProjectFactory()
auth = Auth(project.creator)
- with mock.patch.object(settings, 'CONTRIBUTOR_ADDED_EMAIL_THROTTLE', 1):
- with capture_notifications() as notifications:
- notify_added_contributor(project, contributor, NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT, auth, throttle=throttle)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
-
- time.sleep(settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE) # throttle period expires
- with capture_notifications() as notifications:
- notify_added_contributor(project, contributor, NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT, auth, throttle=throttle)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ with capture_notifications() as notifications:
+ notify_added_contributor(
+ project,
+ contributor,
+ NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
+ auth,
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+
+ time.sleep(2) # throttle period expires
+ with capture_notifications() as notifications:
+ notify_added_contributor(
+ project,
+ contributor,
+ NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
+ auth,
+ throttle=1
+ )
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
def test_add_contributor_to_fork_sends_email(self):
contributor = UserFactory()
diff --git a/website/project/views/contributor.py b/website/project/views/contributor.py
index 76b0dc938fb..d6f4072a5de 100644
--- a/website/project/views/contributor.py
+++ b/website/project/views/contributor.py
@@ -588,11 +588,11 @@ def check_email_throttle(
if create:
return False # No subscription means no previous notifications, so no throttling
# Check the most recent Notification for this subscription
- last_notification = Notification.objects.filter(subscription=subscription).last()
+ last_notification = Notification.objects.filter(subscription=subscription).order_by('created').last()
+
if last_notification:
cutoff_time = timezone.now() - timedelta(seconds=throttle)
- if last_notification.sent:
- return last_notification.sent > cutoff_time
+ return last_notification.created > cutoff_time
return False # No previous sent notification, not throttled
@@ -614,26 +614,12 @@ def notify_added_contributor(node, contributor, notification_type, auth=None, *a
if not notification_type:
return
- notification_type = notification_type or NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
logo = settings.OSF_LOGO
-
- if notification_type == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT:
- pass
- elif notification_type == NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT:
- pass
- elif notification_type == NotificationType.Type.DRAFT_REGISTRATION_CONTRIBUTOR_ADDED_DEFAULT:
- pass
- elif notification_type == NotificationType.Type.USER_CONTRIBUTOR_ADDED_ACCESS_REQUEST:
- pass
- elif notification_type == NotificationType.Type.NODE_INSTITUTIONAL_ACCESS_REQUEST:
- pass
- elif getattr(node, 'has_linked_published_preprints', None):
+ if getattr(node, 'has_linked_published_preprints', None):
notification_type = NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_PREPRINT_NODE_FROM_OSF
logo = settings.OSF_PREPRINTS_LOGO
- else:
- raise NotImplementedError(f'notification_type: {notification_type} not implemented.')
- if check_email_throttle(contributor, notification_type):
+ if check_email_throttle(contributor, notification_type, throttle=kwargs.get('throttle')):
return
NotificationType.objects.get(name=notification_type).emit(
From 6eacd4d498eb8a2d18d51b920b43ada392259cb0 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 13:51:44 -0400
Subject: [PATCH 144/336] fix reporter and preprint tests
---
notifications.yaml | 9 +++------
.../reporters/test_institutional_users_reporter.py | 2 ++
website/project/views/contributor.py | 3 ++-
3 files changed, 7 insertions(+), 7 deletions(-)
diff --git a/notifications.yaml b/notifications.yaml
index 62f636b8546..8537e269fa5 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -220,10 +220,6 @@ notification_types:
__docs__: ...
object_content_type_model_name: abstractprovider
template: 'website/templates/emails/contributor_added_preprints.html.mako'
- - name: provider_reviews_submission_confirmation
- __docs__: ...
- object_content_type_model_name: abstractprovider
- template: 'website/templates/emails/reviews_submission_confirmation.html.mako'
- name: provider_confirm_email_moderation
subject: 'OSF Account Verification, {provider.name}'
__docs__: ...
@@ -318,10 +314,11 @@ notification_types:
template: 'website/templates/emails/updates_rejected.html.mako'
#### PREPRINT
- - name: pending_retraction_admin
+ - name: preprint_contributor_added_preprint_node_from_osf
+ subject: 'You have been added as a contributor to an OSF project.'
__docs__: ...
object_content_type_model_name: preprint
- template: 'website/templates/emails/pending_retraction_admin.html.mako'
+ template: 'website/templates/emails/contributor_added_preprint_node_from_osf.html.mako'
- name: preprint_request_withdrawal_approved
__docs__: ...
object_content_type_model_name: preprint
diff --git a/osf_tests/metrics/reporters/test_institutional_users_reporter.py b/osf_tests/metrics/reporters/test_institutional_users_reporter.py
index 275fcb1e8a1..e399d848396 100644
--- a/osf_tests/metrics/reporters/test_institutional_users_reporter.py
+++ b/osf_tests/metrics/reporters/test_institutional_users_reporter.py
@@ -7,6 +7,7 @@
from api_tests.utils import create_test_file
from osf import models as osfdb
+from osf.management.commands.populate_notification_types import populate_notification_types
from osf.metrics.reports import InstitutionalUserReport
from osf.metrics.reporters import InstitutionalUsersReporter
from osf.metrics.utils import YearMonth
@@ -28,6 +29,7 @@ def _patch_now(fakenow: datetime.datetime):
class TestInstiUsersReporter(TestCase):
@classmethod
def setUpTestData(cls):
+ populate_notification_types()
cls._yearmonth = YearMonth(2012, 7)
cls._now = datetime.datetime(
cls._yearmonth.year,
diff --git a/website/project/views/contributor.py b/website/project/views/contributor.py
index d6f4072a5de..80ac1e2cb21 100644
--- a/website/project/views/contributor.py
+++ b/website/project/views/contributor.py
@@ -619,7 +619,8 @@ def notify_added_contributor(node, contributor, notification_type, auth=None, *a
notification_type = NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_PREPRINT_NODE_FROM_OSF
logo = settings.OSF_PREPRINTS_LOGO
- if check_email_throttle(contributor, notification_type, throttle=kwargs.get('throttle')):
+ throttle = kwargs.get('throttle', settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE)
+ if check_email_throttle(contributor, notification_type, throttle=throttle):
return
NotificationType.objects.get(name=notification_type).emit(
From 59c5c2f98793adf22d2a745c118501cb4dd2eb53 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 16:40:08 -0400
Subject: [PATCH 145/336] fix throttle
---
api_tests/users/views/test_user_claim.py | 13 +++++++--
website/project/views/contributor.py | 37 +++++++++++++-----------
2 files changed, 30 insertions(+), 20 deletions(-)
diff --git a/api_tests/users/views/test_user_claim.py b/api_tests/users/views/test_user_claim.py
index ddd7cfad4e5..01079aff6c9 100644
--- a/api_tests/users/views/test_user_claim.py
+++ b/api_tests/users/views/test_user_claim.py
@@ -1,5 +1,4 @@
import pytest
-from django.utils import timezone
from api.base.settings.defaults import API_BASE
from api.users.views import ClaimUser
@@ -217,8 +216,16 @@ def test_claim_auth_failure(self, app, url, claimer, wrong_preprint, project, un
assert res.status_code == 403
def test_claim_auth_throttle_error(self, app, url, claimer, unreg_user, project):
- unreg_user.unclaimed_records[project._id]['last_sent'] = timezone.now()
- unreg_user.save()
+ with capture_notifications() as notifications:
+ app.post_json_api(
+ url.format(unreg_user._id),
+ self.payload(id=project._id),
+ auth=claimer.auth,
+ expect_errors=True
+ )
+ assert len(notifications) == 2
+ assert notifications[0]['type'] == NotificationType.Type.USER_FORWARD_INVITE_REGISTERED
+ assert notifications[1]['type'] == NotificationType.Type.USER_PENDING_VERIFICATION_REGISTERED
with capture_notifications() as notifications:
res = app.post_json_api(
url.format(unreg_user._id),
diff --git a/website/project/views/contributor.py b/website/project/views/contributor.py
index 80ac1e2cb21..e86d4bcd7ca 100644
--- a/website/project/views/contributor.py
+++ b/website/project/views/contributor.py
@@ -408,16 +408,7 @@ def send_claim_registered_email(claimer, unclaimed_user, node, throttle=24 * 360
"""
unclaimed_record = unclaimed_user.get_unclaimed_record(node._primary_key)
-
# check throttle
- if check_email_throttle(
- contributor=claimer,
- notification_type=NotificationType.Type.USER_FORWARD_INVITE_REGISTERED,
- throttle=throttle
- ):
- raise HTTPError(http_status.HTTP_400_BAD_REQUEST, data=dict(
- message_long='User account can only be claimed with an existing user once every 24 hours'
- ))
# roll the valid token for each email, thus user cannot change email and approve a different email address
verification_key = generate_verification_key(verification_type='claim')
@@ -434,6 +425,17 @@ def send_claim_registered_email(claimer, unclaimed_user, node, throttle=24 * 360
token=unclaimed_record['token'],
_absolute=True,
)
+ if check_email_throttle(
+ referrer,
+ notification_type=NotificationType.Type.USER_FORWARD_INVITE_REGISTERED,
+ throttle=throttle
+ ):
+ raise HTTPError(
+ http_status.HTTP_400_BAD_REQUEST,
+ data=dict(
+ message_long='User account can only be claimed with an existing user once every 24 hours'
+ )
+ )
# Send mail to referrer, telling them to forward verification link to claimer
NotificationType.objects.get(
@@ -562,7 +564,7 @@ def send_claim_email(
def check_email_throttle(
- contributor,
+ user,
notification_type,
throttle=settings.CONTRIBUTOR_ADDED_EMAIL_THROTTLE
):
@@ -571,25 +573,24 @@ def check_email_throttle(
(within the throttle period) for the given node and contributor.
Args:
- contributor (OSFUser): The contributor being notified.
+ user (OSFUser): The contributor being notified.
notification_type (str, optional): What type of notification to check for.
Returns:
bool: True if throttled (email was sent recently), False otherwise.
"""
from osf.models import Notification, NotificationSubscription
-
from datetime import timedelta
# Check for an active subscription for this contributor and this node
- subscription, create = NotificationSubscription.objects.get_or_create(
- user=contributor,
+ subscription = NotificationSubscription.objects.filter(
+ user=user,
notification_type=NotificationType.objects.get(name=notification_type),
)
- if create:
+ if not subscription:
return False # No subscription means no previous notifications, so no throttling
# Check the most recent Notification for this subscription
+ subscription = subscription.get()
last_notification = Notification.objects.filter(subscription=subscription).order_by('created').last()
-
if last_notification:
cutoff_time = timezone.now() - timedelta(seconds=throttle)
return last_notification.created > cutoff_time
@@ -623,7 +624,9 @@ def notify_added_contributor(node, contributor, notification_type, auth=None, *a
if check_email_throttle(contributor, notification_type, throttle=throttle):
return
- NotificationType.objects.get(name=notification_type).emit(
+ NotificationType.objects.get(
+ name=notification_type
+ ).emit(
user=contributor,
event_context={
'user': contributor.id,
From 3c6dfa80b4b64df1518c72d38a49f6e401c3fae7 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 17:22:51 -0400
Subject: [PATCH 146/336] fix addons and campaign tests
---
framework/auth/campaigns.py | 28 +++++++++++++++++++---------
framework/auth/views.py | 3 ++-
osf/models/notification_type.py | 7 +++++++
tests/test_addons.py | 1 +
4 files changed, 29 insertions(+), 10 deletions(-)
diff --git a/framework/auth/campaigns.py b/framework/auth/campaigns.py
index 74445e6c259..6b484e9ae18 100644
--- a/framework/auth/campaigns.py
+++ b/framework/auth/campaigns.py
@@ -3,8 +3,8 @@
from django.utils import timezone
-from website import mails, settings
-from osf.models import PreprintProvider
+from website import settings
+from osf.models import PreprintProvider, NotificationType
from website.settings import DOMAIN, CAMPAIGN_REFRESH_THRESHOLD
from website.util.metrics import OsfSourceTags, OsfClaimedTags, CampaignSourceTags, CampaignClaimedTags, provider_source_tag
from framework.utils import throttle_period_expired
@@ -26,7 +26,7 @@ def get_campaigns():
'erpc': {
'system_tag': CampaignSourceTags.ErpChallenge.value,
'redirect_url': furl(DOMAIN).add(path='erpc/').url,
- 'confirmation_email_template': mails.CONFIRM_EMAIL_ERPC,
+ 'confirmation_email_template': NotificationType.Type.USER_CAMPAIGN_CONFIRM_EMAIL_ERPC,
'login_type': 'native',
},
}
@@ -44,12 +44,13 @@ def get_campaigns():
preprint_providers = PreprintProvider.objects.all()
for provider in preprint_providers:
if provider._id == 'osf':
- template = 'osf'
+ confirmation_email_template = NotificationType.Type.USER_CAMPAIGN_CONFIRM_PREPRINTS_OSF
name = 'OSF'
url_path = 'preprints/'
external_url = None
else:
- template = 'branded'
+ confirmation_email_template = NotificationType.Type.USER_CAMPAIGN_CONFIRM_PREPRINTS_BRANDED
+
name = provider.name
url_path = f'preprints/{provider._id}'
external_url = provider.domain
@@ -60,7 +61,7 @@ def get_campaigns():
'system_tag': system_tag,
'redirect_url': furl(DOMAIN).add(path=url_path).url,
'external_url': external_url,
- 'confirmation_email_template': mails.CONFIRM_EMAIL_PREPRINTS(template, name),
+ 'confirmation_email_template': confirmation_email_template,
'login_type': 'proxy',
'provider': name,
'logo': provider._id if name != 'OSF' else settings.OSF_PREPRINTS_LOGO,
@@ -73,7 +74,7 @@ def get_campaigns():
'osf-registries': {
'system_tag': provider_source_tag('osf', 'registry'),
'redirect_url': furl(DOMAIN).add(path='registries/').url,
- 'confirmation_email_template': mails.CONFIRM_EMAIL_REGISTRIES_OSF,
+ 'confirmation_email_template': None,
'login_type': 'proxy',
'provider': 'osf',
'logo': settings.OSF_REGISTRIES_LOGO
@@ -84,18 +85,27 @@ def get_campaigns():
'osf-registered-reports': {
'system_tag': CampaignSourceTags.OsfRegisteredReports.value,
'redirect_url': furl(DOMAIN).add(path='rr/').url,
- 'confirmation_email_template': mails.CONFIRM_EMAIL_REGISTRIES_OSF,
+ 'confirmation_email_template': NotificationType.Type.USER_CAMPAIGN_CONFIRM_EMAIL_REGISTRIES_OSF,
'login_type': 'proxy',
'provider': 'osf',
'logo': settings.OSF_REGISTRIES_LOGO
}
})
+ newest_campaigns.update({
+ 'agu_conference_2023': {
+ 'system_tag': CampaignSourceTags.AguConference2023.value,
+ 'redirect_url': furl(DOMAIN).add(path='dashboard/').url,
+ 'confirmation_email_template': NotificationType.Type.USER_CAMPAIGN_CONFIRM_EMAIL_AGU_CONFERENCE_2023,
+ 'login_type': 'native',
+ }
+ })
+
newest_campaigns.update({
'agu_conference': {
'system_tag': CampaignSourceTags.AguConference.value,
'redirect_url': furl(DOMAIN).add(path='dashboard/').url,
- 'confirmation_email_template': mails.CONFIRM_EMAIL_AGU_CONFERENCE,
+ 'confirmation_email_template': NotificationType.Type.USER_CAMPAIGN_CONFIRM_EMAIL_AGU_CONFERENCE,
'login_type': 'native',
}
})
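With this change get_campaigns() stores NotificationType.Type members (plain strings naming the notification) where it previously stored mails.Mail objects. A rough illustration of the shape a single entry now takes; the tag, URL, and values below are stand-ins, not copied from the real settings:

    # Stand-in entry; real values come from get_campaigns() at runtime.
    erpc_campaign = {
        'system_tag': 'source:campaign|erpc',  # hypothetical tag string
        'redirect_url': 'https://osf.example/erpc/',
        'confirmation_email_template': 'user_campaign_confirm_email_erpc',
        'login_type': 'native',
    }
    # Callers that previously passed a mails.Mail object to send_mail() now
    # look the NotificationType up by this name and call .emit() on it.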
diff --git a/framework/auth/views.py b/framework/auth/views.py
index 81b362532e9..35e913949c9 100644
--- a/framework/auth/views.py
+++ b/framework/auth/views.py
@@ -844,6 +844,8 @@ def send_confirm_email(user, email, renew=False, external_id_provider=None, exte
notification_type = NotificationType.Type.USER_EXTERNAL_LOGIN_CONFIRM_EMAIL_CREATE
elif user.external_identity[external_id_provider][external_id] == 'LINK':
notification_type = NotificationType.Type.USER_EXTERNAL_LOGIN_CONFIRM_EMAIL_LINK
+ else:
+ raise HTTPError(http_status.HTTP_400_BAD_REQUEST, data={})
elif merge_target:
# Merge account confirmation
notification_type = NotificationType.Type.USER_CONFIRM_MERGE
@@ -857,7 +859,6 @@ def send_confirm_email(user, email, renew=False, external_id_provider=None, exte
# Account creation confirmation: from OSF
notification_type = NotificationType.Type.USER_INITIAL_CONFIRM_EMAIL
- print(notification_type)
NotificationType.objects.get(name=notification_type).emit(
user=user,
event_context={
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 6d9fe407d93..134809c63b4 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -67,6 +67,13 @@ class Type(str, Enum):
USER_CONTRIBUTOR_ADDED_ACCESS_REQUEST = 'user_contributor_added_access_request'
USER_ARCHIVE_JOB_UNCAUGHT_ERROR = 'user_archive_job_uncaught_error'
+ USER_CAMPAIGN_CONFIRM_PREPRINTS_BRANDED = 'user_campaign_confirm_preprint_branded'
+ USER_CAMPAIGN_CONFIRM_PREPRINTS_OSF = 'user_campaign_confirm_preprint_osf'
+ USER_CAMPAIGN_CONFIRM_EMAIL_AGU_CONFERENCE = 'user_campaign_confirm_email_agu_conference'
+ USER_CAMPAIGN_CONFIRM_EMAIL_AGU_CONFERENCE_2023 = 'user_campaign_confirm_email_agu_conference_2023'
+ USER_CAMPAIGN_CONFIRM_EMAIL_REGISTRIES_OSF = 'user_campaign_confirm_email_registries_osf'
+ USER_CAMPAIGN_CONFIRM_EMAIL_ERPC = 'user_campaign_confirm_email_erpc'
+
# Node notifications
NODE_COMMENT = 'node_comments'
NODE_FILES_UPDATED = 'node_files_updated'
diff --git a/tests/test_addons.py b/tests/test_addons.py
index aaf4de9cc6c..5ba35b6c760 100644
--- a/tests/test_addons.py
+++ b/tests/test_addons.py
@@ -1558,6 +1558,7 @@ def test_delete_action_for_folder_deletes_subfolders_and_creates_trashed_file_no
)
subfolder.save()
payload = {
+ 'action': 'file_removed',
'provider': file_node.provider,
'metadata': {
'path': '/test/',
From 24c503b607cafac160999d43804ba8c85f6e9d29 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 18:02:20 -0400
Subject: [PATCH 147/336] fix schema response tests
---
osf_tests/test_schema_responses.py | 32 ++++++++++++++++++------------
osf_tests/utils.py | 18 -----------------
tests/test_spam_mixin.py | 4 ++--
3 files changed, 21 insertions(+), 33 deletions(-)
diff --git a/osf_tests/test_schema_responses.py b/osf_tests/test_schema_responses.py
index f3f831224c6..c924aebcd17 100644
--- a/osf_tests/test_schema_responses.py
+++ b/osf_tests/test_schema_responses.py
@@ -254,10 +254,13 @@ def test_create_from_previous_response_notification(
with capture_notifications() as notifications:
schema_response.SchemaResponse.create_from_previous_response(
- previous_response=initial_response, initiator=admin_user
+ previous_response=initial_response,
+ initiator=admin_user
)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert len(notifications) == len(notification_recipients)
+ assert all(notification['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_INITIATED
+ for notification in notifications)
+ assert all(notification['kwargs']['user'].username in notification_recipients for notification in notifications)
@pytest.mark.parametrize(
'invalid_response_state',
@@ -580,8 +583,8 @@ def test_submit_response_notification(
with capture_notifications() as notifications:
revised_response.submit(user=admin_user, required_approvers=[admin_user])
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert len(notifications) == 3
+ assert any(notification['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_SUBMITTED for notification in notifications)
def test_no_submit_notification_on_initial_response(self, initial_response, admin_user):
initial_response.approvals_state_machine.set_state(ApprovalStates.IN_PROGRESS)
@@ -681,8 +684,8 @@ def test_approve_response_notification(
assert not notifications # Should only send email on final approval
with capture_notifications() as notifications:
revised_response.approve(user=alternate_user)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert len(notifications) == 3
+ assert all(notification['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED for notification in notifications)
def test_no_approve_notification_on_initial_response(self, initial_response, admin_user):
initial_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
@@ -749,8 +752,9 @@ def test_reject_response_notification(
with capture_notifications() as notifications:
revised_response.reject(user=admin_user)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert len(notifications) == 3
+ assert all(notification['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_REJECTED
+ for notification in notifications)
def test_no_reject_notification_on_initial_response(self, initial_response, admin_user):
initial_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
@@ -909,8 +913,9 @@ def test_moderator_accept_notification(
with capture_notifications() as notifications:
revised_response.accept(user=moderator)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert len(notifications) == 3
+ assert all(notification['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_INITIATED
+ for notification in notifications)
def test_no_moderator_accept_notification_on_initial_response(
self, initial_response, moderator):
@@ -949,8 +954,9 @@ def test_moderator_reject_notification(
with capture_notifications() as notifications:
revised_response.reject(user=moderator)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert len(notifications) == 3
+ assert all(notification['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_REJECTED
+ for notification in notifications)
def test_no_moderator_reject_notification_on_initial_response(
self, initial_response, moderator):
diff --git a/osf_tests/utils.py b/osf_tests/utils.py
index 884c4249de9..adb00482168 100644
--- a/osf_tests/utils.py
+++ b/osf_tests/utils.py
@@ -218,21 +218,3 @@ def get_default_test_schema():
create_schema_blocks_for_atomic_schema(test_schema)
return test_schema
-
-def assert_notification_correctness(send_mail_mock, expected_template, expected_recipients):
- '''Confirms that a mocked send_mail function contains the appropriate calls.'''
- assert send_mail_mock.call_count == len(expected_recipients)
-
- recipients = set()
- templates = set()
- for _, call_kwargs in send_mail_mock.call_args_list:
- recipients.add(call_kwargs['to_addr'])
- templates.add(call_kwargs['mail'])
-
- assert recipients == expected_recipients
-
- try:
- assert templates == {expected_template}
- except AssertionError: # the non-static subject attributes mean we need a different comparison
- assert {template.tpl_prefix for template in list(templates)} == {expected_template.tpl_prefix}
- assert {template._subject for template in list(templates)} == {expected_template._subject}
diff --git a/tests/test_spam_mixin.py b/tests/test_spam_mixin.py
index af509272425..59b04ec1fa9 100644
--- a/tests/test_spam_mixin.py
+++ b/tests/test_spam_mixin.py
@@ -26,8 +26,8 @@ def test_throttled_autoban():
proj.flag_spam()
proj.save()
projects.append(proj)
- assert len(notifications) == 7
- assert notifications[0]['type'] == NotificationType.Type.USER_CONFIRM_EMAIL
+ assert len(notifications) == 1
+ assert notifications[0]['type'] == NotificationType.Type.USER_SPAM_BANNED
user.reload()
assert user.is_disabled
for project in projects:
From 94734fdf02848b4749e21c784dc77a4e785e475e Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 18:50:38 -0400
Subject: [PATCH 148/336] clean-up sanction code
---
notifications.yaml | 13 +++++++
osf/models/notification_type.py | 1 +
osf/models/sanctions.py | 57 +++++++++---------------------
website/notifications/listeners.py | 22 ++++++++++++
4 files changed, 52 insertions(+), 41 deletions(-)
diff --git a/notifications.yaml b/notifications.yaml
index 8537e269fa5..d9a8d173e88 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -206,6 +206,11 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/archive_uncaught_error_user.html.mako'
+ - name: user_new_public_project
+ subject: 'Problem Registering'
+ __docs__: ...
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/archive_uncaught_error_user.html.mako'
#### PROVIDER
- name: provider_new_pending_submissions
@@ -271,6 +276,14 @@ notification_types:
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/pending_embargo_non_admin.html.mako'
+ - name: node_pending_embargo_termination_admin
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/pending_embargo_termination_admin.html.mako'
+ - name: node_pending_embargo_termination_non_admin
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/pending_embargo_termination_non_admin.html.mako'
- name: node_affiliation_changed
__docs__: ...
object_content_type_model_name: abstractnode
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 134809c63b4..08435b7441e 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -66,6 +66,7 @@ class Type(str, Enum):
USER_CONTRIBUTOR_ADDED_PREPRINT_NODE_FROM_OSF = 'user_contributor_added_preprint_node_from_osf'
USER_CONTRIBUTOR_ADDED_ACCESS_REQUEST = 'user_contributor_added_access_request'
USER_ARCHIVE_JOB_UNCAUGHT_ERROR = 'user_archive_job_uncaught_error'
+ USER_NEW_PUBLIC_PROJECT = 'user_new_public_project'
USER_CAMPAIGN_CONFIRM_PREPRINTS_BRANDED = 'user_campaign_confirm_preprint_branded'
USER_CAMPAIGN_CONFIRM_PREPRINTS_OSF = 'user_campaign_confirm_preprint_osf'
diff --git a/osf/models/sanctions.py b/osf/models/sanctions.py
index a5b19f3a917..c3e76a5dddf 100644
--- a/osf/models/sanctions.py
+++ b/osf/models/sanctions.py
@@ -345,8 +345,6 @@ class Meta:
class EmailApprovableSanction(TokenApprovableSanction):
- AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = None
- NON_AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = None
VIEW_URL_TEMPLATE = ''
APPROVE_URL_TEMPLATE = ''
@@ -375,12 +373,6 @@ def _format_or_empty(template, context):
return template.format(**context)
return ''
- def _get_authoriser_notification_type(self):
- return None
-
- def _get_non_authoriser_notification_type(self):
- return None
-
def _view_url(self, user_id, node):
return self._format_or_empty(self.VIEW_URL_TEMPLATE,
self._view_url_context(user_id, node))
@@ -414,22 +406,20 @@ def _email_template_context(self, user, node, is_authorizer=False):
return {}
def _notify_authorizer(self, authorizer, node):
- if notification_type := self._get_authoriser_notification_type():
- notification_type.emit(
+ return NotificationType.objects.get(name=self.AUTHORIZER_NOTIFY_EMAIL_TYPE).emit(
+ user=authorizer,
+ event_context=self._email_template_context(
authorizer,
- event_context=self._email_template_context(
- authorizer,
- node,
- is_authorizer=True
- )
+ node,
+ is_authorizer=True
)
+ )
def _notify_non_authorizer(self, user, node):
- if notification_type := self._get_authoriser_notification_type():
- notification_type.emit(
- user,
- event_context=self._email_template_context(user, node)
- )
+ return NotificationType.objects.get(name=self.NON_AUTHORIZER_NOTIFY_EMAIL_TYPE).emit(
+ user=user,
+ event_context=self._email_template_context(user, node)
+ )
def ask(self, group):
"""
@@ -478,8 +468,8 @@ class Embargo(SanctionCallbackMixin, EmailApprovableSanction):
DISPLAY_NAME = 'Embargo'
SHORT_NAME = 'embargo'
- AUTHORIZER_NOTIFY_EMAIL_TYPE = 'node_embargo_admin'
- NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = 'node_embargo_non_admin'
+ AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_EMBARGO_ADMIN
+ NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_EMBARGO_NON_ADMIN
VIEW_URL_TEMPLATE = VIEW_PROJECT_URL_TEMPLATE
APPROVE_URL_TEMPLATE = osf_settings.DOMAIN + 'token_action/{node_id}/?token={token}'
@@ -513,12 +503,6 @@ def embargo_end_date(self):
def pending_registration(self):
return not self.for_existing_registration and self.is_pending_approval
- def _get_authoriser_notification_type(self):
- return NotificationType.objects.get(name=self.AUTHORIZER_NOTIFY_EMAIL_TYPE)
-
- def _get_non_authoriser_notification_type(self):
- return NotificationType.objects.get(name=self.NON_AUTHORIZER_NOTIFY_EMAIL_TYPE)
-
def _get_registration(self):
return self.registrations.first()
@@ -785,11 +769,8 @@ class RegistrationApproval(SanctionCallbackMixin, EmailApprovableSanction):
DISPLAY_NAME = 'Approval'
SHORT_NAME = 'registration_approval'
- AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = NotificationType.Type.NODE_PENDING_REGISTRATION_ADMIN
- NON_AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = NotificationType.Type.NODE_PENDING_REGISTRATION_NON_ADMIN
-
- AUTHORIZER_NOTIFY_EMAIL_TYPE = 'node_pending_registration_admin'
- NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = 'node_pending_registration_non_admin'
+ AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_REGISTRATION_ADMIN
+ NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_REGISTRATION_NON_ADMIN
VIEW_URL_TEMPLATE = VIEW_PROJECT_URL_TEMPLATE
APPROVE_URL_TEMPLATE = osf_settings.DOMAIN + 'token_action/{node_id}/?token={token}'
@@ -809,12 +790,6 @@ def find_approval_backlog():
guid=models.F('_id')
).order_by('-initiation_date')
- def _get_authoriser_notification_type(self):
- return NotificationType.objects.get(name=self.AUTHORIZER_NOTIFY_EMAIL_TYPE)
-
- def _get_non_authoriser_notification_type(self):
- return NotificationType.objects.get(name=self.NON_AUTHORIZER_NOTIFY_EMAIL_TYPE)
-
def _get_registration(self):
return self.registrations.first()
@@ -961,8 +936,8 @@ class EmbargoTerminationApproval(EmailApprovableSanction):
DISPLAY_NAME = 'Embargo Termination Request'
SHORT_NAME = 'embargo_termination_approval'
- AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = NotificationType.Type.NODE_PENDING_EMBARGO_TERMINATION_ADMIN
- NON_AUTHORIZER_NOTIFY_EMAIL_TEMPLATE = NotificationType.Type.NODE_PENDING_EMBARGO_TERMINATION_NON_ADMIN
+ AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_EMBARGO_TERMINATION_ADMIN
+ NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_EMBARGO_TERMINATION_NON_ADMIN
VIEW_URL_TEMPLATE = VIEW_PROJECT_URL_TEMPLATE
APPROVE_URL_TEMPLATE = osf_settings.DOMAIN + 'token_action/{node_id}/?token={token}'
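The refactor above settles on one pattern: each EmailApprovableSanction subclass names its admin and non-admin notification types as class attributes, and the shared _notify_* helpers resolve them at send time. A standalone sketch of that lookup with stub classes (not the real Django models):

    class TypeRegistryStub:
        """Stand-in for NotificationType.objects.get(name=...)."""
        _types = {'node_pending_embargo_admin', 'node_pending_embargo_non_admin'}

        @classmethod
        def get(cls, name):
            if name not in cls._types:
                raise KeyError(name)
            return name

    class SanctionStub:
        AUTHORIZER_NOTIFY_EMAIL_TYPE = None
        NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = None

        def notify_authorizer_type(self):
            # Mirrors _notify_authorizer: the class attribute is resolved at call time.
            return TypeRegistryStub.get(self.AUTHORIZER_NOTIFY_EMAIL_TYPE)

    class EmbargoStub(SanctionStub):
        AUTHORIZER_NOTIFY_EMAIL_TYPE = 'node_pending_embargo_admin'
        NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = 'node_pending_embargo_non_admin'

    assert EmbargoStub().notify_authorizer_type() == 'node_pending_embargo_admin'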
diff --git a/website/notifications/listeners.py b/website/notifications/listeners.py
index 2ed837308bb..1395b606592 100644
--- a/website/notifications/listeners.py
+++ b/website/notifications/listeners.py
@@ -6,6 +6,8 @@
from osf.models import NotificationSubscription, NotificationType
from website.project.signals import contributor_added, project_created
from framework.auth.signals import user_confirmed
+from website.project.signals import privacy_set_public
+from website import settings
logger = logging.getLogger(__name__)
@@ -56,3 +58,23 @@ def subscribe_confirmed_user(user):
user=user,
notification_type=NotificationType.objects.get(name=NotificationType.Type.USER_REVIEWS)
)
+
+
+@privacy_set_public.connect
+def queue_first_public_project_email(user, node):
+ """Queue and email after user has made their first
+ non-OSF4M project public.
+ """
+ NotificationType.objects.get(
+ name=NotificationType.Type.USER_NEW_PUBLIC_PROJECT,
+ ).emit(
+ user=user,
+ event_context={
+ 'node': node,
+ 'user': user,
+ 'nid': node._id,
+ 'fullname': user.fullname,
+ 'project_title': node.title,
+ 'osf_support_email': settings.OSF_SUPPORT_EMAIL,
+ }
+ )
From 51df115072b15ea9e202619b215cb448cbd033d6 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 18:56:14 -0400
Subject: [PATCH 149/336] remove mails and copy over subject lines
---
notifications.yaml | 7 ++++++-
osf/models/node.py | 2 +-
osf/models/sanctions.py | 3 +++
tests/test_preprints.py | 2 +-
tests/test_resend_confirmation.py | 2 +-
website/mails/mails.py | 24 ------------------------
website/notifications/listeners.py | 4 +---
7 files changed, 13 insertions(+), 31 deletions(-)
diff --git a/notifications.yaml b/notifications.yaml
index d9a8d173e88..952dd62be67 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -34,6 +34,7 @@ notification_types:
object_content_type_model_name: osfuser
template: 'website/templates/emails/contributor_added_draft_registration.html.mako'
- name: user_contributor_added_osf_preprint
+ subject: 'You have been added as a contributor to an OSF preprint.'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/contributor_added_preprint_node_from_osf.html.mako'
@@ -67,6 +68,7 @@ notification_types:
object_content_type_model_name: osfuser
template: 'website/templates/emails/institution_deactivation.html.mako'
- name: user_invite_preprints_osf
+ subject: 'You have been added as a contributor to an OSF preprint.'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/invite_preprints_osf.html.mako'
@@ -74,11 +76,13 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/invite_preprints.html.mako'
- - name: invite_draft_registration
+ - name: user_invite_draft_registration
+ subject: 'You have a new registration draft'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/invite_draft_registration.html.mako'
- name: user_invite_default
+ subject: 'You have been added as a contributor to an OSF project.'
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/invite_default.html.mako'
@@ -257,6 +261,7 @@ notification_types:
object_content_type_model_name: abstractnode
template: 'website/templates/emails/node_request_institutional_access_request.html.mako'
- name: node_contributor_added_default
+ subject: 'You have been added as a contributor to an OSF project.'
__docs__: This email notifies the user that they have been added as a contributor to a node.
object_content_type_model_name: abstractnode
template: 'website/templates/emails/contributor_added_default.html.mako'
diff --git a/osf/models/node.py b/osf/models/node.py
index d6b35c81335..6ee777037da 100644
--- a/osf/models/node.py
+++ b/osf/models/node.py
@@ -1245,7 +1245,7 @@ def set_privacy(self, permissions, auth=None, log=True, save=True, meeting_creat
if save:
self.save()
if auth and permissions == 'public':
- project_signals.privacy_set_public.send(auth.user, node=self, meeting_creation=meeting_creation)
+ project_signals.privacy_set_public.send(auth.user, node=self)
return True
@property
diff --git a/osf/models/sanctions.py b/osf/models/sanctions.py
index c3e76a5dddf..9f072eaaeb4 100644
--- a/osf/models/sanctions.py
+++ b/osf/models/sanctions.py
@@ -650,6 +650,9 @@ class Retraction(EmailApprovableSanction):
DISPLAY_NAME = 'Retraction'
SHORT_NAME = 'retraction'
+ AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_REGISTRATION_ADMIN
+ NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_REGISTRATION_NON_ADMIN
+
VIEW_URL_TEMPLATE = VIEW_PROJECT_URL_TEMPLATE
APPROVE_URL_TEMPLATE = osf_settings.DOMAIN + 'token_action/{node_id}/?token={token}'
REJECT_URL_TEMPLATE = osf_settings.DOMAIN + 'token_action/{node_id}/?token={token}'
diff --git a/tests/test_preprints.py b/tests/test_preprints.py
index 6f1eda5876b..b3c97ece060 100644
--- a/tests/test_preprints.py
+++ b/tests/test_preprints.py
@@ -999,7 +999,7 @@ def test_check_spam_on_private_preprint_bans_new_spam_user(self, preprint, user)
@mock.patch('website.mailchimp_utils.unsubscribe_mailchimp')
@mock.patch.object(settings, 'SPAM_SERVICES_ENABLED', True)
@mock.patch.object(settings, 'SPAM_ACCOUNT_SUSPENSION_ENABLED', True)
- def test_check_spam_on_private_preprint_does_not_ban_existing_user(self, preprint, user):
+ def test_check_spam_on_private_preprint_does_not_ban_existing_user(self, mock_mailchimp, preprint, user):
preprint.is_public = False
preprint.save()
with mock.patch('osf.models.Preprint._get_spam_content', mock.Mock(return_value='some content!')):
diff --git a/tests/test_resend_confirmation.py b/tests/test_resend_confirmation.py
index 95609e5ad76..53fd2ba25f2 100644
--- a/tests/test_resend_confirmation.py
+++ b/tests/test_resend_confirmation.py
@@ -63,7 +63,7 @@ def test_cannot_receive_resend_confirmation_email_2(self):
with capture_notifications() as notifications:
res = form.submit(self.app)
# check email, request and response
- assert notifications
+ assert not notifications
assert res.status_code == 200
assert res.request.path == self.post_url
assert_in_html('If there is an OSF account', res.text)
diff --git a/website/mails/mails.py b/website/mails/mails.py
index db684f7e84f..4d47b52b5cb 100644
--- a/website/mails/mails.py
+++ b/website/mails/mails.py
@@ -100,38 +100,14 @@ def get_english_article(word):
# Contributor added confirmation emails
-INVITE_DEFAULT = Mail(
- 'invite_default',
- subject='You have been added as a contributor to an OSF project.'
-)
-INVITE_OSF_PREPRINT = Mail(
- 'invite_preprints_osf',
- subject='You have been added as a contributor to an OSF preprint.'
-)
INVITE_PREPRINT = lambda provider: Mail(
'invite_preprints',
subject=f'You have been added as a contributor to {get_english_article(provider.name)} {provider.name} {provider.preprint_word}.'
)
-INVITE_DRAFT_REGISTRATION = Mail(
- 'invite_draft_registration',
- subject='You have a new registration draft'
-)
-CONTRIBUTOR_ADDED_DEFAULT = Mail(
- 'contributor_added_default',
- subject='You have been added as a contributor to an OSF project.'
-)
-CONTRIBUTOR_ADDED_OSF_PREPRINT = Mail(
- 'contributor_added_preprints_osf',
- subject='You have been added as a contributor to an OSF preprint.'
-)
CONTRIBUTOR_ADDED_PREPRINT = lambda provider: Mail(
'contributor_added_preprints',
subject=f'You have been added as a contributor to {get_english_article(provider.name)} {provider.name} {provider.preprint_word}.'
)
-CONTRIBUTOR_ADDED_PREPRINT_NODE_FROM_OSF = Mail(
- 'contributor_added_preprint_node_from_osf',
- subject='You have been added as a contributor to an OSF project.'
-)
MODERATOR_ADDED = lambda provider: Mail(
'moderator_added',
subject=f'You have been added as a moderator for {provider.name}'
diff --git a/website/notifications/listeners.py b/website/notifications/listeners.py
index 1395b606592..871d6d56792 100644
--- a/website/notifications/listeners.py
+++ b/website/notifications/listeners.py
@@ -70,11 +70,9 @@ def queue_first_public_project_email(user, node):
).emit(
user=user,
event_context={
- 'node': node,
- 'user': user,
'nid': node._id,
'fullname': user.fullname,
'project_title': node.title,
- 'osf_support_email': settings.OSF_SUPPORT_EMAIL,
+ 'osf_url': settings.DOMAIN,
}
)
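
The hunk above illustrates the pattern these patches converge on: instead of constructing a Mail object and calling send_mail, callers look up a NotificationType by name and call emit() with a flat, template-ready event_context. Below is a minimal sketch of that pattern for the same listener; the context keys come from the hunk above, and the USER_NEW_PUBLIC_PROJECT type name is taken from the osf_tests/test_node.py assertions later in this series, so treat the exact name as an assumption grounded in those tests rather than in this hunk.

from osf.models import NotificationType
from website import settings

def queue_first_public_project_email(user, node):
    # Sketch of the post-refactor emit pattern: the event_context carries only
    # primitive, serializable values (ids, strings), not model instances.
    NotificationType.objects.get(
        name=NotificationType.Type.USER_NEW_PUBLIC_PROJECT,  # name assumed from the test assertions below
    ).emit(
        user=user,
        event_context={
            'nid': node._id,
            'fullname': user.fullname,
            'project_title': node.title,
            'osf_url': settings.DOMAIN,
        },
    )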
From 5ec04a074cba829bd9d7a2bb6360ed0ca9c7cc3b Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 19:56:02 -0400
Subject: [PATCH 150/336] remove old comment based notification tests
---
notifications.yaml | 82 ++-
osf/models/notification_type.py | 1 +
osf/models/sanctions.py | 34 +-
osf_tests/test_comment.py | 685 +------------------
osf_tests/test_node.py | 4 +-
tests/test_preprints.py | 2 +-
tests/test_registrations/test_retractions.py | 4 +-
website/mails/mails.py | 4 -
8 files changed, 97 insertions(+), 719 deletions(-)
diff --git a/notifications.yaml b/notifications.yaml
index 952dd62be67..0a8b559f10f 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -168,10 +168,6 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/external_confirm_create.html.mako'
- - name: user_primary_email_changed
- __docs__: ...
- object_content_type_model_name: osfuser
- template: 'website/templates/emails/primary_email_changed.html.mako'
- name: user_spam_banned
__docs__: ...
object_content_type_model_name: osfuser
@@ -215,6 +211,42 @@ notification_types:
__docs__: ...
object_content_type_model_name: osfuser
template: 'website/templates/emails/archive_uncaught_error_user.html.mako'
+ - name: user_confirm_email_erpc
+ subject: 'OSF Account Verification, Election Research Preacceptance Competition'
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/confirm_erpc.html.mako'
+ - name: user_confirm_email_agu_conference
+ subject: 'OSF Account Verification, from the American Geophysical Union Conference'
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/confirm_agu_conference.html.mako'
+ - name: user_confirm_email_registries_osf
+ subject: 'OSF Account Verification, OSF Registries'
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/confirm_registries_osf.html.mako'
+ - name: user_confirm_merge
+ subject: 'Confirm account merge'
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/confirm_merge.html.mako'
+ - name: user_primary_email_changed
+ subject: 'Primary email changed'
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/primary_email_changed.html.mako'
+ - name: user_spam_files_detected
+ subject: '[auto] Spam files audit'
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/spam_files_detected.html.mako'
+ - name: user_crossref_doi_pending
+ subject: 'There are ${pending_doi_count} preprints with crossref DOI pending.'
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/crossref_doi_pending.html.mako'
+ - name: user_terms_of_use_updated
+ subject: 'Updated Terms of Use for COS Websites and Services'
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/tou_notif.html.mako'
+ - name: user_registration_bulk_upload_product_owner
+ subject: 'Registry Could Not Bulk Upload Registrations'
+ object_content_type_model_name: osfuser
+ template: 'website/templates/emails/registration_bulk_upload_product_owner.html.mako'
#### PROVIDER
- name: provider_new_pending_submissions
@@ -273,11 +305,19 @@ notification_types:
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/pending_registration_admin.html.mako'
- - name: node_embargo_admin
+ - name: node_pending_retraction_admin
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/pending_retraction_admin.html.mako'
+ - name: node_pending_retraction_non_admin
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/pending_retraction_non_admin.html.mako'
+ - name: node_pending_embargo_admin
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/pending_embargo_admin.html.mako'
- - name: node_embargo_nonadmin
+ - name: node_pending_embargo_non_admin
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/pending_embargo_non_admin.html.mako'
@@ -330,6 +370,26 @@ notification_types:
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/updates_rejected.html.mako'
+ - name: node_archive_file_not_found_desk
+ subject: 'Problem registering ${unescape_entities(src.title)}'
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/archive_file_not_found_desk.html.mako'
+ - name: node_archive_file_not_found_user
+ subject: 'Registration failed because of altered files'
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/archive_file_not_found_user.html.mako'
+ - name: node_archive_uncaught_error_desk
+ subject: 'Problem registering ${unescape_entities(src.title)}'
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/archive_uncaught_error_desk.html.mako'
+ - name: node_archive_registration_stuck_desk
+ subject: '[auto] Stuck registrations audit'
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/archive_registration_stuck_desk.html.mako'
+ - name: node_archive_success
+ subject: 'Registration of ${unescape_entities(src.title)} complete'
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/archive_success.html.mako'
#### PREPRINT
- name: preprint_contributor_added_preprint_node_from_osf
@@ -349,6 +409,16 @@ notification_types:
__docs__: ...
object_content_type_model_name: preprint
template: 'website/templates/emails/contributor_added_preprints.html.mako'
+ - name: preprint_withdrawal_request_granted
+ subject: 'Your ${document_type} has been withdrawn'
+ object_content_type_model_name: preprint
+ template: 'website/templates/emails/withdrawal_request_granted.html.mako'
+ - name: preprint_withdrawal_request_declined
+ subject: 'Your withdrawal request has been declined'
+ object_content_type_model_name: preprint
+ template: 'website/templates/emails/withdrawal_request_declined.html.mako'
+
+
#### SUPPORT
- name: crossref_error
__docs__: ...
diff --git a/osf/models/notification_type.py b/osf/models/notification_type.py
index 08435b7441e..48f82cbb685 100644
--- a/osf/models/notification_type.py
+++ b/osf/models/notification_type.py
@@ -89,6 +89,7 @@ class Type(str, Enum):
NODE_PENDING_EMBARGO_ADMIN = 'node_pending_embargo_admin'
NODE_PENDING_EMBARGO_NON_ADMIN = 'node_pending_embargo_non_admin'
NODE_PENDING_RETRACTION_NON_ADMIN = 'node_pending_retraction_non_admin'
+ NODE_PENDING_RETRACTION_ADMIN = 'node_pending_retraction_admin'
NODE_PENDING_REGISTRATION_NON_ADMIN = 'node_pending_registration_non_admin'
NODE_PENDING_REGISTRATION_ADMIN = 'node_pending_registration_admin'
NODE_PENDING_EMBARGO_TERMINATION_NON_ADMIN = 'node_pending_embargo_termination_non_admin'
diff --git a/osf/models/sanctions.py b/osf/models/sanctions.py
index 9f072eaaeb4..a0f12bcc6d9 100644
--- a/osf/models/sanctions.py
+++ b/osf/models/sanctions.py
@@ -405,22 +405,6 @@ def _send_approval_request_email(self, user, template, context):
def _email_template_context(self, user, node, is_authorizer=False):
return {}
- def _notify_authorizer(self, authorizer, node):
- return NotificationType.objects.get(name=self.AUTHORIZER_NOTIFY_EMAIL_TYPE).emit(
- user=authorizer,
- event_context=self._email_template_context(
- authorizer,
- node,
- is_authorizer=True
- )
- )
-
- def _notify_non_authorizer(self, user, node):
- return NotificationType.objects.get(name=self.NON_AUTHORIZER_NOTIFY_EMAIL_TYPE).emit(
- user=user,
- event_context=self._email_template_context(user, node)
- )
-
def ask(self, group):
"""
:param list group: List of (user, node) tuples containing contributors to notify about the
@@ -430,9 +414,19 @@ def ask(self, group):
return
for contrib, node in group:
if contrib._id in self.approval_state:
- self._notify_authorizer(contrib, node)
+ return NotificationType.objects.get(name=self.AUTHORIZER_NOTIFY_EMAIL_TYPE).emit(
+ user=contrib,
+ event_context=self._email_template_context(
+ contrib,
+ node,
+ is_authorizer=True
+ )
+ )
else:
- self._notify_non_authorizer(contrib, node)
+ return NotificationType.objects.get(name=self.NON_AUTHORIZER_NOTIFY_EMAIL_TYPE).emit(
+ user=contrib,
+ event_context=self._email_template_context(contrib, node)
+ )
def add_authorizer(self, user, node, **kwargs):
super().add_authorizer(user, node, **kwargs)
@@ -650,8 +644,8 @@ class Retraction(EmailApprovableSanction):
DISPLAY_NAME = 'Retraction'
SHORT_NAME = 'retraction'
- AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_REGISTRATION_ADMIN
- NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_REGISTRATION_NON_ADMIN
+ AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_RETRACTION_ADMIN
+ NON_AUTHORIZER_NOTIFY_EMAIL_TYPE = NotificationType.Type.NODE_PENDING_RETRACTION_NON_ADMIN
VIEW_URL_TEMPLATE = VIEW_PROJECT_URL_TEMPLATE
APPROVE_URL_TEMPLATE = osf_settings.DOMAIN + 'token_action/{node_id}/?token={token}'
diff --git a/osf_tests/test_comment.py b/osf_tests/test_comment.py
index 62a295367fd..27cd5bced2a 100644
--- a/osf_tests/test_comment.py
+++ b/osf_tests/test_comment.py
@@ -3,20 +3,10 @@
import pytest
import datetime
from django.utils import timezone
-from collections import OrderedDict
-
-from addons.box.models import BoxFile
-from addons.dropbox.models import DropboxFile
-from addons.github.models import GithubFile
-from addons.googledrive.models import GoogleDriveFile
-from addons.osfstorage.models import OsfStorageFile
-from addons.s3.models import S3File
from website import settings
-from addons.osfstorage import settings as osfstorage_settings
-from website.project.views.comment import update_file_guid_referent
from framework.exceptions import PermissionsError
from tests.base import capture_signals
-from osf.models import Comment, NodeLog, Guid, BaseFileNode
+from osf.models import Comment, NodeLog, Guid
from osf.utils import permissions
from framework.auth.core import Auth
from .factories import (
@@ -395,676 +385,3 @@ def test_find_unread_does_not_include_deleted_comments(self):
CommentFactory(node=project, user=project.creator, is_deleted=True)
n_unread = Comment.find_n_unread(user=user, node=project, page='node')
assert n_unread == 0
-
-
-# copied from tests/test_comments.py
-class FileCommentMoveRenameTestMixin:
- id_based_providers = ['osfstorage']
-
- @pytest.fixture()
- def project(self, user):
- p = ProjectFactory(creator=user)
- p_settings = p.get_or_add_addon(self.provider, Auth(user))
- p_settings.folder = '/Folder1'
- p_settings.save()
- p.save()
- return p
-
- @pytest.fixture()
- def component(self, user, project):
- c = NodeFactory(parent=project, creator=user)
- c_settings = c.get_or_add_addon(self.provider, Auth(user))
- c_settings.folder = '/Folder2'
- c_settings.save()
- c.save()
- return c
-
- @property
- def provider(self):
- raise NotImplementedError
-
- @property
- def ProviderFile(self):
- raise NotImplementedError
-
- @classmethod
- def _format_path(cls, path, file_id=None):
- return path
-
- def _create_source_payload(self, path, node, provider, file_id=None):
- return OrderedDict([('materialized', path),
- ('name', path.split('/')[-1]),
- ('nid', node._id),
- ('path', self._format_path(path, file_id)),
- ('provider', provider),
- ('url', '/project/{}/files/{}/{}/'.format(node._id, provider, path.strip('/'))),
- ('node', {'url': f'/{node._id}/', '_id': node._id, 'title': node.title}),
- ('addon', provider)])
-
- def _create_destination_payload(self, path, node, provider, file_id, children=None):
- destination_path = PROVIDER_CLASS.get(provider)._format_path(path=path, file_id=file_id)
- destination = OrderedDict([('contentType', ''),
- ('etag', 'abcdefghijklmnop'),
- ('extra', OrderedDict([('revisionId', '12345678910')])),
- ('kind', 'file'),
- ('materialized', path),
- ('modified', 'Tue, 02 Feb 2016 17:55:48 +0000'),
- ('name', path.split('/')[-1]),
- ('nid', node._id),
- ('path', destination_path),
- ('provider', provider),
- ('size', 1000),
- ('url', '/project/{}/files/{}/{}/'.format(node._id, provider, path.strip('/'))),
- ('node', {'url': f'/{node._id}/', '_id': node._id, 'title': node.title}),
- ('addon', provider)])
- if children:
- destination_children = [self._create_destination_payload(child['path'], child['node'], child['provider'], file_id) for child in children]
- destination.update({'children': destination_children})
- return destination
-
- def _create_payload(self, action, user, source, destination, file_id, destination_file_id=None):
- return OrderedDict([
- ('action', action),
- ('auth', OrderedDict([('email', user.username), ('id', user._id), ('name', user.fullname)])),
- ('destination', self._create_destination_payload(path=destination['path'],
- node=destination['node'],
- provider=destination['provider'],
- file_id=destination_file_id or file_id,
- children=destination.get('children', []))),
- ('source', self._create_source_payload(source['path'], source['node'], source['provider'], file_id=file_id)),
- ('time', 100000000),
- ('node', source['node']),
- ('project', None)
- ])
-
- def _create_file_with_comment(self, node, path, user):
- self.file = self.ProviderFile.create(
- target=node,
- path=path,
- name=path.strip('/'),
- materialized_path=path)
- self.file.save()
- self.guid = self.file.get_guid(create=True)
- self.comment = CommentFactory(user=user, node=node, target=self.guid)
-
- def test_comments_move_on_file_rename(self, project, user):
- source = {
- 'path': '/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/file_renamed.txt',
- 'node': project,
- 'provider': self.provider
- }
- self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_on_folder_rename(self, project, user):
- source = {
- 'path': '/subfolder1/',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder2/',
- 'node': project,
- 'provider': self.provider
- }
- file_name = 'file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_on_subfolder_file_when_parent_folder_is_renamed(self, project, user):
- source = {
- 'path': '/subfolder1/',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder2/',
- 'node': project,
- 'provider': self.provider
- }
- file_path = 'sub-subfolder/file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_path), user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_path), file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_file_moved_to_subfolder(self, project, user):
- source = {
- 'path': '/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_file_moved_from_subfolder_to_root(self, project, user):
- source = {
- 'path': '/subfolder/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_file_moved_from_project_to_component(self, project, component, user):
- source = {
- 'path': '/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/file.txt',
- 'node': component,
- 'provider': self.provider
- }
- self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- assert self.guid.referent.target._id == destination['node']._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_file_moved_from_component_to_project(self, project, component, user):
- source = {
- 'path': '/file.txt',
- 'node': component,
- 'provider': self.provider
- }
- destination = {
- 'path': '/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- assert self.guid.referent.target._id == destination['node']._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_folder_moved_to_subfolder(self, user, project):
- source = {
- 'path': '/subfolder/',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder2/subfolder/',
- 'node': project,
- 'provider': self.provider
- }
- file_name = 'file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_folder_moved_from_subfolder_to_root(self, project, user):
- source = {
- 'path': '/subfolder2/subfolder/',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder/',
- 'node': project,
- 'provider': self.provider
- }
- file_name = 'file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_folder_moved_from_project_to_component(self, project, component, user):
- source = {
- 'path': '/subfolder/',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder/',
- 'node': component,
- 'provider': self.provider
- }
- file_name = 'file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_folder_moved_from_component_to_project(self, project, component, user):
- source = {
- 'path': '/subfolder/',
- 'node': component,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder/',
- 'node': project,
- 'provider': self.provider
- }
- file_name = 'file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_file_moved_to_osfstorage(self, project, user):
- osfstorage = project.get_addon('osfstorage')
- root_node = osfstorage.get_root()
- osf_file = root_node.append_file('file.txt')
- osf_file.create_version(user, {
- 'object': '06d80e',
- 'service': 'cloud',
- osfstorage_settings.WATERBUTLER_RESOURCE: 'osf',
- }, {
- 'size': 1337,
- 'contentType': 'img/png',
- 'etag': 'abcdefghijklmnop'
- }).save()
-
- source = {
- 'path': '/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': osf_file.path,
- 'node': project,
- 'provider': 'osfstorage'
- }
- self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id, destination_file_id=destination['path'].strip('/'))
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class('osfstorage', BaseFileNode.FILE).get_or_create(destination['node'], destination['path'])
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_folder_moved_to_osfstorage(self, project, user):
- osfstorage = project.get_addon('osfstorage')
- root_node = osfstorage.get_root()
- osf_folder = root_node.append_folder('subfolder')
- osf_file = osf_folder.append_file('file.txt')
- osf_file.create_version(user, {
- 'object': '06d80e',
- 'service': 'cloud',
- osfstorage_settings.WATERBUTLER_RESOURCE: 'osf',
- }, {
- 'size': 1337,
- 'contentType': 'img/png',
- 'etag': '1234567890abcde'
- }).save()
-
- source = {
- 'path': '/subfolder/',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder/',
- 'node': project,
- 'provider': 'osfstorage',
- 'children': [{
- 'path': '/subfolder/file.txt',
- 'node': project,
- 'provider': 'osfstorage'
- }]
- }
- file_name = 'file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id, destination_file_id=osf_file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class('osfstorage', BaseFileNode.FILE).get_or_create(destination['node'], osf_file._id)
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- @pytest.mark.parametrize(
- ['destination_provider', 'destination_path'],
- [('box', '/1234567890'), ('dropbox', '/file.txt'), ('github', '/file.txt'), ('googledrive', '/file.txt'), ('s3', '/file.txt')]
- )
- def test_comments_move_when_file_moved_to_different_provider(self, destination_provider, destination_path, project, user):
- if self.provider == destination_provider:
- assert True
- return
-
- project.add_addon(destination_provider, auth=Auth(user))
- project.save()
- self.addon_settings = project.get_addon(destination_provider)
- self.addon_settings.folder = '/AddonFolder'
- self.addon_settings.save()
-
- source = {
- 'path': '/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': destination_path,
- 'node': project,
- 'provider': destination_provider
- }
- self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(destination_provider, BaseFileNode.FILE).get_or_create(destination['node'], destination['path'])
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- @pytest.mark.parametrize(
- ['destination_provider', 'destination_path'],
- [('box', '/1234567890'), ('dropbox', '/subfolder/file.txt'), ('github', '/subfolder/file.txt'), ('googledrive', '/subfolder/file.txt'), ('s3', '/subfolder/file.txt'), ]
- )
- def test_comments_move_when_folder_moved_to_different_provider(self, destination_provider, destination_path, project, user):
- if self.provider == destination_provider:
- assert True
- return
-
- project.add_addon(destination_provider, auth=Auth(user))
- project.save()
- self.addon_settings = project.get_addon(destination_provider)
- self.addon_settings.folder = '/AddonFolder'
- self.addon_settings.save()
-
- source = {
- 'path': '/',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder/',
- 'node': project,
- 'provider': destination_provider,
- 'children': [{
- 'path': '/subfolder/file.txt',
- 'node': project,
- 'provider': destination_provider
- }]
- }
- file_name = 'file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(destination_provider, BaseFileNode.FILE).get_or_create(destination['node'], destination_path)
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
-
-# copied from tests/test_comments.py
-class TestOsfstorageFileCommentMoveRename(FileCommentMoveRenameTestMixin):
-
- provider = 'osfstorage'
- ProviderFile = OsfStorageFile
-
- @classmethod
- def _format_path(cls, path, file_id=None):
- super()._format_path(path)
- return '/{}{}'.format(file_id, ('/' if path.endswith('/') else ''))
-
- def _create_file_with_comment(self, node, path, user):
- osfstorage = node.get_addon(self.provider)
- root_node = osfstorage.get_root()
- self.file = root_node.append_file('file.txt')
- self.file.create_version(user, {
- 'object': '06d80e',
- 'service': 'cloud',
- osfstorage_settings.WATERBUTLER_RESOURCE: 'osf',
- }, {
- 'size': 1337,
- 'contentType': 'img/png',
- 'etag': 'abcdefghijklmnop'
- }).save()
- self.file.materialized_path = path
- self.guid = self.file.get_guid(create=True)
- self.comment = CommentFactory(user=user, node=node, target=self.guid)
-
- def test_comments_move_when_file_moved_from_project_to_component(self, project, component, user):
- source = {
- 'path': '/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/file.txt',
- 'node': component,
- 'provider': self.provider
- }
- self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
- self.file.move_under(destination['node'].get_addon(self.provider).get_root())
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- assert self.guid.referent.target._id == destination['node']._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_file_moved_from_component_to_project(self, project, component, user):
- source = {
- 'path': '/file.txt',
- 'node': component,
- 'provider': self.provider
- }
- destination = {
- 'path': '/file.txt',
- 'node': project,
- 'provider': self.provider
- }
- self._create_file_with_comment(node=source['node'], path=source['path'], user=user)
- self.file.move_under(destination['node'].get_addon(self.provider).get_root())
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path(destination['path'], file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- assert self.guid.referent.target._id == destination['node']._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_folder_moved_from_project_to_component(self, project, component, user):
- source = {
- 'path': '/subfolder/',
- 'node': project,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder/',
- 'node': component,
- 'provider': self.provider
- }
- file_name = 'file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
- self.file.move_under(destination['node'].get_addon(self.provider).get_root())
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_folder_moved_from_component_to_project(self, project, component, user):
- source = {
- 'path': '/subfolder/',
- 'node': component,
- 'provider': self.provider
- }
- destination = {
- 'path': '/subfolder/',
- 'node': project,
- 'provider': self.provider
- }
- file_name = 'file.txt'
- self._create_file_with_comment(node=source['node'], path='{}{}'.format(source['path'], file_name), user=user)
- self.file.move_under(destination['node'].get_addon(self.provider).get_root())
- payload = self._create_payload('move', user, source, destination, self.file._id)
- update_file_guid_referent(self=None, target=destination['node'], payload=payload)
- self.guid.reload()
-
- file_node = BaseFileNode.resolve_class(self.provider, BaseFileNode.FILE).get_or_create(destination['node'], self._format_path('{}{}'.format(destination['path'], file_name), file_id=self.file._id))
- assert self.guid._id == file_node.get_guid()._id
- file_comments = Comment.objects.filter(root_target=self.guid.pk)
- assert file_comments.count() == 1
-
- def test_comments_move_when_file_moved_to_osfstorage(self):
- # Already in OSFStorage
- pass
-
- def test_comments_move_when_folder_moved_to_osfstorage(self):
- # Already in OSFStorage
- pass
-
-# copied from tests/test_comments.py
-class TestBoxFileCommentMoveRename(FileCommentMoveRenameTestMixin):
-
- provider = 'box'
- ProviderFile = BoxFile
-
- def _create_file_with_comment(self, node, path, user):
- self.file = self.ProviderFile.create(
- target=node,
- path=self._format_path(path),
- name=path.strip('/'),
- materialized_path=path)
- self.file.save()
- self.guid = self.file.get_guid(create=True)
- self.comment = CommentFactory(user=user, node=node, target=self.guid)
-
- @classmethod
- def _format_path(cls, path, file_id=None):
- super()._format_path(path)
- return '/9876543210/' if path.endswith('/') else '/1234567890'
-
-
-class TestDropboxFileCommentMoveRename(FileCommentMoveRenameTestMixin):
-
- provider = 'dropbox'
- ProviderFile = DropboxFile
-
- def _create_file_with_comment(self, node, path, user):
- self.file = self.ProviderFile.create(
- target=node,
- path=f'{node.get_addon(self.provider).folder}{path}',
- name=path.strip('/'),
- materialized_path=path)
- self.file.save()
- self.guid = self.file.get_guid(create=True)
- self.comment = CommentFactory(user=user, node=node, target=self.guid)
-
-
-class TestGoogleDriveFileCommentMoveRename(FileCommentMoveRenameTestMixin):
-
- provider = 'googledrive'
- ProviderFile = GoogleDriveFile
-
-class TestGithubFileCommentMoveRename(FileCommentMoveRenameTestMixin):
-
- provider = 'github'
- ProviderFile = GithubFile
-
-class TestS3FileCommentMoveRename(FileCommentMoveRenameTestMixin):
-
- provider = 's3'
- ProviderFile = S3File
-
-
-PROVIDER_CLASS = {
- 'osfstorage': TestOsfstorageFileCommentMoveRename,
- 'box': TestBoxFileCommentMoveRename,
- 'dropbox': TestDropboxFileCommentMoveRename,
- 'github': TestGithubFileCommentMoveRename,
- 'googledrive': TestGoogleDriveFileCommentMoveRename,
- 's3': TestS3FileCommentMoveRename
-
-}
diff --git a/osf_tests/test_node.py b/osf_tests/test_node.py
index 3b04ceba292..e6a34c31050 100644
--- a/osf_tests/test_node.py
+++ b/osf_tests/test_node.py
@@ -2131,14 +2131,14 @@ def test_set_privacy_sends_mail_default(self, node, auth):
node.set_privacy('private', auth=auth)
node.set_privacy('public', auth=auth)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ assert notifications[0]['type'] == NotificationType.Type.USER_NEW_PUBLIC_PROJECT
def test_set_privacy_sends_mail(self, node, auth):
with capture_notifications() as notifications:
node.set_privacy('private', auth=auth)
node.set_privacy('public', auth=auth, meeting_creation=False)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT
+ assert notifications[0]['type'] == NotificationType.Type.USER_NEW_PUBLIC_PROJECT
def test_set_privacy_skips_mail_if_meeting(self, node, auth):
with capture_notifications() as notifications:
diff --git a/tests/test_preprints.py b/tests/test_preprints.py
index b3c97ece060..b1920669a1f 100644
--- a/tests/test_preprints.py
+++ b/tests/test_preprints.py
@@ -996,9 +996,9 @@ def test_check_spam_on_private_preprint_bans_new_spam_user(self, preprint, user)
preprint3.reload()
assert preprint3.is_public is True
- @mock.patch('website.mailchimp_utils.unsubscribe_mailchimp')
@mock.patch.object(settings, 'SPAM_SERVICES_ENABLED', True)
@mock.patch.object(settings, 'SPAM_ACCOUNT_SUSPENSION_ENABLED', True)
+ @mock.patch('website.mailchimp_utils.unsubscribe_mailchimp')
def test_check_spam_on_private_preprint_does_not_ban_existing_user(self, mock_mailchimp, preprint, user):
preprint.is_public = False
preprint.save()
diff --git a/tests/test_registrations/test_retractions.py b/tests/test_registrations/test_retractions.py
index 5874fad6fa6..280b0efd2a3 100644
--- a/tests/test_registrations/test_retractions.py
+++ b/tests/test_registrations/test_retractions.py
@@ -805,7 +805,7 @@ def test_POST_retraction_does_not_send_email_to_unregistered_admins(self):
auth=self.user.auth,
)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[0]['type'] == NotificationType.Type.NODE_PENDING_REGISTRATION_ADMIN
def test_POST_pending_embargo_returns_HTTPError_HTTPOK(self):
self.registration.embargo_registration(
@@ -897,7 +897,7 @@ def test_valid_POST_calls_send_mail_with_username(self):
auth=self.user.auth,
)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[0]['type'] == NotificationType.Type.NODE_PENDING_RETRACTION_ADMIN
def test_non_contributor_GET_approval_returns_HTTPError_FORBIDDEN(self):
non_contributor = AuthUserFactory()
diff --git a/website/mails/mails.py b/website/mails/mails.py
index 4d47b52b5cb..7ca0da552ad 100644
--- a/website/mails/mails.py
+++ b/website/mails/mails.py
@@ -133,10 +133,6 @@ def get_english_article(word):
'pending_retraction_non_admin',
subject='Withdrawal pending for one of your registrations.'
)
-PENDING_RETRACTION_NON_ADMIN = Mail(
- 'pending_retraction_non_admin',
- subject='Withdrawal pending for one of your projects.'
-)
# Embargo related Mail objects
PENDING_EMBARGO_ADMIN = Mail(
'pending_embargo_admin',
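
Taken together, the sanctions.py hunks in this patch point Retraction at the new node_pending_retraction_admin / node_pending_retraction_non_admin types and inline the removed _notify_authorizer / _notify_non_authorizer helpers into ask(). Note that, as rewritten, ask() returns from inside the for loop, so only the first (contrib, node) pair in the group triggers an emit. The sketch below is illustrative only: it shows the per-contributor routing the old helpers performed, keeps the loop intact, and omits the guard at the top of ask(); it is not the patched method itself.

from osf.models import NotificationType

def notify_group(sanction, group):
    # Sketch: route each (contributor, node) pair to the admin or non-admin
    # notification type, mirroring the removed _notify_* helpers. For
    # Retraction the class attributes resolve to
    # NODE_PENDING_RETRACTION_ADMIN / NODE_PENDING_RETRACTION_NON_ADMIN.
    for contrib, node in group:
        is_authorizer = contrib._id in sanction.approval_state
        type_name = (
            sanction.AUTHORIZER_NOTIFY_EMAIL_TYPE
            if is_authorizer
            else sanction.NON_AUTHORIZER_NOTIFY_EMAIL_TYPE
        )
        NotificationType.objects.get(name=type_name).emit(
            user=contrib,
            event_context=sanction._email_template_context(
                contrib, node, is_authorizer=is_authorizer,
            ),
        )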
From f281e994e3e6d27379b019c1c138dfc6b4c0e51b Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 22:57:59 -0400
Subject: [PATCH 151/336] remove mails.py
---
api_tests/mailhog/test_mailhog.py | 5 -
conftest.py | 2 -
notifications.yaml | 2 +-
.../commands/populate_notification_types.py | 2 +-
osf_tests/test_archiver.py | 3 -
osf_tests/test_merging_users.py | 1 -
scripts/create_fakes.py | 1 -
tests/base.py | 1 -
tests/test_auth.py | 1 -
tests/test_auth_views.py | 1 -
tests/test_misc_views.py | 1 -
tests/test_preprints.py | 1 -
tests/test_registrations/test_embargoes.py | 5 +-
tests/test_registrations/test_retractions.py | 3 +-
website/mails/__init__.py | 1 -
website/mails/mails.py | 317 ------------------
website/settings/local-dist.py | 3 -
17 files changed, 5 insertions(+), 345 deletions(-)
delete mode 100644 website/mails/__init__.py
delete mode 100644 website/mails/mails.py
diff --git a/api_tests/mailhog/test_mailhog.py b/api_tests/mailhog/test_mailhog.py
index fb9b8fba771..d21a0f37fa9 100644
--- a/api_tests/mailhog/test_mailhog.py
+++ b/api_tests/mailhog/test_mailhog.py
@@ -12,7 +12,6 @@
fake
)
from framework import auth
-from unittest import mock
from osf.models import OSFUser, NotificationType
from tests.base import (
OsfTestCase,
@@ -23,7 +22,6 @@
@pytest.mark.django_db
class TestMailHog:
- @mock.patch('website.mails.settings.ENABLE_TEST_EMAIL', True)
def test_mailhog_received_mail(self):
with override_switch(features.ENABLE_MAILHOG, active=True):
mailhog_v1 = f'{settings.MAILHOG_API_HOST}/api/v1/messages'
@@ -33,7 +31,6 @@ def test_mailhog_received_mail(self):
NotificationType.objects.get(
name=NotificationType.Type.USER_REGISTRATION_BULK_UPLOAD_FAILURE_ALL
).emit(
- user=None,
destination_address='to_addr@mail.com',
event_context={
'fullname': '',
@@ -53,8 +50,6 @@ def test_mailhog_received_mail(self):
@pytest.mark.django_db
-@mock.patch('website.mails.settings.ENABLE_TEST_EMAIL', True)
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestAuthMailhog(OsfTestCase):
def setUp(self):
diff --git a/conftest.py b/conftest.py
index b30cb6271a1..79c23380e63 100644
--- a/conftest.py
+++ b/conftest.py
@@ -35,7 +35,6 @@ def pytest_configure(config):
'framework.auth.core',
'website.app',
'website.archiver.tasks',
- 'website.mails',
'website.notifications.listeners',
'website.search.elastic_search',
'website.search_migration.migrate',
@@ -66,7 +65,6 @@ def override_settings():
website_settings.SHARE_ENABLED = False
# Set this here instead of in SILENT_LOGGERS, in case developers
# call setLevel in local.py
- logging.getLogger('website.mails.mails').setLevel(logging.CRITICAL)
@pytest.fixture()
diff --git a/notifications.yaml b/notifications.yaml
index 0a8b559f10f..1cc60553ac9 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -218,7 +218,7 @@ notification_types:
- name: user_confirm_email_agu_conference
subject: 'OSF Account Verification, from the American Geophysical Union Conference'
object_content_type_model_name: osfuser
- template: 'website/templates/emails/confirm_agu_conference.html.mako'
+ template: 'website/templates/emails/confirm_erpc.html.mako'
- name: user_confirm_email_registries_osf
subject: 'OSF Account Verification, OSF Registries'
object_content_type_model_name: osfuser
diff --git a/osf/management/commands/populate_notification_types.py b/osf/management/commands/populate_notification_types.py
index a65b3f081ff..4e5d8921e59 100644
--- a/osf/management/commands/populate_notification_types.py
+++ b/osf/management/commands/populate_notification_types.py
@@ -21,7 +21,7 @@ def populate_notification_types(*args, **kwargs):
with open(settings.NOTIFICATION_TYPES_YAML) as stream:
notification_types = yaml.safe_load(stream)
for notification_type in notification_types['notification_types']:
- notification_type.pop('__docs__')
+ notification_type.pop('__docs__', None)
object_content_type_model_name = notification_type.pop('object_content_type_model_name')
if object_content_type_model_name == 'desk':
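
The pop('__docs__', None) default above is what lets the loader accept the entries added in the previous patch (for example user_confirm_merge and node_archive_success), which carry no __docs__ key; the old pop('__docs__') would raise KeyError for them. A minimal, self-contained sketch of that loading loop follows, assuming the same top-level layout as notifications.yaml; the real management command additionally resolves object_content_type_model_name to a Django ContentType, which is omitted here.

import yaml

def load_notification_types(yaml_path):
    # Sketch of the population loop patched above. '__docs__' is optional, so
    # pop() needs a default; object_content_type_model_name is always present.
    with open(yaml_path) as stream:
        data = yaml.safe_load(stream)
    types = []
    for notification_type in data['notification_types']:
        notification_type.pop('__docs__', None)
        model_name = notification_type.pop('object_content_type_model_name')
        types.append((model_name, notification_type))
    return types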
diff --git a/osf_tests/test_archiver.py b/osf_tests/test_archiver.py
index 4d0491b21c9..bc5efc2c3f9 100644
--- a/osf_tests/test_archiver.py
+++ b/osf_tests/test_archiver.py
@@ -720,7 +720,6 @@ def test_archive_success_same_file_in_component(self):
assert child_reg._id in question['extra'][0]['viewUrl']
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestArchiverUtils(ArchiverTestCase):
def test_handle_archive_fail(self):
@@ -849,7 +848,6 @@ def test_get_file_map_memoization(self):
archiver_utils.get_file_map(node)
assert mock_get_file_tree.call_count == call_count
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestArchiverListeners(ArchiverTestCase):
@mock.patch('website.archiver.tasks.archive')
@@ -1082,7 +1080,6 @@ def test_find_failed_registrations(self):
assert pk not in failed
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestArchiverBehavior(OsfTestCase):
@mock.patch('osf.models.AbstractNode.update_search')
diff --git a/osf_tests/test_merging_users.py b/osf_tests/test_merging_users.py
index 9317260fb1b..ce2cd71cbdd 100644
--- a/osf_tests/test_merging_users.py
+++ b/osf_tests/test_merging_users.py
@@ -28,7 +28,6 @@
@pytest.mark.enable_implicit_clean
@pytest.mark.enable_bookmark_creation
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestUserMerging(OsfTestCase):
def setUp(self):
super().setUp()
diff --git a/scripts/create_fakes.py b/scripts/create_fakes.py
index 8b4db177de7..379331f24bc 100644
--- a/scripts/create_fakes.py
+++ b/scripts/create_fakes.py
@@ -256,7 +256,6 @@ def science_text(cls, max_nb_chars=200):
logger = logging.getLogger('create_fakes')
SILENT_LOGGERS = [
'factory',
- 'website.mails',
]
for logger_name in SILENT_LOGGERS:
logging.getLogger(logger_name).setLevel(logging.CRITICAL)
diff --git a/tests/base.py b/tests/base.py
index 1eacefc066d..35ae00bb445 100644
--- a/tests/base.py
+++ b/tests/base.py
@@ -53,7 +53,6 @@ def get_default_metaschema():
'framework.auth.core',
'website.app',
'website.archiver.tasks',
- 'website.mails',
'website.notifications.listeners',
'website.search.elastic_search',
'website.search_migration.migrate',
diff --git a/tests/test_auth.py b/tests/test_auth.py
index 25068c024c8..05f6d243e33 100644
--- a/tests/test_auth.py
+++ b/tests/test_auth.py
@@ -43,7 +43,6 @@
logger = logging.getLogger(__name__)
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestAuthUtils(OsfTestCase):
def setUp(self):
diff --git a/tests/test_auth_views.py b/tests/test_auth_views.py
index 7f2b4c4136a..ca4476d17d3 100644
--- a/tests/test_auth_views.py
+++ b/tests/test_auth_views.py
@@ -43,7 +43,6 @@
pytestmark = pytest.mark.django_db
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestAuthViews(OsfTestCase):
def setUp(self):
diff --git a/tests/test_misc_views.py b/tests/test_misc_views.py
index d9c735b97dd..78596d1eef2 100644
--- a/tests/test_misc_views.py
+++ b/tests/test_misc_views.py
@@ -361,7 +361,6 @@ def test_explore(self):
assert res.status_code == 200
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestExternalAuthViews(OsfTestCase):
def setUp(self):
diff --git a/tests/test_preprints.py b/tests/test_preprints.py
index b1920669a1f..724dda3b0ae 100644
--- a/tests/test_preprints.py
+++ b/tests/test_preprints.py
@@ -1983,7 +1983,6 @@ def test_update_or_enqueue_on_preprint_doi_created(self):
assert should_update_preprint_identifiers(self.private_preprint, {})
-@mock.patch('website.mails.settings.USE_CELERY', False)
class TestPreprintConfirmationEmails(OsfTestCase):
def setUp(self):
super().setUp()
diff --git a/tests/test_registrations/test_embargoes.py b/tests/test_registrations/test_embargoes.py
index 7b06887c86b..7e2a7b71971 100644
--- a/tests/test_registrations/test_embargoes.py
+++ b/tests/test_registrations/test_embargoes.py
@@ -1060,7 +1060,6 @@ def test_GET_from_authorized_user_with_registration_rej_token_deleted_node(self)
@pytest.mark.enable_bookmark_creation
-@mock.patch('website.mails.settings.USE_CELERY', False)
class RegistrationEmbargoViewsTestCase(OsfTestCase):
def setUp(self):
super().setUp()
@@ -1156,8 +1155,8 @@ def test_embargoed_registration_set_privacy_sends_mail(self):
for contributor in self.registration.contributors:
if Contributor.objects.get(user_id=contributor.id, node_id=self.registration.id).permission == permissions.ADMIN:
admin_contributors.append(contributor)
- for admin in admin_contributors:
- assert any([each['kwargs']['user'] == admin for each in notifications])
+
+ assert all([each['kwargs']['user'] in admin_contributors for each in notifications])
@mock.patch('osf.models.sanctions.EmailApprovableSanction.ask')
def test_make_child_embargoed_registration_public_asks_all_admins_in_tree(self, mock_ask):
diff --git a/tests/test_registrations/test_retractions.py b/tests/test_registrations/test_retractions.py
index 280b0efd2a3..a477baeaaed 100644
--- a/tests/test_registrations/test_retractions.py
+++ b/tests/test_registrations/test_retractions.py
@@ -752,7 +752,6 @@ def test_POST_retraction_to_subproject_component_returns_HTTPError_BAD_REQUEST(s
@pytest.mark.enable_bookmark_creation
@pytest.mark.usefixtures('mock_gravy_valet_get_verified_links')
-@mock.patch('website.mails.settings.USE_CELERY', False)
class RegistrationRetractionViewsTestCase(OsfTestCase):
def setUp(self):
super().setUp()
@@ -805,7 +804,7 @@ def test_POST_retraction_does_not_send_email_to_unregistered_admins(self):
auth=self.user.auth,
)
assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_PENDING_REGISTRATION_ADMIN
+ assert notifications[0]['type'] == NotificationType.Type.NODE_PENDING_RETRACTION_ADMIN
def test_POST_pending_embargo_returns_HTTPError_HTTPOK(self):
self.registration.embargo_registration(
diff --git a/website/mails/__init__.py b/website/mails/__init__.py
deleted file mode 100644
index 1ed0bb2c90a..00000000000
--- a/website/mails/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-from .mails import * # noqa
diff --git a/website/mails/mails.py b/website/mails/mails.py
deleted file mode 100644
index 7ca0da552ad..00000000000
--- a/website/mails/mails.py
+++ /dev/null
@@ -1,317 +0,0 @@
-"""OSF mailing utilities.
-
-Email templates go in website/templates/emails
-Templates must end in ``.txt.mako`` for plaintext emails or``.html.mako`` for html emails.
-
-You can then create a `Mail` object given the basename of the template and
-the email subject. ::
-
- CONFIRM_EMAIL = Mail(tpl_prefix='confirm', subject="Confirm your email address")
-
-You can then use ``send_mail`` to send the email.
-
-Usage: ::
-
- from website import mails
- ...
- mails.send_mail('foo@bar.com', mails.CONFIRM_EMAIL, user=user)
-
-"""
-import os
-import logging
-
-from mako.lookup import TemplateLookup, Template
-
-from website import settings
-
-logger = logging.getLogger(__name__)
-
-EMAIL_TEMPLATES_DIR = os.path.join(settings.TEMPLATES_PATH, 'emails')
-
-_tpl_lookup = TemplateLookup(
- directories=[EMAIL_TEMPLATES_DIR],
-)
-
-HTML_EXT = '.html.mako'
-
-class Mail:
- """An email object.
-
- :param str tpl_prefix: The template name prefix.
- :param str subject: The subject of the email.
- :param iterable categories: Categories to add to the email using SendGrid's
- SMTPAPI. Used for email analytics.
- See https://sendgrid.com/docs/User_Guide/Statistics/categories.html
- :param: bool engagement: Whether this is an engagement email that can be disabled with
- the disable_engagement_emails waffle flag
- """
-
- def __init__(self, tpl_prefix, subject, categories=None, engagement=False):
- self.tpl_prefix = tpl_prefix
- self._subject = subject
- self.categories = categories
- self.engagement = engagement
-
- def html(self, **context):
- """Render the HTML email message."""
- tpl_name = self.tpl_prefix + HTML_EXT
- return render_message(tpl_name, **context)
-
- def subject(self, **context):
- return Template(self._subject).render(**context)
-
-
-def render_message(tpl_name, **context):
- """Render an email message."""
- tpl = _tpl_lookup.get_template(tpl_name)
- return tpl.render(**context)
-
-def get_english_article(word):
- """
- Decide whether to use 'a' or 'an' for a given English word.
-
- :param word: the word immediately after the article
- :return: 'a' or 'an'
- """
- return 'a' + ('n' if word[0].lower() in 'aeiou' else '')
-
-
-# Predefined Emails
-CONFIRM_EMAIL_ERPC = Mail(
- 'confirm_erpc',
- subject='OSF Account Verification, Election Research Preacceptance Competition'
-)
-CONFIRM_EMAIL_AGU_CONFERENCE = Mail(
- 'confirm_agu_conference',
- subject='OSF Account Verification, from the American Geophysical Union Conference'
-)
-CONFIRM_EMAIL_PREPRINTS = lambda name, provider: Mail(
- f'confirm_preprints_{name}',
- subject=f'OSF Account Verification, {provider}'
-)
-CONFIRM_EMAIL_REGISTRIES_OSF = Mail(
- 'confirm_registries_osf',
- subject='OSF Account Verification, OSF Registries'
-)
-
-# Merge account, add or remove email confirmation emails.
-CONFIRM_MERGE = Mail('confirm_merge', subject='Confirm account merge')
-PRIMARY_EMAIL_CHANGED = Mail('primary_email_changed', subject='Primary email changed')
-
-
-# Contributor added confirmation emails
-INVITE_PREPRINT = lambda provider: Mail(
- 'invite_preprints',
- subject=f'You have been added as a contributor to {get_english_article(provider.name)} {provider.name} {provider.preprint_word}.'
-)
-CONTRIBUTOR_ADDED_PREPRINT = lambda provider: Mail(
- 'contributor_added_preprints',
- subject=f'You have been added as a contributor to {get_english_article(provider.name)} {provider.name} {provider.preprint_word}.'
-)
-MODERATOR_ADDED = lambda provider: Mail(
- 'moderator_added',
- subject=f'You have been added as a moderator for {provider.name}'
-)
-CONTRIBUTOR_ADDED_ACCESS_REQUEST = Mail(
- 'contributor_added_access_request',
- subject='Your access request to an OSF project has been approved'
-)
-REQUEST_EXPORT = Mail('support_request', subject='[via OSF] Export Request')
-
-SPAM_USER_BANNED = Mail('spam_user_banned', subject='[OSF] Account flagged as spam')
-SPAM_FILES_DETECTED = Mail(
- 'spam_files_detected',
- subject='[auto] Spam files audit'
-)
-
-# Retraction related Mail objects
-PENDING_RETRACTION_ADMIN = Mail(
- 'pending_retraction_admin',
- subject='Withdrawal pending for one of your registrations.'
-)
-PENDING_RETRACTION_NON_ADMIN = Mail(
- 'pending_retraction_non_admin',
- subject='Withdrawal pending for one of your registrations.'
-)
-# Embargo related Mail objects
-PENDING_EMBARGO_ADMIN = Mail(
- 'pending_embargo_admin',
- subject='Admin decision pending for one of your registrations.'
-)
-PENDING_EMBARGO_NON_ADMIN = Mail(
- 'pending_embargo_non_admin',
- subject='Admin decision pending for one of your registrations.'
-)
-# Registration related Mail Objects
-PENDING_REGISTRATION_ADMIN = Mail(
- 'pending_registration_admin',
- subject='Admin decision pending for one of your registrations.'
-)
-PENDING_REGISTRATION_NON_ADMIN = Mail(
- 'pending_registration_non_admin',
- subject='Admin decision pending for one of your registrations.'
-)
-PENDING_EMBARGO_TERMINATION_ADMIN = Mail(
- 'pending_embargo_termination_admin',
- subject='Request to end an embargo early for one of your registrations.'
-)
-PENDING_EMBARGO_TERMINATION_NON_ADMIN = Mail(
- 'pending_embargo_termination_non_admin',
- subject='Request to end an embargo early for one of your projects.'
-)
-
-FILE_OPERATION_SUCCESS = Mail(
- 'file_operation_success',
- subject='Your ${action} has finished',
-)
-FILE_OPERATION_FAILED = Mail(
- 'file_operation_failed',
- subject='Your ${action} has failed',
-)
-
-UNESCAPE = '<% from osf.utils.sanitize import unescape_entities %> ${unescape_entities(src.title)}'
-PROBLEM_REGISTERING = 'Problem registering ' + UNESCAPE
-ARCHIVE_FILE_NOT_FOUND_DESK = Mail(
- 'archive_file_not_found_desk',
- subject=PROBLEM_REGISTERING
-)
-ARCHIVE_FILE_NOT_FOUND_USER = Mail(
- 'archive_file_not_found_user',
- subject='Registration failed because of altered files'
-)
-
-ARCHIVE_UNCAUGHT_ERROR_DESK = Mail(
- 'archive_uncaught_error_desk',
- subject=PROBLEM_REGISTERING
-)
-
-ARCHIVE_REGISTRATION_STUCK_DESK = Mail(
- 'archive_registration_stuck_desk',
- subject='[auto] Stuck registrations audit'
-)
-
-ARCHIVE_SUCCESS = Mail(
- 'archive_success',
- subject='Registration of ' + UNESCAPE + ' complete'
-)
-
-DUPLICATE_ACCOUNTS_OSF4I = Mail(
- 'duplicate_accounts_sso_osf4i',
- subject='Duplicate OSF Accounts'
-)
-
-ADD_SSO_EMAIL_OSF4I = Mail(
- 'add_sso_email_osf4i',
- subject='Your OSF Account Email Address'
-)
-
-EMPTY = Mail('empty', subject='${subject}')
-
-REVIEWS_SUBMISSION_CONFIRMATION = Mail(
- 'reviews_submission_confirmation',
- subject='Confirmation of your submission to ${provider_name}'
-)
-
-REVIEWS_RESUBMISSION_CONFIRMATION = Mail(
- 'reviews_resubmission_confirmation',
- subject='Confirmation of your submission to ${provider_name}'
-)
-
-CROSSREF_ERROR = Mail(
- 'crossref_doi_error',
- subject='There was an error creating a DOI for preprint(s). batch_id: ${batch_id}'
-)
-
-CROSSREF_DOIS_PENDING = Mail(
- 'crossref_doi_pending',
- subject='There are ${pending_doi_count} preprints with crossref DOI pending.'
-)
-
-WITHDRAWAL_REQUEST_GRANTED = Mail(
- 'withdrawal_request_granted',
- subject='Your ${document_type} has been withdrawn',
-)
-
-WITHDRAWAL_REQUEST_DECLINED = Mail(
- 'withdrawal_request_declined',
- subject='Your withdrawal request has been declined',
-)
-
-TOU_NOTIF = Mail(
- 'tou_notif',
- subject='Updated Terms of Use for COS Websites and Services',
-)
-
-REGISTRATION_BULK_UPLOAD_PRODUCT_OWNER = Mail(
- 'registration_bulk_upload_product_owner',
- subject='Registry Could Not Bulk Upload Registrations'
-)
-
-REGISTRATION_BULK_UPLOAD_SUCCESS_ALL = Mail(
- 'registration_bulk_upload_success_all',
- subject='Registrations Successfully Bulk Uploaded to your Community\'s Registry'
-)
-
-REGISTRATION_BULK_UPLOAD_SUCCESS_PARTIAL = Mail(
- 'registration_bulk_upload_success_partial',
- subject='Some Registrations Successfully Bulk Uploaded to your Community\'s Registry'
-)
-
-
-REGISTRATION_BULK_UPLOAD_FAILURE_DUPLICATES = Mail(
- 'registration_bulk_upload_failure_duplicates',
- subject='Registrations Were Not Bulk Uploaded to your Community\'s Registry'
-)
-
-REGISTRATION_BULK_UPLOAD_UNEXPECTED_FAILURE = Mail(
- 'registration_bulk_upload_unexpected_failure',
- subject='Registrations Were Not Bulk Uploaded to your Community\'s Registry'
-)
-
-SCHEMA_RESPONSE_INITIATED = Mail(
- 'updates_initiated',
- subject='Updates for ${resource_type} ${title} are in progress'
-)
-
-
-SCHEMA_RESPONSE_SUBMITTED = Mail(
- 'updates_pending_approval',
- subject='Updates for ${resource_type} ${title} are pending Admin approval'
-)
-
-
-SCHEMA_RESPONSE_APPROVED = Mail(
- 'updates_approved',
- subject='The updates for ${resource_type} ${title} have been approved'
-)
-
-
-SCHEMA_RESPONSE_REJECTED = Mail(
- 'updates_rejected',
- subject='The updates for ${resource_type} ${title} were not accepted'
-)
-
-ADDONS_BOA_JOB_COMPLETE = Mail(
- 'addons_boa_job_complete',
- subject='Your Boa job has completed'
-)
-
-ADDONS_BOA_JOB_FAILURE = Mail(
- 'addons_boa_job_failure',
- subject='Your Boa job has failed'
-)
-
-NODE_REQUEST_INSTITUTIONAL_ACCESS_REQUEST = Mail(
- 'node_request_institutional_access_request',
- subject='Institutional Access Project Request'
-)
-
-USER_MESSAGE_INSTITUTIONAL_ACCESS_REQUEST = Mail(
- 'user_message_institutional_access_request',
- subject='Message from Institutional Admin'
-)
-
-PROJECT_AFFILIATION_CHANGED = Mail(
- 'project_affiliation_changed',
- subject='Project Affiliation Changed'
-)
diff --git a/website/settings/local-dist.py b/website/settings/local-dist.py
index 4124d621450..c421be3759e 100644
--- a/website/settings/local-dist.py
+++ b/website/settings/local-dist.py
@@ -144,9 +144,6 @@ class CeleryConfig(defaults.CeleryConfig):
CHRONOS_USE_FAKE_FILE = True
CHRONOS_FAKE_FILE_URL = 'https://staging2.osf.io/r2t5v/download'
-# Show sent emails in console
-logging.getLogger('website.mails.mails').setLevel(logging.DEBUG)
-
SHARE_ENABLED = False
DATACITE_ENABLED = False
From dbf481556686d8e32f8dbbc8a23b028190944c6b Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 23:22:44 -0400
Subject: [PATCH 152/336] fixing mailhog tests to use Django mail BE
---
osf/email/__init__.py | 77 ++++++++++++----------------------
osf/models/notification.py | 8 ++--
website/settings/defaults.py | 1 -
website/settings/local-ci.py | 3 +-
website/settings/local-dist.py | 3 +-
5 files changed, 36 insertions(+), 56 deletions(-)
diff --git a/osf/email/__init__.py b/osf/email/__init__.py
index 9ac0a16e0b4..753c6087a48 100644
--- a/osf/email/__init__.py
+++ b/osf/email/__init__.py
@@ -1,6 +1,4 @@
import logging
-import smtplib
-from email.mime.text import MIMEText
import waffle
from sendgrid import SendGridAPIClient, Personalization, To, Cc, Category, ReplyTo, Bcc
@@ -11,7 +9,7 @@
from django.core.mail import EmailMessage, get_connection
-def send_email_over_smtp(to_addr, notification_type, context, email_context):
+def send_email_over_smtp(to_email, notification_type, context, email_context):
"""Send an email notification using SMTP. This is typically not used in productions as other 3rd party mail services
are preferred. This is to be used for tests and on staging environments and special situations.
@@ -23,36 +21,38 @@ def send_email_over_smtp(to_addr, notification_type, context, email_context):
"""
if not settings.MAIL_SERVER:
raise NotImplementedError('MAIL_SERVER is not set')
- if not settings.MAIL_USERNAME and settings.MAIL_PASSWORD:
- raise NotImplementedError('MAIL_USERNAME and MAIL_PASSWORD are required for STMP')
if waffle.switch_is_active(features.ENABLE_MAILHOG):
- send_to_mailhog(
- subject=notification_type.subject,
- message=notification_type.template.format(**context),
- to_email=to_addr,
- from_email=settings.MAIL_USERNAME,
- )
- return
+ host = settings.MAILHOG_HOST
+ port = settings.MAILHOG_PORT
+ else:
+ host = settings.MAIL_SERVER
+ port = settings.MAIL_PORT
- msg = MIMEText(
- notification_type.template.format(**context),
- 'html',
- _charset='utf-8'
+ email = EmailMessage(
+ subject=notification_type.subject.format(**context),
+ body=notification_type.template.format(**context),
+ from_email=settings.OSF_SUPPORT_EMAIL,
+ to=[to_email],
+ connection=get_connection(
+ backend='django.core.mail.backends.smtp.EmailBackend',
+ host=host,
+ port=port,
+ username=settings.MAIL_USERNAME,
+ password=settings.MAIL_PASSWORD,
+ use_tls=False,
+ use_ssl=False,
+ )
)
+ email.content_subtype = 'html'
- if notification_type.subject:
- msg['Subject'] = notification_type.subject.format(**context)
-
- with smtplib.SMTP(settings.MAIL_SERVER) as server:
- server.ehlo()
- server.login(settings.MAIL_USERNAME, settings.MAIL_PASSWORD)
- server.sendmail(
- settings.FROM_EMAIL,
- [to_addr],
- msg.as_string()
- )
+ if email_context:
+ attachment_name = email_context.get('attachment_name', None)
+ attachment_content = email_context.get('attachment_content', None)
+ if attachment_name and attachment_content:
+ email.attach(attachment_name, attachment_content)
+ email.send()
def send_email_with_send_grid(to_addr, notification_type, context, email_context):
"""Send an email notification using SendGrid.
@@ -115,26 +115,3 @@ def send_email_with_send_grid(to_addr, notification_type, context, email_context
except Exception as exc:
logging.error(f'Failed to send email notification to {to_addr}: {exc}')
raise exc
-
-def send_to_mailhog(subject, message, from_email, to_email, attachment_name=None, attachment_content=None):
- email = EmailMessage(
- subject=subject,
- body=message,
- from_email=from_email,
- to=[to_email],
- connection=get_connection(
- backend='django.core.mail.backends.smtp.EmailBackend',
- host=settings.MAILHOG_HOST,
- port=settings.MAILHOG_PORT,
- username='',
- password='',
- use_tls=False,
- use_ssl=False,
- )
- )
- email.content_subtype = 'html'
-
- if attachment_name and attachment_content:
- email.attach(attachment_name, attachment_content)
-
- email.send()
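For reference, the replacement path above reduces to plain Django mail: open an explicit SMTP connection and send a single HTML EmailMessage through it. A minimal standalone sketch, assuming a local MailHog instance on localhost:1025 and illustrative addresses (not the project's real settings), looks like this:

    from django.core.mail import EmailMessage, get_connection

    # Explicit SMTP connection; host/port assume a local MailHog instance.
    connection = get_connection(
        backend='django.core.mail.backends.smtp.EmailBackend',
        host='localhost',
        port=1025,
        username='',
        password='',
        use_tls=False,
        use_ssl=False,
    )

    # One HTML message sent over that connection; addresses are placeholders.
    email = EmailMessage(
        subject='Test notification',
        body='<p>Hello from the SMTP backend</p>',
        from_email='support@osf.io',
        to=['recipient@example.com'],
        connection=connection,
    )
    email.content_subtype = 'html'  # deliver the body as text/html
    email.send()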
diff --git a/osf/models/notification.py b/osf/models/notification.py
index 228ee3e9d5a..e0775b192d3 100644
--- a/osf/models/notification.py
+++ b/osf/models/notification.py
@@ -1,11 +1,13 @@
import logging
+import waffle
from django.db import models
from django.utils import timezone
from website import settings
from api.base import settings as api_settings
-from osf import email
+from osf import email, features
+
class Notification(models.Model):
subscription = models.ForeignKey(
@@ -31,7 +33,7 @@ def send(
raise NotImplementedError(f'Unsupported protocol type {protocol_type}; only email notifications are implemented.')
recipient_address = destination_address or self.subscription.user.username
- if protocol_type == 'email' and settings.ENABLE_TEST_EMAIL:
+ if protocol_type == 'email' and waffle.switch_is_active(features.ENABLE_MAILHOG):
email.send_email_over_smtp(
recipient_address,
self.subscription.notification_type,
@@ -41,7 +43,7 @@ def send(
elif protocol_type == 'email' and settings.DEV_MODE:
if not api_settings.CI_ENV:
logging.info(
- f"Attempting to send email in DEV_MODE with ENABLE_TEST_EMAIL false just logs:"
+                f"Attempting to send email in DEV_MODE; logging instead of sending:"
f"\nto={recipient_address}"
f"\ntype={self.subscription.notification_type.name}"
f"\ncontext={self.event_context}"
diff --git a/website/settings/defaults.py b/website/settings/defaults.py
index a68414b6763..7581393e1db 100644
--- a/website/settings/defaults.py
+++ b/website/settings/defaults.py
@@ -143,7 +143,6 @@ def parent_dir(path):
USE_CDN_FOR_CLIENT_LIBS = True
FROM_EMAIL = 'openscienceframework-noreply@osf.io'
-ENABLE_TEST_EMAIL = False
# support email
OSF_SUPPORT_EMAIL = 'support@osf.io'
# contact email
diff --git a/website/settings/local-ci.py b/website/settings/local-ci.py
index 2cab1ca4252..83049429921 100644
--- a/website/settings/local-ci.py
+++ b/website/settings/local-ci.py
@@ -47,9 +47,10 @@
USE_CELERY = False
# Email
-MAIL_SERVER = 'localhost:1025' # For local testing
MAIL_USERNAME = 'osf-smtp'
MAIL_PASSWORD = 'CHANGEME'
+MAIL_SERVER = 'localhost' # For local testing
+MAIL_PORT = 1025 # For local testing
MAILHOG_HOST = 'localhost'
MAILHOG_PORT = 1025
diff --git a/website/settings/local-dist.py b/website/settings/local-dist.py
index c421be3759e..cb5e72f9b2e 100644
--- a/website/settings/local-dist.py
+++ b/website/settings/local-dist.py
@@ -57,9 +57,10 @@
ELASTIC_TIMEOUT = 10
# Email
-MAIL_SERVER = 'localhost:1025' # For local testing
MAIL_USERNAME = 'osf-smtp'
MAIL_PASSWORD = 'CHANGEME'
+MAIL_SERVER = 'localhost' # For local testing
+MAIL_PORT = 1025  # For local testing
MAILHOG_HOST = 'mailhog'
MAILHOG_PORT = 1025
From ebffbe411e3a7d660763dc4eae1630415bd75fe7 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 23:26:06 -0400
Subject: [PATCH 153/336] more mails removal
---
addons/boa/tests/test_tasks.py | 76 ++++-----
.../commands/check_crossref_dois.py | 161 ------------------
osf/management/commands/email_all_users.py | 116 -------------
osf/management/commands/find_spammy_files.py | 114 -------------
osf_tests/test_archiver.py | 2 -
scripts/osfstorage/usage_audit.py | 1 -
scripts/stuck_registration_audit.py | 6 +-
tests/test_auth.py | 1 -
tests/test_auth_views.py | 2 +-
9 files changed, 35 insertions(+), 444 deletions(-)
delete mode 100644 osf/management/commands/check_crossref_dois.py
delete mode 100644 osf/management/commands/email_all_users.py
delete mode 100644 osf/management/commands/find_spammy_files.py
diff --git a/addons/boa/tests/test_tasks.py b/addons/boa/tests/test_tasks.py
index f31185fa789..1580205048e 100644
--- a/addons/boa/tests/test_tasks.py
+++ b/addons/boa/tests/test_tasks.py
@@ -8,7 +8,7 @@
from addons.boa import settings as boa_settings
from addons.boa.boa_error_code import BoaErrorCode
-from addons.boa.tasks import submit_to_boa, submit_to_boa_async, handle_boa_error
+from addons.boa.tasks import submit_to_boa, handle_boa_error
from osf.models import NotificationType
from osf_tests.factories import AuthUserFactory, ProjectFactory
from tests.base import OsfTestCase
@@ -50,7 +50,6 @@ def test_boa_error_code(self):
assert BoaErrorCode.FILE_TOO_LARGE_ERROR == 6
assert BoaErrorCode.JOB_TIME_OUT_ERROR == 7
- @mock.patch('website.mails.settings.USE_CELERY', False)
def test_handle_boa_error(self):
with mock.patch('addons.boa.tasks.sentry.log_message', return_value=None) as mock_sentry_log_message, \
mock.patch('addons.boa.tasks.logger.error', return_value=None) as mock_logger_error:
@@ -59,16 +58,15 @@ def test_handle_boa_error(self):
self.error_message,
BoaErrorCode.UNKNOWN,
self.user_username,
- self.user_fullname,
self.project_url,
self.file_full_path,
- query_file_name=self.query_file_name,
- file_size=self.file_size,
- output_file_name=self.output_file_name,
- job_id=self.job_id
+ self.query_file_name,
+ self.file_size,
+ self.output_file_name,
+ self.job_id
)
assert len(notifications) == 1
- assert notifications[0]['typr'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
+ assert notifications[0]['type'] == NotificationType.Type.ADDONS_BOA_JOB_FAILURE
mock_sentry_log_message.assert_called_with(self.error_message, skip_session=True)
mock_logger_error.assert_called_with(self.error_message)
assert return_value == BoaErrorCode.UNKNOWN
@@ -90,30 +88,21 @@ def setUp(self):
self.query_download_url = f'http://localhost:7777/v1/resources/{self.project_guid}/providers/osfstorage/1a2b3c4d'
self.output_upload_url = f'http://localhost:7777/v1/resources/{self.project_guid}/providers/osfstorage/?kind=file'
- def tearDown(self):
- super().tearDown()
-
def test_submit_to_boa_async_called(self):
- with mock.patch(
- 'addons.boa.tasks.submit_to_boa_async',
- new_callable=AsyncMock,
- return_value=BoaErrorCode.NO_ERROR
- ) as mock_submit_to_boa_async:
- return_value = submit_to_boa(
- self.host,
- self.username,
- self.password,
- self.user_guid,
- self.project_guid,
- self.query_dataset,
- self.query_file_name,
- self.file_size,
- self.file_full_path,
- self.query_download_url,
- self.output_upload_url
- )
- assert return_value == BoaErrorCode.NO_ERROR
- mock_submit_to_boa_async.assert_called()
+ return_value = submit_to_boa(
+ self.host,
+ self.username,
+ self.password,
+ self.user_guid,
+ self.project_guid,
+ self.query_dataset,
+ self.query_file_name,
+ self.file_size,
+ self.file_full_path,
+ self.query_download_url,
+ self.output_upload_url
+ )
+ assert return_value == BoaErrorCode.NO_ERROR
@pytest.mark.django_db
@@ -150,7 +139,6 @@ def setUp(self):
boa_settings.REFRESH_JOB_INTERVAL = DEFAULT_REFRESH_JOB_INTERVAL
boa_settings.MAX_JOB_WAITING_TIME = DEFAULT_MAX_JOB_WAITING_TIME
- @mock.patch('website.mails.settings.USE_CELERY', False)
async def test_submit_success(self):
with mock.patch('osf.models.user.OSFUser.objects.get', return_value=self.user), \
mock.patch('osf.models.user.OSFUser.get_or_create_cookie', return_value=self.user_cookie), \
@@ -162,7 +150,7 @@ async def test_submit_success(self):
mock.patch('asyncio.sleep', new_callable=AsyncMock, return_value=None) as mock_async_sleep, \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
with capture_notifications() as notifications:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -190,7 +178,7 @@ async def test_download_error(self):
mock.patch('osf.models.user.OSFUser.get_or_create_cookie', return_value=self.user_cookie), \
mock.patch('urllib.request.urlopen', side_effect=http_404), \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -221,7 +209,7 @@ async def test_login_error(self):
mock.patch('boaapi.boa_client.BoaClient.login', side_effect=BoaException()) as mock_login, \
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -255,7 +243,7 @@ async def test_data_set_error(self):
mock.patch('boaapi.boa_client.BoaClient.get_dataset', side_effect=BoaException()) as mock_get_dataset, \
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -290,7 +278,7 @@ async def test_submit_error(self):
mock.patch('boaapi.boa_client.BoaClient.query', side_effect=BoaException()) as mock_query, \
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -328,7 +316,7 @@ async def test_compile_error(self):
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('asyncio.sleep', new_callable=AsyncMock, return_value=None), \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -366,7 +354,7 @@ async def test_execute_error(self):
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('asyncio.sleep', new_callable=AsyncMock, return_value=None), \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -403,7 +391,7 @@ async def test_output_error_(self):
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('asyncio.sleep', new_callable=AsyncMock, return_value=None), \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -441,7 +429,7 @@ async def test_upload_error_conflict(self):
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('asyncio.sleep', new_callable=AsyncMock, return_value=None), \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -479,7 +467,7 @@ async def test_upload_error_other(self):
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('asyncio.sleep', new_callable=AsyncMock, return_value=None), \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -510,7 +498,7 @@ async def test_file_too_large_error(self):
with mock.patch('osf.models.user.OSFUser.objects.get', return_value=self.user), \
mock.patch('osf.models.user.OSFUser.get_or_create_cookie', return_value=self.user_cookie), \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
@@ -546,7 +534,7 @@ async def test_job_timeout_error(self):
mock.patch('boaapi.boa_client.BoaClient.query', return_value=self.mock_job), \
mock.patch('boaapi.boa_client.BoaClient.close', return_value=None) as mock_close, \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
- return_value = await submit_to_boa_async(
+ return_value = submit_to_boa(
self.host,
self.username,
self.password,
diff --git a/osf/management/commands/check_crossref_dois.py b/osf/management/commands/check_crossref_dois.py
deleted file mode 100644
index bff7ca7e07f..00000000000
--- a/osf/management/commands/check_crossref_dois.py
+++ /dev/null
@@ -1,161 +0,0 @@
-from datetime import timedelta
-import logging
-import requests
-
-import django
-from django.core.mail import send_mail
-from django.core.management.base import BaseCommand
-from django.utils import timezone
-django.setup()
-
-from framework import sentry
-from framework.celery_tasks import app as celery_app
-from osf.models import Guid, Preprint
-from website import mails, settings
-
-
-logger = logging.getLogger(__name__)
-logging.basicConfig(level=logging.INFO)
-
-time_since_published = timedelta(days=settings.DAYS_CROSSREF_DOIS_MUST_BE_STUCK_BEFORE_EMAIL)
-
-CHECK_DOIS_BATCH_SIZE = 20
-
-
-def pop_slice(lis, n):
- tem = lis[:n]
- del lis[:n]
- return tem
-
-def mint_doi_for_preprints_locally(confirm_local=False):
- """This method creates identifiers for preprints which have pending DOI in local environment only.
- """
- if not settings.DEV_MODE or not settings.DEBUG_MODE:
- logger.error('This command should only run in the local development environment.')
- return
- if not confirm_local:
- logger.error('You must explicitly set `confirm_local` to run this command.')
- return
-
- preprints_with_pending_doi = Preprint.objects.filter(preprint_doi_created__isnull=True, is_published=True)
- total_created = 0
- for preprint in preprints_with_pending_doi:
- client = preprint.get_doi_client()
- doi = client.build_doi(preprint=preprint) if client else None
- if doi:
- logger.info(f'Minting DOI [{doi}] for Preprint [{preprint._id}].')
- preprint.set_identifier_values(doi, save=True)
- total_created += 1
- logger.info(f'[{total_created}] DOIs minted.')
-
-def check_crossref_dois(dry_run=True):
- """
- This script is to check for any DOI confirmation messages we may have missed during downtime and alert admins to any
- DOIs that have been pending for X number of days. It creates url to check with crossref if all our pending crossref
- DOIs are minted, then sets all identifiers which are confirmed minted.
-
- :param dry_run:
- :return:
- """
-
- preprints_with_pending_dois = Preprint.objects.filter(
- preprint_doi_created__isnull=True,
- is_published=True
- ).exclude(date_published__gt=timezone.now() - time_since_published)
-
- if not preprints_with_pending_dois.exists():
- return
-
- preprints = list(preprints_with_pending_dois)
-
- while preprints:
- preprint_batch = pop_slice(preprints, CHECK_DOIS_BATCH_SIZE)
-
- pending_dois = []
- for preprint in preprint_batch:
- doi_prefix = preprint.provider.doi_prefix
- if not doi_prefix:
- sentry.log_message(f'Preprint [_id={preprint._id}] has been skipped for CrossRef DOI Check '
- f'since the provider [_id={preprint.provider._id}] has invalid DOI Prefix '
- f'[doi_prefix={doi_prefix}]')
- continue
- pending_dois.append(f'doi:{settings.DOI_FORMAT.format(prefix=doi_prefix, guid=preprint._id)}')
-
- if not pending_dois:
- continue
-
- url = '{}works?filter={}'.format(settings.CROSSREF_JSON_API_URL, ','.join(pending_dois))
-
- try:
- resp = requests.get(url)
- resp.raise_for_status()
- except requests.exceptions.HTTPError as exc:
- sentry.log_message(f'Could not contact crossref to check for DOIs, response returned with exception {exc}')
- continue
-
- preprints_response = resp.json()['message']['items']
-
- for preprint in preprints_response:
- preprint__id = preprint['DOI'].split('/')[-1]
- base_guid, version = Guid.split_guid(preprint__id)
- if not base_guid or not version:
- sentry.log_message(f'[Skipped] Preprint [_id={preprint__id}] returned by CrossRef API has invalid _id')
- continue
- pending_preprint = preprints_with_pending_dois.filter(
- versioned_guids__guid___id=base_guid,
- versioned_guids__version=version,
- ).first()
- if not pending_preprint:
- sentry.log_message(f'[Skipped] Preprint [_id={preprint__id}] returned by CrossRef API is not found.')
- continue
- if not dry_run:
- logger.debug(f'Set identifier for {pending_preprint._id}')
- pending_preprint.set_identifier_values(preprint['DOI'], save=True)
- else:
- logger.info(f'DRY RUN: Set identifier for {pending_preprint._id}')
-
-
-def report_stuck_dois(dry_run=True):
-
- preprints_with_pending_dois = Preprint.objects.filter(preprint_doi_created__isnull=True,
- is_published=True,
- date_published__lt=timezone.now() - time_since_published)
-
- if preprints_with_pending_dois:
- guids = ', '.join(preprints_with_pending_dois.values_list('guids___id', flat=True))
- if not dry_run:
- send_mail(
- to_addr=settings.OSF_SUPPORT_EMAIL,
- mail=mails.CROSSREF_DOIS_PENDING,
- pending_doi_count=preprints_with_pending_dois.count(),
- time_since_published=time_since_published.days,
- guids=guids,
- )
- else:
- logger.info('DRY RUN')
-
- logger.info(f'There were {preprints_with_pending_dois.count()} stuck registrations for CrossRef, email sent to help desk')
-
-
-@celery_app.task(name='management.commands.check_crossref_dois')
-def main(dry_run=False):
- check_crossref_dois(dry_run=dry_run)
- report_stuck_dois(dry_run=dry_run)
-
-
-class Command(BaseCommand):
- help = '''Checks if we've missed any Crossref DOI confirmation emails. '''
-
- def add_arguments(self, parser):
- super().add_arguments(parser)
- parser.add_argument(
- '--dry',
- action='store_true',
- dest='dry_run',
- help='Dry run',
- )
-
- # Management command handler
- def handle(self, *args, **options):
- dry_run = options.get('dry_run', True)
- main(dry_run=dry_run)
diff --git a/osf/management/commands/email_all_users.py b/osf/management/commands/email_all_users.py
deleted file mode 100644
index 774f8b5af2d..00000000000
--- a/osf/management/commands/email_all_users.py
+++ /dev/null
@@ -1,116 +0,0 @@
-# This is a management command, rather than a migration script, for two primary reasons:
-# 1. It makes no changes to database structure (e.g. AlterField), only database content.
-# 2. It takes a long time to run and the site doesn't need to be down that long.
-
-import logging
-
-
-import django
-from django.core.mail import send_mail
-
-django.setup()
-
-from django.core.management.base import BaseCommand
-from framework import sentry
-
-from website import mails
-
-from osf.models import OSFUser
-
-logger = logging.getLogger(__name__)
-
-OFFSET = 500000
-
-def email_all_users(email_template, dry_run=False, ids=None, start_id=0, offset=OFFSET):
-
- if ids:
- active_users = OSFUser.objects.filter(id__in=ids)
- else:
- lower_bound = start_id
- upper_bound = start_id + offset
- base_query = OSFUser.objects.filter(date_confirmed__isnull=False, deleted=None).exclude(date_disabled__isnull=False).exclude(is_active=False)
- active_users = base_query.filter(id__gt=lower_bound, id__lte=upper_bound).order_by('id')
-
- if dry_run:
- active_users = active_users.exclude(is_superuser=False)
-
- total_active_users = active_users.count()
-
- logger.info(f'About to send an email to {total_active_users} users.')
-
- template = getattr(mails, email_template, None)
- if not template:
- raise RuntimeError('Invalid email template specified!')
-
- total_sent = 0
- for user in active_users.iterator():
- logger.info(f'Sending email to {user.id}')
- try:
- send_mail(
- to_addr=user.email,
- mail=template,
- given_name=user.given_name or user.fullname,
- )
- except Exception as e:
- logger.error(f'Exception encountered sending email to {user.id}')
- sentry.log_exception(e)
- continue
- else:
- total_sent += 1
-
- logger.info(f'Emails sent to {total_sent}/{total_active_users} users')
-
-
-class Command(BaseCommand):
- """
- Add subscription to all active users for given notification type.
- """
- def add_arguments(self, parser):
- super().add_arguments(parser)
- parser.add_argument(
- '--dry',
- action='store_true',
- dest='dry_run',
- help='Test - Only send to superusers'
- )
-
- parser.add_argument(
- '--t',
- type=str,
- dest='template',
- required=True,
- help='Specify which template to use'
- )
-
- parser.add_argument(
- '--start-id',
- type=int,
- dest='start_id',
- default=0,
- help='Specify id to start from.'
- )
-
- parser.add_argument(
- '--ids',
- dest='ids',
- nargs='+',
- help='Specific IDs to email, otherwise will email all users'
- )
-
- parser.add_argument(
- '--o',
- type=int,
- dest='offset',
- default=OFFSET,
- help=f'How many users to email in this run, default is {OFFSET}'
- )
-
- def handle(self, *args, **options):
- dry_run = options.get('dry_run', False)
- template = options.get('template')
- start_id = options.get('start_id')
- ids = options.get('ids')
- offset = options.get('offset', OFFSET)
- email_all_users(template, dry_run, start_id=start_id, ids=ids, offset=offset)
- if dry_run:
- raise RuntimeError('Dry run, only superusers emailed')
diff --git a/osf/management/commands/find_spammy_files.py b/osf/management/commands/find_spammy_files.py
deleted file mode 100644
index 7feeab508fa..00000000000
--- a/osf/management/commands/find_spammy_files.py
+++ /dev/null
@@ -1,114 +0,0 @@
-import io
-import csv
-from datetime import timedelta
-import logging
-
-from django.core.mail import send_mail
-from django.core.management.base import BaseCommand
-from django.utils import timezone
-
-from addons.osfstorage.models import OsfStorageFile
-from framework.celery_tasks import app
-from website import mails
-
-logger = logging.getLogger(__name__)
-
-
-@app.task(name='osf.management.commands.find_spammy_files')
-def find_spammy_files(sniff_r=None, n=None, t=None, to_addrs=None):
- if not sniff_r:
- raise RuntimeError('Require arg sniff_r not found')
- if isinstance(sniff_r, str):
- sniff_r = [sniff_r]
- if isinstance(to_addrs, str):
- to_addrs = [to_addrs]
- for sniff in sniff_r:
- filename = f'spam_files_{sniff}.csv'
- filepath = f'/tmp/{filename}'
- fieldnames = ['f.name', 'f._id', 'f.created', 'n._id', 'u._id', 'u.username', 'u.fullname']
- output = io.StringIO()
- writer = csv.DictWriter(output, fieldnames)
- writer.writeheader()
- qs = OsfStorageFile.objects.filter(name__iregex=sniff)
- if t:
- qs = qs.filter(created__gte=timezone.now() - timedelta(days=t))
- if n:
- qs = qs[:n]
- ct = 0
- for f in qs:
- node = f.target
- user = getattr(f.versions.first(), 'creator', node.creator)
- if f.target.deleted or user.is_disabled:
- continue
- ct += 1
- writer.writerow({
- 'f.name': f.name,
- 'f._id': f._id,
- 'f.created': f.created,
- 'n._id': node._id,
- 'u._id': user._id,
- 'u.username': user.username,
- 'u.fullname': user.fullname
- })
- if ct:
- if to_addrs:
- for addr in to_addrs:
- send_mail(
- mail=mails.SPAM_FILES_DETECTED,
- to_addr=addr,
- ct=ct,
- sniff_r=sniff,
- attachment_name=filename,
- attachment_content=output.getvalue(),
- can_change_preferences=False,
- )
- else:
- with open(filepath, 'w') as writeFile:
- writeFile.write(output.getvalue())
-
-class Command(BaseCommand):
- help = '''Script to match filenames to common spammy names.'''
-
- def add_arguments(self, parser):
- parser.add_argument(
- '--sniff_r',
- type=str,
- nargs='+',
- required=True,
- help='Regex to match against file.name',
- )
- parser.add_argument(
- '--n',
- type=int,
- default=None,
- help='Max number of files to return',
- )
- parser.add_argument(
- '--t',
- type=int,
- default=None,
- help='Number of days to search through',
- )
- parser.add_argument(
- '--to_addrs',
- type=str,
- nargs='*',
- default=None,
- help='Email address(es) to send the resulting file to. If absent, write to csv in /tmp/',
- )
-
- def handle(self, *args, **options):
- script_start_time = timezone.now()
- logger.info(f'Script started time: {script_start_time}')
- logger.debug(options)
-
- sniff_r = options.get('sniff_r')
- n = options.get('n', None)
- t = options.get('t', None)
- to_addrs = options.get('to_addrs', None)
-
- find_spammy_files(sniff_r=sniff_r, n=n, t=t, to_addrs=to_addrs)
-
- script_finish_time = timezone.now()
- logger.info(f'Script finished time: {script_finish_time}')
- logger.info(f'Run time {script_finish_time - script_start_time}')
diff --git a/osf_tests/test_archiver.py b/osf_tests/test_archiver.py
index bc5efc2c3f9..34394e9a39c 100644
--- a/osf_tests/test_archiver.py
+++ b/osf_tests/test_archiver.py
@@ -12,8 +12,6 @@
from framework.auth import Auth
from framework.celery_tasks import handlers
-from website import mails
-
from website.archiver import (
ARCHIVER_INITIATED,
)
diff --git a/scripts/osfstorage/usage_audit.py b/scripts/osfstorage/usage_audit.py
index c50e3f57640..200f3fda0e7 100644
--- a/scripts/osfstorage/usage_audit.py
+++ b/scripts/osfstorage/usage_audit.py
@@ -21,7 +21,6 @@
from framework.celery_tasks import app as celery_app
from osf.models import TrashedFile, Node
-from website import mails
from website.app import init_app
from website.settings.defaults import GBs
diff --git a/scripts/stuck_registration_audit.py b/scripts/stuck_registration_audit.py
index c9bce059fb9..d165b256a61 100644
--- a/scripts/stuck_registration_audit.py
+++ b/scripts/stuck_registration_audit.py
@@ -9,15 +9,13 @@
from django.utils import timezone
-from website import mails
from website import settings
from framework.auth import Auth
from framework.celery_tasks import app as celery_app
from osf.management.commands import force_archive as fa
-from osf.models import ArchiveJob, Registration, NotificationType
-from website.archiver import ARCHIVER_INITIATED
-from website.settings import ARCHIVE_TIMEOUT_TIMEDELTA, ADDONS_REQUESTED
+from osf.models import Registration, NotificationType
+from website.settings import ADDONS_REQUESTED
from scripts import utils as scripts_utils
diff --git a/tests/test_auth.py b/tests/test_auth.py
index 05f6d243e33..1b7ec29df1e 100644
--- a/tests/test_auth.py
+++ b/tests/test_auth.py
@@ -27,7 +27,6 @@
from osf.models import OSFUser, NotificationType
from osf.utils import permissions
from tests.utils import capture_notifications
-from website import mails
from website import settings
from website.project.decorators import (
must_have_permission,
diff --git a/tests/test_auth_views.py b/tests/test_auth_views.py
index ca4476d17d3..7c6e282a07f 100644
--- a/tests/test_auth_views.py
+++ b/tests/test_auth_views.py
@@ -38,7 +38,7 @@
fake,
OsfTestCase,
)
-from website import mails, settings
+from website import settings
from website.util import api_url_for, web_url_for
pytestmark = pytest.mark.django_db
From 795a7e6500e9d11c57eeb6b2947daa7d02a549e9 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 23:36:21 -0400
Subject: [PATCH 154/336] fix send grid code
---
osf/email/__init__.py | 6 +++---
tests/test_preprints.py | 2 +-
tests/test_spam_mixin.py | 2 +-
3 files changed, 5 insertions(+), 5 deletions(-)
diff --git a/osf/email/__init__.py b/osf/email/__init__.py
index 753c6087a48..5b2dae93a04 100644
--- a/osf/email/__init__.py
+++ b/osf/email/__init__.py
@@ -85,19 +85,19 @@ def send_email_with_send_grid(to_addr, notification_type, context, email_context
personalization.add_to(To(to_addr))
- if cc_addr := email_context.get('cc_addr'):
+ if cc_addr := email_context.get('cc_addr', None):
if isinstance(cc_addr, str):
cc_addr = [cc_addr]
for email in cc_addr:
personalization.add_cc(Cc(email))
- if bcc_addr := email_context.get('cc_addr'):
+ if bcc_addr := email_context.get('bcc_addr', None):
if isinstance(bcc_addr, str):
bcc_addr = [bcc_addr]
for email in bcc_addr:
personalization.add_bcc(Bcc(email))
- if reply_to := email_context.get('reply_to'):
+ if reply_to := email_context.get('reply_to', None):
message.reply_to = ReplyTo(reply_to)
message.add_personalization(personalization)
diff --git a/tests/test_preprints.py b/tests/test_preprints.py
index 724dda3b0ae..91ed769a3e7 100644
--- a/tests/test_preprints.py
+++ b/tests/test_preprints.py
@@ -44,7 +44,7 @@
from osf.utils.workflows import DefaultStates, RequestTypes, ReviewStates
from tests.base import assert_datetime_equal, OsfTestCase
from tests.utils import assert_preprint_logs, capture_notifications
-from website import settings, mails
+from website import settings
from website.identifiers.clients import CrossRefClient, ECSArXivCrossRefClient, crossref
from website.identifiers.utils import request_identifiers
from website.preprints.tasks import (
diff --git a/tests/test_spam_mixin.py b/tests/test_spam_mixin.py
index 59b04ec1fa9..2c4cba1c8d7 100644
--- a/tests/test_spam_mixin.py
+++ b/tests/test_spam_mixin.py
@@ -12,7 +12,7 @@
from osf_tests.factories import UserFactory, CommentFactory, ProjectFactory, PreprintFactory, RegistrationFactory, AuthUserFactory
from osf.models import NotableDomain, SpamStatus, NotificationType
from tests.utils import capture_notifications
-from website import settings, mails
+from website import settings
@pytest.mark.django_db
From a90a2340c1abbcedaa0adbfb144e776b90958215 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Wed, 30 Jul 2025 23:48:27 -0400
Subject: [PATCH 155/336] fix more sanctions code
---
notifications.yaml | 4 ++++
osf/utils/notifications.py | 5 +++--
website/reviews/listeners.py | 30 +++++++++++++++++++++++++++++-
3 files changed, 36 insertions(+), 3 deletions(-)
diff --git a/notifications.yaml b/notifications.yaml
index 1cc60553ac9..03c74a3bb03 100644
--- a/notifications.yaml
+++ b/notifications.yaml
@@ -305,6 +305,10 @@ notification_types:
__docs__: ...
object_content_type_model_name: abstractnode
template: 'website/templates/emails/pending_registration_admin.html.mako'
+ - name: node_pending_registration_non_admin
+ __docs__: ...
+ object_content_type_model_name: abstractnode
+ template: 'website/templates/emails/pending_registration_non_admin.html.mako'
- name: node_pending_retraction_admin
__docs__: ...
object_content_type_model_name: abstractnode
diff --git a/osf/utils/notifications.py b/osf/utils/notifications.py
index 8e432af12a5..76d6e255668 100644
--- a/osf/utils/notifications.py
+++ b/osf/utils/notifications.py
@@ -42,12 +42,13 @@ def notify_submit(resource, user, *args, **kwargs):
context=context,
recipients=recipients,
resource=resource,
+ notification_type=NotificationType.Type.PROVIDER_REVIEWS_SUBMISSION_CONFIRMATION
)
reviews_signals.reviews_email_submit_moderators_notifications.send(
timestamp=timezone.now(),
context=context,
resource=resource,
- user=user
+ user=user,
)
@@ -59,7 +60,7 @@ def notify_resubmit(resource, user, *args, **kwargs):
reviews_signals.reviews_email_submit.send(
recipients=recipients,
context=context,
- template=NotificationType.Type.PROVIDER_REVIEWS_RESUBMISSION_CONFIRMATION,
+ notification_type=NotificationType.Type.PROVIDER_REVIEWS_RESUBMISSION_CONFIRMATION,
resource=resource,
)
reviews_signals.reviews_email_submit_moderators_notifications.send(
diff --git a/website/reviews/listeners.py b/website/reviews/listeners.py
index 6fa873e53a9..be4b3ff7c82 100644
--- a/website/reviews/listeners.py
+++ b/website/reviews/listeners.py
@@ -1,4 +1,5 @@
-from website.settings import DOMAIN
+from osf.models import NotificationType
+from website.settings import DOMAIN, OSF_PREPRINTS_LOGO, OSF_REGISTRIES_LOGO
from website.reviews import signals as reviews_signals
@@ -54,3 +55,30 @@ def reviews_withdrawal_requests_notification(self, timestamp, context):
user=recipient,
event_context=context,
)
+
+
+@reviews_signals.reviews_email_submit.connect
+def reviews_submit_notification(self, recipients, context, resource, notification_type=None):
+ """
+ Handle email notifications for a new submission or a resubmission
+ """
+ provider = resource.provider
+ if provider._id == 'osf':
+ if provider.type == 'osf.preprintprovider':
+ context['logo'] = OSF_PREPRINTS_LOGO
+ elif provider.type == 'osf.registrationprovider':
+ context['logo'] = OSF_REGISTRIES_LOGO
+ else:
+ raise NotImplementedError()
+ else:
+ context['logo'] = resource.provider._id
+
+ for recipient in recipients:
+ context['is_creator'] = recipient == resource.creator
+ context['provider_name'] = resource.provider.name
+ NotificationType.objects.get(
+ name=notification_type
+ ).emit(
+ user=recipient,
+ event_context=context
+ )
From 8421403b7aa25a5c1979b5ea5b484c8512dcc503 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Thu, 31 Jul 2025 07:34:13 -0400
Subject: [PATCH 156/336] fix more schema responses
---
.../test_check_crossref_dois.py | 72 -------------------
.../test_email_all_users.py | 71 ------------------
osf_tests/test_schema_responses.py | 1 +
3 files changed, 1 insertion(+), 143 deletions(-)
delete mode 100644 osf_tests/management_commands/test_check_crossref_dois.py
delete mode 100644 osf_tests/management_commands/test_email_all_users.py
diff --git a/osf_tests/management_commands/test_check_crossref_dois.py b/osf_tests/management_commands/test_check_crossref_dois.py
deleted file mode 100644
index 802ce4fde0b..00000000000
--- a/osf_tests/management_commands/test_check_crossref_dois.py
+++ /dev/null
@@ -1,72 +0,0 @@
-import os
-from unittest import mock
-import pytest
-import json
-from datetime import timedelta
-import responses
-
-from osf.models import NotificationType
-from tests.utils import capture_notifications
-
-HERE = os.path.dirname(os.path.abspath(__file__))
-
-
-from osf_tests.factories import PreprintFactory
-from website import settings
-
-from osf.management.commands.check_crossref_dois import check_crossref_dois, report_stuck_dois
-
-
-@pytest.mark.django_db
-class TestCheckCrossrefDOIs:
-
- @pytest.fixture()
- def preprint(self):
- return PreprintFactory()
-
- @pytest.fixture()
- def stuck_preprint(self):
- preprint = PreprintFactory(set_doi=False, set_guid='guid0')
- preprint.date_published = preprint.date_published - timedelta(days=settings.DAYS_CROSSREF_DOIS_MUST_BE_STUCK_BEFORE_EMAIL + 1)
- # match guid to the fixture crossref_works_response.json
- guid = preprint.guids.first()
- provider = preprint.provider
- provider.doi_prefix = '10.31236'
- provider.save()
- guid._id = 'guid0'
- guid.save()
-
- preprint.save()
- return preprint
-
- @pytest.fixture()
- def crossref_response(self):
- with open(os.path.join(HERE, 'fixtures/crossref_works_response.json'), 'rb') as fp:
- return json.loads(fp.read())
-
- @responses.activate
- @mock.patch('osf.models.preprint.update_or_enqueue_on_preprint_updated', mock.Mock())
- def test_check_crossref_dois(self, crossref_response, stuck_preprint, preprint):
- doi = settings.DOI_FORMAT.format(prefix=stuck_preprint.provider.doi_prefix, guid=stuck_preprint._id)
- responses.add(
- responses.Response(
- responses.GET,
- url=f'{settings.CROSSREF_JSON_API_URL}works?filter=doi:{doi}',
- json=crossref_response,
- status=200
- )
- )
-
- check_crossref_dois(dry_run=False)
-
- assert preprint.identifiers.count() == 1
-
- assert stuck_preprint.identifiers.count() == 1
- assert stuck_preprint.identifiers.first().value == doi
-
- def test_report_stuck_dois(self, stuck_preprint):
- with capture_notifications() as notifications:
- report_stuck_dois(dry_run=False)
-
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_REQUEST_ACCESS_DENIED
diff --git a/osf_tests/management_commands/test_email_all_users.py b/osf_tests/management_commands/test_email_all_users.py
deleted file mode 100644
index 9141e6b50d4..00000000000
--- a/osf_tests/management_commands/test_email_all_users.py
+++ /dev/null
@@ -1,71 +0,0 @@
-import pytest
-
-from django.utils import timezone
-
-from osf.models import NotificationType
-from osf_tests.factories import UserFactory
-
-from osf.management.commands.email_all_users import email_all_users
-from tests.utils import capture_notifications
-
-
-class TestEmailAllUsers:
-
- @pytest.fixture()
- def user(self):
- return UserFactory(id=1)
-
- @pytest.fixture()
- def user2(self):
- return UserFactory(id=2)
-
- @pytest.fixture()
- def superuser(self):
- user = UserFactory()
- user.is_superuser = True
- user.save()
- return user
-
- @pytest.fixture()
- def deleted_user(self):
- return UserFactory(deleted=timezone.now())
-
- @pytest.fixture()
- def inactive_user(self):
- return UserFactory(is_disabled=True)
-
- @pytest.fixture()
- def unconfirmed_user(self):
- return UserFactory(date_confirmed=None)
-
- @pytest.fixture()
- def unregistered_user(self):
- return UserFactory(is_registered=False)
-
- @pytest.mark.django_db
- def test_email_all_users_dry(self, superuser):
- with capture_notifications() as notifications:
- email_all_users('TOU_NOTIF', dry_run=True)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
-
- @pytest.mark.django_db
- def test_dont_email_inactive_users(
- self, deleted_user, inactive_user, unconfirmed_user, unregistered_user):
-
- with capture_notifications() as notifications:
- email_all_users('TOU_NOTIF')
- assert not notifications
-
- @pytest.mark.django_db
- def test_email_all_users_offset(self, user, user2):
- with capture_notifications() as notifications:
- email_all_users('TOU_NOTIF', offset=1, start_id=0)
-
- email_all_users('TOU_NOTIF', offset=1, start_id=1)
-
- email_all_users('TOU_NOTIF', offset=1, start_id=2)
-
- assert len(notifications) == 2
- assert notifications[0]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
- assert notifications[1]['type'] == NotificationType.Type.PROVIDER_MODERATOR_ADDED
diff --git a/osf_tests/test_schema_responses.py b/osf_tests/test_schema_responses.py
index c924aebcd17..51db350814f 100644
--- a/osf_tests/test_schema_responses.py
+++ b/osf_tests/test_schema_responses.py
@@ -863,6 +863,7 @@ def test_accept_notification_sent_on_admin_approval(self, revised_response, admi
with capture_notifications() as notifications:
revised_response.approve(user=admin_user)
assert len(notifications) == 1
+ assert notifications[0]['kwargs']['user'] == admin_user
assert notifications[0]['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED
def test_moderators_notified_on_admin_approval(self, revised_response, admin_user, moderator):
From f9d4249d6586f0511327955b340cbb2f0639451b Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Thu, 31 Jul 2025 09:02:16 -0400
Subject: [PATCH 157/336] fix boa
---
addons/boa/tasks.py | 88 +++++++++++-----------------------
addons/boa/tests/test_tasks.py | 44 ++++++++++-------
2 files changed, 54 insertions(+), 78 deletions(-)
diff --git a/addons/boa/tasks.py b/addons/boa/tasks.py
index 4b8753e5b39..c1918aed640 100644
--- a/addons/boa/tasks.py
+++ b/addons/boa/tasks.py
@@ -1,7 +1,9 @@
+import asyncio
from http.client import HTTPException
import logging
import time
+from asgiref.sync import async_to_sync, sync_to_async
from boaapi.boa_client import BoaClient, BoaException
from boaapi.status import CompilerStatus, ExecutionStatus
from urllib import request
@@ -35,34 +37,14 @@ def submit_to_boa(host, username, password, user_guid, project_guid,
* Running asyncio in celery is tricky. Refer to the discussion below for details:
* https://stackoverflow.com/questions/39815771/how-to-combine-celery-with-asyncio
"""
- return _submit_to_boa(
- host,
- username,
- password,
- user_guid,
- project_guid,
- query_dataset,
- query_file_name,
- file_size,
- file_full_path,
- query_download_url,
- output_upload_url
- )
+ return async_to_sync(submit_to_boa_async)(host, username, password, user_guid, project_guid,
+ query_dataset, query_file_name, file_size, file_full_path,
+ query_download_url, output_upload_url)
-def _submit_to_boa(
- host,
- username,
- password,
- user_guid,
- project_guid,
- query_dataset,
- query_file_name,
- file_size,
- file_full_path,
- query_download_url,
- output_upload_url
-):
+async def submit_to_boa_async(host, username, password, user_guid, project_guid,
+ query_dataset, query_file_name, file_size, file_full_path,
+ query_download_url, output_upload_url):
"""
Download Boa query file, submit it to Boa API, wait for Boa to finish the job
and upload the result output to OSF. Send success / failure email notifications.
@@ -72,24 +54,21 @@ def _submit_to_boa(
* See notes in ``submit_to_boa()`` for details.
"""
- user = OSFUser.objects.get(guids___id=user_guid)
- cookie_value = user.get_or_create_cookie().decode()
+ logger.debug('>>>>>>>> Task begins')
+ user = await sync_to_async(OSFUser.objects.get)(guids___id=user_guid)
+ cookie_value = (await sync_to_async(user.get_or_create_cookie)()).decode()
project_url = f'{osf_settings.DOMAIN}{project_guid}/'
- output_file_name = query_file_name.replace(
- '.boa',
- boa_settings.OUTPUT_FILE_SUFFIX
- )
+ output_file_name = query_file_name.replace('.boa', boa_settings.OUTPUT_FILE_SUFFIX)
if file_size > boa_settings.MAX_SUBMISSION_SIZE:
message = f'Boa query file too large to submit: user=[{user_guid}], project=[{project_guid}], ' \
f'file_name=[{query_file_name}], file_size=[{file_size}], ' \
f'full_path=[{file_full_path}], url=[{query_download_url}] ...'
- handle_boa_error(
+ await sync_to_async(handle_boa_error)(
message,
BoaErrorCode.FILE_TOO_LARGE_ERROR,
user,
- project_url,
- file_full_path,
+ project_url, file_full_path,
query_file_name=query_file_name,
file_size=file_size
)
@@ -104,7 +83,7 @@ def _submit_to_boa(
except (ValueError, HTTPError, URLError, HTTPException):
message = f'Failed to download Boa query file: user=[{user_guid}], project=[{project_guid}], ' \
f'file_name=[{query_file_name}], full_path=[{file_full_path}], url=[{query_download_url}] ...'
- handle_boa_error(
+ await sync_to_async(handle_boa_error)(
message,
BoaErrorCode.UNKNOWN,
user,
@@ -124,7 +103,7 @@ def _submit_to_boa(
except BoaException:
# Don't call `client.close()`, since it will fail with `BoaException` if `client.login()` fails
message = f'Boa login failed: boa_username=[{username}], boa_host=[{host}]!'
- handle_boa_error(
+ await sync_to_async(handle_boa_error)(
message,
BoaErrorCode.AUTHN_ERROR,
user,
@@ -141,7 +120,7 @@ def _submit_to_boa(
except BoaException:
client.close()
message = f'Failed to retrieve or verify the target Boa dataset: dataset=[{query_dataset}]!'
- handle_boa_error(
+ await sync_to_async(handle_boa_error)(
message,
BoaErrorCode.UNKNOWN,
user,
@@ -159,13 +138,12 @@ def _submit_to_boa(
except BoaException:
client.close()
message = f'Failed to submit the query to Boa API: boa_host=[{host}], dataset=[{query_dataset}]!'
- handle_boa_error(
+ await sync_to_async(handle_boa_error)(
message,
BoaErrorCode.UNKNOWN,
user,
project_url,
- file_full_path,
- query_file_name=query_file_name
+ file_full_path, query_file_name=query_file_name
)
return BoaErrorCode.UNKNOWN
logger.info('Query successfully submitted.')
@@ -174,9 +152,8 @@ def _submit_to_boa(
if time.time() - start_time > boa_settings.MAX_JOB_WAITING_TIME:
client.close()
message = f'Boa job did not complete in time: job_id=[{str(boa_job.id)}]!'
- handle_boa_error(
- message,
- BoaErrorCode.JOB_TIME_OUT_ERROR,
+ await sync_to_async(handle_boa_error)(
+ message, BoaErrorCode.JOB_TIME_OUT_ERROR,
user,
project_url,
file_full_path,
@@ -186,11 +163,11 @@ def _submit_to_boa(
return BoaErrorCode.JOB_TIME_OUT_ERROR
logger.debug(f'Boa job still running, waiting 10s: job_id=[{str(boa_job.id)}] ...')
boa_job.refresh()
- time.sleep(boa_settings.REFRESH_JOB_INTERVAL)
+ await asyncio.sleep(boa_settings.REFRESH_JOB_INTERVAL)
if boa_job.compiler_status is CompilerStatus.ERROR:
client.close()
message = f'Boa job failed with compile error: job_id=[{str(boa_job.id)}]!'
- handle_boa_error(
+ await sync_to_async(handle_boa_error)(
message,
BoaErrorCode.QUERY_ERROR,
user,
@@ -203,7 +180,7 @@ def _submit_to_boa(
elif boa_job.exec_status is ExecutionStatus.ERROR:
client.close()
message = f'Boa job failed with execution error: job_id=[{str(boa_job.id)}]!'
- handle_boa_error(
+ await sync_to_async(handle_boa_error)(
message,
BoaErrorCode.QUERY_ERROR,
user,
@@ -219,7 +196,7 @@ def _submit_to_boa(
except BoaException:
client.close()
message = f'Boa job output is not available: job_id=[{str(boa_job.id)}]!'
- handle_boa_error(
+ await sync_to_async(handle_boa_error)(
message,
BoaErrorCode.OUTPUT_ERROR,
user,
@@ -250,7 +227,7 @@ def _submit_to_boa(
message += f', http_error=[{e.code}: {e.reason}]'
if e.code == 409:
error_code = BoaErrorCode.UPLOAD_ERROR_CONFLICT
- handle_boa_error(
+ await sync_to_async(handle_boa_error)(
message,
error_code,
user,
@@ -283,17 +260,8 @@ def _submit_to_boa(
return BoaErrorCode.NO_ERROR
-def handle_boa_error(
- message,
- code,
- user,
- project_url,
- query_file_full_path,
- query_file_name=None,
- file_size=None,
- output_file_name=None,
- job_id=None
-):
+def handle_boa_error(message, code, user, project_url, query_file_full_path,
+ query_file_name=None, file_size=None, output_file_name=None, job_id=None):
"""Handle Boa and WB API errors and send emails.
"""
logger.error(message)
diff --git a/addons/boa/tests/test_tasks.py b/addons/boa/tests/test_tasks.py
index 1580205048e..c1d2a410679 100644
--- a/addons/boa/tests/test_tasks.py
+++ b/addons/boa/tests/test_tasks.py
@@ -29,8 +29,7 @@ class TestBoaErrorHandling(OsfTestCase):
def setUp(self):
super().setUp()
self.error_message = 'fake-error-message'
- self.user_username = 'fake-user-username'
- self.user_fullname = 'fake-user-fullname'
+ self.user = AuthUserFactory()
self.project_url = 'http://localhost:5000/1a2b3'
self.query_file_name = 'fake_boa_script.boa'
self.file_size = 255
@@ -57,7 +56,7 @@ def test_handle_boa_error(self):
return_value = handle_boa_error(
self.error_message,
BoaErrorCode.UNKNOWN,
- self.user_username,
+ self.user,
self.project_url,
self.file_full_path,
self.query_file_name,
@@ -88,21 +87,30 @@ def setUp(self):
self.query_download_url = f'http://localhost:7777/v1/resources/{self.project_guid}/providers/osfstorage/1a2b3c4d'
self.output_upload_url = f'http://localhost:7777/v1/resources/{self.project_guid}/providers/osfstorage/?kind=file'
+ def tearDown(self):
+ super().tearDown()
+
def test_submit_to_boa_async_called(self):
- return_value = submit_to_boa(
- self.host,
- self.username,
- self.password,
- self.user_guid,
- self.project_guid,
- self.query_dataset,
- self.query_file_name,
- self.file_size,
- self.file_full_path,
- self.query_download_url,
- self.output_upload_url
- )
- assert return_value == BoaErrorCode.NO_ERROR
+ with mock.patch(
+ 'addons.boa.tasks.submit_to_boa_async',
+ new_callable=AsyncMock,
+ return_value=BoaErrorCode.NO_ERROR
+ ) as mock_submit_to_boa_async:
+ return_value = submit_to_boa(
+ self.host,
+ self.username,
+ self.password,
+ self.user_guid,
+ self.project_guid,
+ self.query_dataset,
+ self.query_file_name,
+ self.file_size,
+ self.file_full_path,
+ self.query_download_url,
+ self.output_upload_url
+ )
+ assert return_value == BoaErrorCode.NO_ERROR
+ mock_submit_to_boa_async.assert_called()
@pytest.mark.django_db
@@ -150,7 +158,7 @@ async def test_submit_success(self):
mock.patch('asyncio.sleep', new_callable=AsyncMock, return_value=None) as mock_async_sleep, \
mock.patch('addons.boa.tasks.handle_boa_error', return_value=None) as mock_handle_boa_error:
with capture_notifications() as notifications:
- return_value = submit_to_boa(
+                return_value = await submit_to_boa_async(
self.host,
self.username,
self.password,
From 4c2d14562fc8f240e8e875596ff8d7e73ab21069 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Thu, 31 Jul 2025 09:15:59 -0400
Subject: [PATCH 158/336] fix mock user
---
api/crossref/views.py | 6 ++++--
1 file changed, 4 insertions(+), 2 deletions(-)
diff --git a/api/crossref/views.py b/api/crossref/views.py
index d93d5b43ef2..17bba7a3281 100644
--- a/api/crossref/views.py
+++ b/api/crossref/views.py
@@ -78,8 +78,10 @@ def post(self, request):
if unexpected_errors:
email_error_text = request.POST['body-plain']
batch_id = crossref_email_content.find('batch_id').text
- NotificationType.objects.get(name=NotificationType.Type.DESK_OSF_SUPPORT_EMAIL).emit(
- user=type('staff', (), {'username': settings.OSF_SUPPORT_EMAIL}),
+ NotificationType.objects.get(
+ name=NotificationType.Type.DESK_OSF_SUPPORT_EMAIL,
+ ).emit(
+ destination_address=settings.OSF_SUPPORT_EMAIL,
event_context={
'batch_id': batch_id,
'email_content': request.POST['body-plain'],
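Following the pattern in the hunk above, a desk notification aimed at a bare email address (rather than an OSFUser) goes straight through the NotificationType model. A sketch, assuming the emit() keywords used throughout this series and a placeholder event context:

    from osf.models import NotificationType
    from website import settings

    # Sketch only: destination_address and event_context mirror the call sites
    # in this patch series; the batch_id and body values are placeholders.
    NotificationType.objects.get(
        name=NotificationType.Type.DESK_OSF_SUPPORT_EMAIL,
    ).emit(
        destination_address=settings.OSF_SUPPORT_EMAIL,
        event_context={
            'batch_id': 'example-batch-id',
            'email_content': 'raw CrossRef error body',
        },
    )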
From 8d9a26fdcd3f3671bb26db1a86ce98deaa1d64d0 Mon Sep 17 00:00:00 2001
From: ihorsokhanexoft
Date: Thu, 31 Jul 2025 16:31:27 +0300
Subject: [PATCH 159/336] [ENG-8401] Earlier preprint versions download the
current file (#11245)
* fixed earlier preprint versions downloading the newest version's file
* fixed tests
---
addons/base/views.py | 11 +++++++----
1 file changed, 7 insertions(+), 4 deletions(-)
diff --git a/addons/base/views.py b/addons/base/views.py
index 2c61fdda232..2eee0ae0dd6 100644
--- a/addons/base/views.py
+++ b/addons/base/views.py
@@ -1006,14 +1006,17 @@ def persistent_file_download(auth, **kwargs):
file = BaseFileNode.active.filter(_id=id_or_guid).first()
if not file:
guid = Guid.load(id_or_guid)
- if guid:
- referent = guid.referent
- file = referent.primary_file if type(referent) is Preprint else referent
- else:
+ if not guid:
raise HTTPError(http_status.HTTP_404_NOT_FOUND, data={
'message_short': 'File Not Found',
'message_long': 'The requested file could not be found.'
})
+
+ file = guid.referent
+ if type(file) is Preprint:
+ referent, _ = Guid.load_referent(id_or_guid)
+ file = referent.primary_file
+
if not file.is_file:
raise HTTPError(http_status.HTTP_400_BAD_REQUEST, data={
'message_long': 'Downloading folders is not permitted.'
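A toy sketch of the resolution order the patch establishes, using plain classes rather than OSF models: resolve the GUID, 404 if it does not exist, and for a preprint GUID re-resolve with a version-aware lookup so an earlier version serves its own primary file. The assumption, suggested by the hunk, is that Guid.load_referent returns a (referent, version) pair for version-suffixed identifiers.

class File:
    def __init__(self, name):
        self.name, self.is_file = name, True

class Preprint:
    def __init__(self, versions):                        # {'v1': File, 'v2': File, ...}
        self.versions = versions
        self.primary_file = list(versions.values())[-1]  # newest wins by default

GUIDS = {'abc12': Preprint({'v1': File('old.pdf'), 'v2': File('new.pdf')})}

def load_referent(id_or_guid):
    # toy version-aware lookup: 'abc12_v1' pins the preprint to version 1
    base, _, version = id_or_guid.partition('_v')
    preprint = GUIDS.get(base)
    if version and isinstance(preprint, Preprint):
        return Preprint({version: preprint.versions[f'v{version}']}), version
    return preprint, None

def resolve_download(id_or_guid):
    referent = GUIDS.get(id_or_guid.split('_v')[0])
    if referent is None:
        raise LookupError('File Not Found')
    if isinstance(referent, Preprint):
        referent, _ = load_referent(id_or_guid)
        return referent.primary_file
    return referent

assert resolve_download('abc12_v1').name == 'old.pdf'    # earlier version keeps its own file
assert resolve_download('abc12').name == 'new.pdf'       # bare GUID still gets the newest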
From 7ccc8fb035c96885e2c6ef53ed36485fcae624d2 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Thu, 31 Jul 2025 10:10:21 -0400
Subject: [PATCH 160/336] fix schema response tests
---
api/providers/tasks.py | 2 +-
osf/email/__init__.py | 1 +
osf/models/schema_response.py | 16 ++++----
osf/utils/notifications.py | 4 +-
osf_tests/test_archiver.py | 14 ++++---
osf_tests/test_schema_responses.py | 22 +++++++----
website/notifications/listeners.py | 62 ++++++++++++++++++++++++++++++
website/reviews/listeners.py | 56 +++++++++++++++++++++++----
8 files changed, 144 insertions(+), 33 deletions(-)
diff --git a/api/providers/tasks.py b/api/providers/tasks.py
index 5891494cfb2..9896447bc69 100644
--- a/api/providers/tasks.py
+++ b/api/providers/tasks.py
@@ -705,7 +705,7 @@ def inform_product_of_errors(initiator=None, provider=None, message=None):
NotificationType.objects.get(
name=NotificationType.Type.DESK_REGISTRATION_BULK_UPLOAD_PRODUCT_OWNER,
).emit(
- user=object('mockuser', (), {'username': email}),
+ destination_address=email,
event_context={
'user': user_info,
'provider_name': provider_name,
diff --git a/osf/email/__init__.py b/osf/email/__init__.py
index 5b2dae93a04..39819741cb2 100644
--- a/osf/email/__init__.py
+++ b/osf/email/__init__.py
@@ -61,6 +61,7 @@ def send_email_with_send_grid(to_addr, notification_type, context, email_context
to_addr (str): The recipient's email address.
notification_type (str): The subject of the notification.
context (dict): The email content context.
+        email_context (dict): Optional sending context, such as header changes for BCC or reply-to.
"""
if not settings.SENDGRID_API_KEY:
raise NotImplementedError('SENDGRID_API_KEY is required for sendgrid notifications.')
diff --git a/osf/models/schema_response.py b/osf/models/schema_response.py
index 3c4f65155fb..b51256c3ee8 100644
--- a/osf/models/schema_response.py
+++ b/osf/models/schema_response.py
@@ -22,13 +22,6 @@
from website.settings import DOMAIN
-EMAIL_TEMPLATES_PER_EVENT = {
- 'create': NotificationType.Type.NODE_SCHEMA_RESPONSE_INITIATED,
- 'submit': NotificationType.Type.NODE_SCHEMA_RESPONSE_SUBMITTED,
- 'accept': NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED,
- 'reject': NotificationType.Type.NODE_SCHEMA_RESPONSE_REJECTED,
-}
-
class SchemaResponse(ObjectIDMixin, BaseModel):
'''Collects responses for a schema associated with a parent object.
@@ -483,10 +476,15 @@ def _notify_users(self, event, event_initiator):
reviews_email_submit_moderators_notifications.send(
timestamp=timezone.now(),
context=email_context,
- user=self.initiator
+ resource=self.parent
)
- template = EMAIL_TEMPLATES_PER_EVENT.get(event)
+ template = {
+ 'create': NotificationType.Type.NODE_SCHEMA_RESPONSE_INITIATED,
+ 'submit': NotificationType.Type.NODE_SCHEMA_RESPONSE_SUBMITTED,
+ 'accept': NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED,
+ 'reject': NotificationType.Type.NODE_SCHEMA_RESPONSE_REJECTED,
+ }.get(event)
if not template:
return
diff --git a/osf/utils/notifications.py b/osf/utils/notifications.py
index 76d6e255668..ee95a3cb811 100644
--- a/osf/utils/notifications.py
+++ b/osf/utils/notifications.py
@@ -9,11 +9,13 @@ def get_email_template_context(resource):
is_preprint = resource.provider.type == 'osf.preprintprovider'
url_segment = 'preprints' if is_preprint else 'registries'
document_type = resource.provider.preprint_word if is_preprint else 'registration'
+ from website.profile.utils import get_profile_image_url
base_context = {
'domain': DOMAIN,
'reviewable_title': resource.title,
'reviewable_absolute_url': resource.absolute_url,
+ 'profile_image_url': get_profile_image_url(resource.creator),
'reviewable_provider_name': resource.provider.name,
'workflow': resource.provider.reviews_workflow,
'provider_url': resource.provider.domain or f'{DOMAIN}{url_segment}/{resource.provider._id}',
@@ -48,7 +50,6 @@ def notify_submit(resource, user, *args, **kwargs):
timestamp=timezone.now(),
context=context,
resource=resource,
- user=user,
)
@@ -67,7 +68,6 @@ def notify_resubmit(resource, user, *args, **kwargs):
timestamp=timezone.now(),
context=context,
resource=resource,
- user=user
)
diff --git a/osf_tests/test_archiver.py b/osf_tests/test_archiver.py
index 34394e9a39c..282c0c99ddd 100644
--- a/osf_tests/test_archiver.py
+++ b/osf_tests/test_archiver.py
@@ -1209,11 +1209,13 @@ def test_archiver_uncaught_error_mail_renders():
src = factories.ProjectFactory()
user = src.creator
job = factories.ArchiveJobFactory()
- mail = mails.ARCHIVE_UNCAUGHT_ERROR_DESK
- assert mail.html(
+ notification_type = NotificationType.Type.DESK_ARCHIVE_JOB_UNCAUGHT_ERROR.instance
+ assert notification_type.emit(
user=user,
- src=src,
- results=job.target_addons.all(),
- url=settings.INTERNAL_DOMAIN + src._id,
- can_change_preferences=False,
+ event_context=dict(
+ src=str(src),
+ results=list(job.target_addons.all()),
+ url=settings.INTERNAL_DOMAIN + src._id,
+ can_change_preferences=False,
+ )
)
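A hedged aside on the src=str(src) and results=list(...) coercions above: a plausible reading is that event_context should carry plain, easily rendered values rather than lazy ORM objects. The toy check below only illustrates that idea and is not the OSF emit implementation.

import json

def emit(event_context):
    # toy guard: anything that cannot be dumped to JSON would raise TypeError here
    json.dumps(event_context)
    return True

class FakeQuerySet:
    """Stands in for job.target_addons.all(); not JSON-friendly until listed."""
    def __iter__(self):
        return iter(['osfstorage', 'github'])

assert emit({
    'src': 'abc12',                      # str(src) rather than the model instance
    'results': list(FakeQuerySet()),     # list(...) rather than the queryset
    'url': 'https://osf.example/abc12',
    'can_change_preferences': False,
})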
diff --git a/osf_tests/test_schema_responses.py b/osf_tests/test_schema_responses.py
index 51db350814f..e138c8e01e4 100644
--- a/osf_tests/test_schema_responses.py
+++ b/osf_tests/test_schema_responses.py
@@ -855,16 +855,20 @@ def test_schema_response_action_to_state_following_moderated_approve_is_pending_
assert new_action.to_state == ApprovalStates.PENDING_MODERATION.db_name
assert new_action.trigger == SchemaResponseTriggers.APPROVE.db_name
- def test_accept_notification_sent_on_admin_approval(self, revised_response, admin_user):
+ def test_accept_notification_sent_on_admin_approval(self, revised_response, admin_user, moderator):
revised_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
revised_response.save()
revised_response.pending_approvers.add(admin_user)
with capture_notifications() as notifications:
revised_response.approve(user=admin_user)
- assert len(notifications) == 1
- assert notifications[0]['kwargs']['user'] == admin_user
- assert notifications[0]['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED
+ assert len(notifications) == 3
+ assert notifications[0]['kwargs']['user'] == moderator
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
+ assert notifications[1]['kwargs']['user'] == moderator
+ assert notifications[1]['type'] == NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
+ assert notifications[2]['kwargs']['user'] == admin_user
+ assert notifications[2]['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED
def test_moderators_notified_on_admin_approval(self, revised_response, admin_user, moderator):
revised_response.approvals_state_machine.set_state(ApprovalStates.UNAPPROVED)
@@ -873,9 +877,13 @@ def test_moderators_notified_on_admin_approval(self, revised_response, admin_use
with capture_notifications() as notifications:
revised_response.approve(user=admin_user)
- assert len(notifications) == 1
- assert notifications[0]['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED
+ assert len(notifications) == 3
assert notifications[0]['kwargs']['user'] == moderator
+ assert notifications[0]['type'] == NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
+ assert notifications[1]['kwargs']['user'] == moderator
+ assert notifications[1]['type'] == NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
+ assert notifications[2]['kwargs']['user'] == admin_user
+ assert notifications[2]['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED
def test_no_moderator_notification_on_admin_approval_of_initial_response(
self, initial_response, admin_user):
@@ -915,7 +923,7 @@ def test_moderator_accept_notification(
with capture_notifications() as notifications:
revised_response.accept(user=moderator)
assert len(notifications) == 3
- assert all(notification['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_INITIATED
+ assert all(notification['type'] == NotificationType.Type.NODE_SCHEMA_RESPONSE_APPROVED
for notification in notifications)
def test_no_moderator_accept_notification_on_initial_response(
diff --git a/website/notifications/listeners.py b/website/notifications/listeners.py
index 871d6d56792..ceae7ba6e10 100644
--- a/website/notifications/listeners.py
+++ b/website/notifications/listeners.py
@@ -8,6 +8,7 @@
from framework.auth.signals import user_confirmed
from website.project.signals import privacy_set_public
from website import settings
+from website.reviews import signals as reviews_signals
logger = logging.getLogger(__name__)
@@ -76,3 +77,64 @@ def queue_first_public_project_email(user, node):
'osf_url': settings.DOMAIN,
}
)
+
+@reviews_signals.reviews_email_submit_moderators_notifications.connect
+def reviews_submit_notification_moderators(self, timestamp, context, resource):
+ """
+ Handle email notifications to notify moderators of new submissions or resubmission.
+ """
+ # imports moved here to avoid AppRegistryNotReady error
+ from django.contrib.contenttypes.models import ContentType
+ from osf.models import NotificationType, NotificationSubscription
+ from website.settings import DOMAIN
+
+ provider = resource.provider
+
+ # Set submission url
+ if provider.type == 'osf.preprintprovider':
+ context['reviews_submission_url'] = (
+ f'{DOMAIN}reviews/preprints/{provider._id}/{resource._id}'
+ )
+ elif provider.type == 'osf.registrationprovider':
+ context['reviews_submission_url'] = f'{DOMAIN}{resource._id}?mode=moderator'
+ else:
+ raise NotImplementedError(f'unsupported provider type {provider.type}')
+
+ # Set message
+ revision_id = context.get('revision_id')
+ if revision_id:
+ context['message'] = f'submitted updates to "{resource.title}".'
+ context['reviews_submission_url'] += f'&revisionId={revision_id}'
+ else:
+ if context.get('resubmission'):
+ context['message'] = f'resubmitted "{resource.title}".'
+ else:
+ context['message'] = f'submitted "{resource.title}".'
+ provider_subscription, created = NotificationSubscription.objects.get_or_create(
+ notification_type__name=NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS,
+ object_id=provider.id,
+ content_type=ContentType.objects.get_for_model(provider.__class__),
+ )
+ for recipient in provider_subscription.subscribed_object.get_group('moderator').user_set.all():
+ NotificationType.objects.get(
+ name=NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
+ ).emit(
+ user=recipient,
+ event_context=context
+ )
+
+# Handle email notifications to notify moderators of new submissions.
+@reviews_signals.reviews_withdraw_requests_notification_moderators.connect
+def reviews_withdraw_requests_notification_moderators(self, timestamp, context, user, resource):
+ from osf.models import NotificationType
+ from website.settings import DOMAIN
+
+ provider = resource.provider
+ # Set message
+ context['message'] = f'has requested withdrawal of "{resource.title}".'
+ # Set submission url
+ context['reviews_submission_url'] = f'{DOMAIN}reviews/registries/{provider._id}/{resource._id}'
+ NotificationType.objects.get(
+ name=NotificationType.Type.PROVIDER_NEW_PENDING_WITHDRAW_REQUESTS
+ ).emit(
+ user=user,
+ event_context=context
+ )
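A hedged sketch, with toy objects instead of Django models or blinker signals, of the fan-out the submit listener above implements: build the moderator-facing submission URL per provider type, set the message, then emit one PROVIDER_NEW_PENDING_SUBMISSIONS notification per moderator.

from types import SimpleNamespace

def notify_moderators(provider, resource, context, emit):
    # pick the moderation URL by provider type, mirroring the branch above
    if provider.type == 'osf.preprintprovider':
        context['reviews_submission_url'] = (
            f'https://osf.example/reviews/preprints/{provider.id}/{resource.id}'
        )
    elif provider.type == 'osf.registrationprovider':
        context['reviews_submission_url'] = f'https://osf.example/{resource.id}?mode=moderator'
    else:
        raise NotImplementedError(f'unsupported provider type {provider.type}')

    if context.get('revision_id'):
        context['message'] = f'submitted updates to "{resource.title}".'
        context['reviews_submission_url'] += f"&revisionId={context['revision_id']}"
    else:
        verb = 'resubmitted' if context.get('resubmission') else 'submitted'
        context['message'] = f'{verb} "{resource.title}".'

    for moderator in provider.moderators:   # stands in for the subscription's moderator user_set
        emit(user=moderator, event_context=dict(context))

sent = []
provider = SimpleNamespace(type='osf.preprintprovider', id='prov1', moderators=['mod-a', 'mod-b'])
resource = SimpleNamespace(id='xyz99', title='My Preprint')
notify_moderators(provider, resource, {'resubmission': True}, emit=lambda **kw: sent.append(kw))
assert len(sent) == 2 and sent[0]['event_context']['message'] == 'resubmitted "My Preprint".'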
diff --git a/website/reviews/listeners.py b/website/reviews/listeners.py
index be4b3ff7c82..d208bdb099a 100644
--- a/website/reviews/listeners.py
+++ b/website/reviews/listeners.py
@@ -1,8 +1,9 @@
+from django.contrib.contenttypes.models import ContentType
+
from osf.models import NotificationType
from website.settings import DOMAIN, OSF_PREPRINTS_LOGO, OSF_REGISTRIES_LOGO
from website.reviews import signals as reviews_signals
-
@reviews_signals.reviews_withdraw_requests_notification_moderators.connect
def reviews_withdraw_requests_notification_moderators(self, timestamp, context, user, resource):
context['referrer_fullname'] = user.fullname
@@ -15,10 +16,8 @@ def reviews_withdraw_requests_notification_moderators(self, timestamp, context,
object_id=provider.id,
content_type=ContentType.objects.get_for_model(provider.__class__),
)
- from website.profile.utils import get_profile_image_url
context['message'] = f'has requested withdrawal of "{resource.title}".'
- context['profile_image_url'] = get_profile_image_url(user)
context['reviews_submission_url'] = f'{DOMAIN}reviews/registries/{provider._id}/{resource._id}'
for recipient in provider_subscription.subscribed_object.get_group('moderator').user_set.all():
@@ -29,7 +28,6 @@ def reviews_withdraw_requests_notification_moderators(self, timestamp, context,
event_context=context,
)
-
@reviews_signals.reviews_email_withdrawal_requests.connect
def reviews_withdrawal_requests_notification(self, timestamp, context):
preprint = context['reviewable']
@@ -42,13 +40,55 @@ def reviews_withdrawal_requests_notification(self, timestamp, context):
object_id=preprint.provider.id,
content_type=ContentType.objects.get_for_model(preprint.provider.__class__),
)
- from website.profile.utils import get_profile_image_url
-
context['message'] = f'has requested withdrawal of the {preprint_word} "{preprint.title}".'
- context['profile_image_url'] = get_profile_image_url(context['requester'])
context['reviews_submission_url'] = f'{DOMAIN}reviews/preprints/{preprint.provider._id}/{preprint._id}'
- for recipient in provider_subscription.preorint.contributors.all():
+ for recipient in provider_subscription.subscribed_object.get_group('moderator').user_set.all():
+ NotificationType.objects.get(
+ name=NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
+ ).emit(
+ user=recipient,
+ event_context=context,
+ )
+
+@reviews_signals.reviews_email_submit_moderators_notifications.connect
+def reviews_submit_notification_moderators(self, timestamp, resource, context):
+ """
+ Handle email notifications to notify moderators of new submissions or resubmission.
+ """
+ # imports moved here to avoid AppRegistryNotReady error
+ from osf.models import NotificationSubscription
+
+ provider = resource.provider
+
+ # Set submission url
+ if provider.type == 'osf.preprintprovider':
+ context['reviews_submission_url'] = (
+ f'{DOMAIN}reviews/preprints/{provider._id}/{resource._id}'
+ )
+ elif provider.type == 'osf.registrationprovider':
+ context['reviews_submission_url'] = f'{DOMAIN}{resource._id}?mode=moderator'
+ else:
+ raise NotImplementedError(f'unsupported provider type {provider.type}')
+
+ # Set message
+ revision_id = context.get('revision_id')
+ if revision_id:
+ context['message'] = f'submitted updates to "{resource.title}".'
+ context['reviews_submission_url'] += f'&revisionId={revision_id}'
+ else:
+ if context.get('resubmission'):
+ context['message'] = f'resubmitted "{resource.title}".'
+ else:
+ context['message'] = f'submitted "{resource.title}".'
+
+ # Get NotificationSubscription instance, which contains reference to all subscribers
+ provider_subscription, created = NotificationSubscription.objects.get_or_create(
+ notification_type__name=NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS,
+ object_id=provider.id,
+ content_type=ContentType.objects.get_for_model(provider.__class__),
+ )
+ for recipient in provider_subscription.subscribed_object.get_group('moderator').user_set.all():
NotificationType.objects.get(
name=NotificationType.Type.PROVIDER_NEW_PENDING_SUBMISSIONS
).emit(
From c41802164d235b1bcc0f0f32581d6d48f3aa765a Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Thu, 31 Jul 2025 10:50:28 -0400
Subject: [PATCH 161/336] fix insti reporter tests
---
osf/models/preprint.py | 1 +
.../reporters/test_institutional_summary_reporter.py | 2 ++
osf_tests/test_node.py | 6 ------
3 files changed, 3 insertions(+), 6 deletions(-)
diff --git a/osf/models/preprint.py b/osf/models/preprint.py
index a2415643e2a..870099f0623 100644
--- a/osf/models/preprint.py
+++ b/osf/models/preprint.py
@@ -1052,6 +1052,7 @@ def _send_preprint_confirmation(self, auth):
NotificationType.objects.get(
name=NotificationType.Type.PROVIDER_REVIEWS_SUBMISSION_CONFIRMATION
).emit(
+ subscribed_object=self.provider,
user=recipient,
event_context=context,
)
diff --git a/osf_tests/metrics/reporters/test_institutional_summary_reporter.py b/osf_tests/metrics/reporters/test_institutional_summary_reporter.py
index 05baa4d38e7..b03e4de1161 100644
--- a/osf_tests/metrics/reporters/test_institutional_summary_reporter.py
+++ b/osf_tests/metrics/reporters/test_institutional_summary_reporter.py
@@ -12,12 +12,14 @@
AuthUserFactory,
)
from ._testutils import list_monthly_reports
+from osf.management.commands.populate_notification_types import populate_notification_types
class TestInstiSummaryMonthlyReporter(TestCase):
@classmethod
def setUpTestData(cls):
+ populate_notification_types()
cls._yearmonth = YearMonth(2018, 2) # February 2018
cls._institution = InstitutionFactory()
cls._now = datetime.datetime(2018, 2, 4, tzinfo=datetime.UTC)
diff --git a/osf_tests/test_node.py b/osf_tests/test_node.py
index e6a34c31050..6348c87a144 100644
--- a/osf_tests/test_node.py
+++ b/osf_tests/test_node.py
@@ -2140,12 +2140,6 @@ def test_set_privacy_sends_mail(self, node, auth):
assert len(notifications) == 1
assert notifications[0]['type'] == NotificationType.Type.USER_NEW_PUBLIC_PROJECT
- def test_set_privacy_skips_mail_if_meeting(self, node, auth):
- with capture_notifications() as notifications:
- node.set_privacy('private', auth=auth)
- node.set_privacy('public', auth=auth, meeting_creation=True)
- assert not notifications
-
def test_set_privacy_can_not_cancel_pending_embargo_for_registration(self, node, user, auth):
registration = RegistrationFactory(project=node)
registration.embargo_registration(
From 57ba198f4739d1efcf9d0d0dc96a52347d964188 Mon Sep 17 00:00:00 2001
From: John Tordoff
Date: Thu, 31 Jul 2025 12:09:15 -0400
Subject: [PATCH 162/336] clean resource contributors create method
---
api/nodes/serializers.py | 50 ++++++++++++++++------------------------
1 file changed, 20 insertions(+), 30 deletions(-)
diff --git a/api/nodes/serializers.py b/api/nodes/serializers.py
index c52787f2a6c..2fbcc9f4160 100644
--- a/api/nodes/serializers.py
+++ b/api/nodes/serializers.py
@@ -1237,44 +1237,35 @@ def validate_data(self, resource, user_id=None, full_name=None, email=None, inde
raise exceptions.ValidationError(detail=f'{index} is not a valid contributor index for node with id {resource._id}')
def create(self, validated_data):
- id = validated_data.get('_id')
- email = validated_data.get('user', {}).get('email', None)
- index = None
- if '_order' in validated_data:
- index = validated_data.pop('_order')
+ user_id = validated_data.get('_id')
+ email = validated_data.get('user', {}).get('email')
+ index = validated_data.pop('_order', None)
resource = self.context['resource']
auth = Auth(self.context['request'].user)
full_name = validated_data.get('full_name')
bibliographic = validated_data.get('bibliographic')
- email_preference = self.context['request'].GET.get('send_email') or self.context['default_email']
+ email_pref = self.context['request'].GET.get('send_email') or self.context['default_email']
permissions = self.get_proposed_permissions(validated_data)
- self.validate_data(
- resource,
- user_id=id,
- full_name=full_name,
- email=email,
- index=index,
- )
- if email_preference not in self.email_preferences:
- raise exceptions.ValidationError(detail=f'{email_preference} is not a valid email preference.')
-
- contributor = OSFUser.load(id)
- if email or (contributor and contributor.is_registered):
- is_published = getattr(resource, 'is_published', False)
- notification_type = {
- 'false': False,
- 'default': NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
- 'draft_registration': NotificationType.Type.DRAFT_REGISTRATION_CONTRIBUTOR_ADDED_DEFAULT,
- 'preprint': NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT if is_published else False,
- }[email_preference]
- else:
- notification_type = False
+ self.validate_data(resource, user_id=user_id, full_name=full_name, email=email, index=index)
+
+ if email_pref not in self.email_preferences:
+ raise exceptions.ValidationError(f'{email_pref} is not a valid email preference.')
+
+ is_published = getattr(resource, 'is_published', False)
+ notification_type = {
+ 'false': False,
+ 'default': NotificationType.Type.NODE_CONTRIBUTOR_ADDED_DEFAULT,
+ 'draft_registration': NotificationType.Type.DRAFT_REGISTRATION_CONTRIBUTOR_ADDED_DEFAULT,
+ 'preprint': NotificationType.Type.PREPRINT_CONTRIBUTOR_ADDED_DEFAULT if is_published else False,
+ }.get(email_pref, False)
+ contributor = OSFUser.load(user_id)
+        notification_type = notification_type if email or (contributor and contributor.is_registered) else False
try:
- contributor_obj = resource.add_contributor_registered_or_not(
+ return resource.add_contributor_registered_or_not(
auth=auth,
- user_id=id,
+ user_id=user_id,
email=email,
full_name=full_name,
notification_type=notification_type,
@@ -1286,7 +1277,6 @@ def create(self, validated_data):
raise exceptions.ValidationError(detail=e.messages[0])
except ValueError as e:
raise exceptions.NotFound(detail=e.args[0])
- return contributor_obj
class NodeContributorDetailSerializer(NodeContributorsSerializer):
"""
From 8745da462fb1dd777b8f63de2cebef9825306d2f Mon Sep 17 00:00:00 2001
From: antkryt
Date: Thu, 31 Jul 2025 20:30:11 +0300
Subject: [PATCH 163/336] [ENG-8462] Institution setup fixes (#11241)
* create monthly reports when an institution is created
* better exception handling; generate reports asynchronously
* fix test
* handle deactivated institutions; minor fixes
---
admin/institutions/urls.py | 1 +
admin/institutions/views.py | 48 +++++++++++++++++++
admin/management/views.py | 1 +
admin/templates/institutions/detail.html | 35 ++++++++++++++
admin/templates/management/commands.html | 2 +-
admin_tests/institutions/test_views.py | 28 +++++++++++
api/institutions/views.py | 11 +++--
.../commands/monthly_reporters_go.py | 7 ++-
osf/metrics/reports.py | 10 ++--
9 files changed, 133 insertions(+), 10 deletions(-)
diff --git a/admin/institutions/urls.py b/admin/institutions/urls.py
index 8d12a9fe36c..6aa5cf7e0df 100644
--- a/admin/institutions/urls.py
+++ b/admin/institutions/urls.py
@@ -9,6 +9,7 @@
re_path(r'^import/$', views.ImportInstitution.as_view(), name='import'),
 re_path(r'^(?P<institution_id>[0-9]+)/$', views.InstitutionDetail.as_view(), name='detail'),
 re_path(r'^(?P<institution_id>[0-9]+)/export/$', views.InstitutionExport.as_view(), name='export'),
+ re_path(r'^(?P<institution_id>[0-9]+)/monthly_report/$', views.InstitutionMonthlyReporterDo.as_view(), name='monthly_report'),
 re_path(r'^(?P<institution_id>[0-9]+)/delete/$', views.DeleteInstitution.as_view(), name='delete'),
 re_path(r'^(?P<institution_id>[0-9]+)/deactivate/$', views.DeactivateInstitution.as_view(), name='deactivate'),
 re_path(r'^(?P<institution_id>[0-9]+)/reactivate/$', views.ReactivateInstitution.as_view(), name='reactivate'),
diff --git a/admin/institutions/views.py b/admin/institutions/views.py
index ad9f0c7571f..a7b76bc9109 100644
--- a/admin/institutions/views.py
+++ b/admin/institutions/views.py
@@ -1,4 +1,5 @@
import json
+from dateutil.parser import isoparse
from django.contrib import messages
from django.contrib.auth.mixins import PermissionRequiredMixin
@@ -15,6 +16,9 @@
from admin.base.forms import ImportFileForm
from admin.institutions.forms import InstitutionForm, InstitutionalMetricsAdminRegisterForm
from osf.models import Institution, Node, OSFUser
+from osf.metrics.utils import YearMonth
+from osf.metrics.reporters import AllMonthlyReporters
+from osf.management.commands.monthly_reporters_go import monthly_reporter_do
class InstitutionList(PermissionRequiredMixin, ListView):
@@ -129,6 +133,38 @@ def get(self, request, *args, **kwargs):
return response
+class InstitutionMonthlyReporterDo(PermissionRequiredMixin, View):
+ permission_required = 'osf.view_institution'
+ raise_exception = True
+
+ def post(self, request, *args, **kwargs):
+ institution_id = self.kwargs.get('institution_id')
+ try:
+ institution = Institution.objects.get_all_institutions().get(id=institution_id)
+ except Institution.DoesNotExist:
+            raise Http404(f'Institution with id {institution_id} was not found or is deactivated.')
+
+ monthly_report_date = request.POST.get('monthly_report_date', None)
+ if monthly_report_date:
+ try:
+ monthly_report_date = isoparse(monthly_report_date).date()
+ except ValueError as exc:
+ messages.error(request, str(exc))
+ return redirect('institutions:detail', institution_id=institution.id)
+ else:
+ messages.error(request, 'Report date cannot be none.')
+ return redirect('institutions:detail', institution_id=institution.id)
+
+ monthly_reporter_do.apply_async(kwargs={
+ 'yearmonth': str(YearMonth.from_date(monthly_report_date)),
+ 'reporter_key': request.POST.get('monthly_reporter', None),
+ 'report_kwargs': {'institution_pk': institution.id},
+ })
+
+        messages.success(request, 'Monthly reporter task successfully queued.')
+ return redirect('institutions:detail', institution_id=institution.id)
+
+
class CreateInstitution(PermissionRequiredMixin, CreateView):
permission_required = 'osf.change_institution'
raise_exception = True
@@ -141,6 +177,18 @@ def get_context_data(self, *args, **kwargs):
kwargs['import_form'] = ImportFileForm()
return super().get_context_data(*args, **kwargs)
+ def form_valid(self, form):
+ response = super().form_valid(form)
+
+ # Make a report after Institution is created
+ monthly_reporter_do.apply_async(kwargs={
+ 'yearmonth': str(YearMonth.from_date(self.object.created)),
+ 'reporter_key': AllMonthlyReporters.INSTITUTIONAL_SUMMARY.name,
+ 'report_kwargs': {'institution_pk': self.object.id},
+ })
+
+ return response
+
class InstitutionNodeList(PermissionRequiredMixin, ListView):
template_name = 'institutions/node_list.html'
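A hedged sketch of the "report on create" hook added above, with a stand-in for monthly_reporter_do.apply_async and a simplified YearMonth string: saving a new institution queues an institutional-summary report for the month it was created in.

import datetime

queued = []

def apply_async(kwargs):
    # stands in for monthly_reporter_do.apply_async; the real call hands off to Celery
    queued.append(kwargs)

def yearmonth(d: datetime.date) -> str:
    return f'{d.year:04d}-{d.month:02d}'

def on_institution_created(institution_pk: int, created: datetime.date):
    apply_async(kwargs={
        'yearmonth': yearmonth(created),
        'reporter_key': 'INSTITUTIONAL_SUMMARY',
        'report_kwargs': {'institution_pk': institution_pk},
    })

on_institution_created(7, datetime.date(2025, 7, 31))
assert queued[0]['yearmonth'] == '2025-07'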
diff --git a/admin/management/views.py b/admin/management/views.py
index 525f0d8d64a..d97e4f4b894 100644
--- a/admin/management/views.py
+++ b/admin/management/views.py
@@ -130,6 +130,7 @@ def post(self, request, *args, **kwargs):
if report_date is not None
else ''
),
+ reporter_key=request.POST.get('monthly_reporter', '')
)
if errors:
diff --git a/admin/templates/institutions/detail.html b/admin/templates/institutions/detail.html
index 2bede6e7d92..47315d8e8d7 100644
--- a/admin/templates/institutions/detail.html
+++ b/admin/templates/institutions/detail.html
@@ -9,9 +9,17 @@
{% endblock title %}
{% block content %}