feat(onboarding): Bulletproof critical back actions with new tests #88532
Conversation
priscilawebdev commented Apr 2, 2025 (edited)
- closes Retro: Add acceptance tests for flow that broke in onboarding #88302
src/sentry/testutils/asserts.py (Outdated)

```python
@@ -45,6 +50,22 @@ def assert_status_code(response, minimum: int, maximum: int | None = None):
    )


def verify_project_deletion(org: Organization, platform: str):
```
This helper is used in two different places, so I aimed to make it reusable. However, I'm unsure if this is the best location for it.
Small nit: all but one function in that file is called assert_*, so to keep things consistent, can we rename this to assert_project_deletion?
Codecov Report: All modified and coverable lines are covered by tests ✅. All tests successful, no failed tests found.

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master   #88532      +/-  ##
==========================================
- Coverage   87.77%   87.76%    -0.02%
==========================================
  Files       10078    10042       -36
  Lines      571566   568962     -2604
  Branches    22372    22235      -137
==========================================
- Hits       501705   499329     -2376
+ Misses      69462    69217      -245
- Partials      399      416       +17
```
```python
        self.load_project_creation_page()

        # Select PHP Laravel platform
```
Can we remove this and the following comments? I don't see the value of having a comment on every line. Thanks!
src/sentry/testutils/asserts.py (Outdated)

```python
    try:
        # we need to check for the status here because we do soft deletions
        Project.objects.get(organization=org, slug=platform, status=ObjectStatus.ACTIVE)
    except ObjectDoesNotExist:
        # If the project doesn't exist anymore, it's deleted
        return  # The project was successfully deleted
```
Do we do soft or hard deletions? Or both?
I noticed that we do soft deletion when deleting a project.
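(For context on the check above: a soft-deleted project still exists in the database, just with a non-ACTIVE status, which is why the helper filters on ObjectStatus.ACTIVE. A minimal sketch of that idea, assuming the import paths match current Sentry; the helper name here is hypothetical:)

```python
from django.core.exceptions import ObjectDoesNotExist

from sentry.constants import ObjectStatus
from sentry.models.project import Project


def is_effectively_deleted(org, slug: str) -> bool:
    """Treat a project as deleted when it is gone (hard delete) or
    no longer ACTIVE (soft delete / pending deletion)."""
    try:
        project = Project.objects.get(organization=org, slug=slug)
    except ObjectDoesNotExist:
        return True  # hard deleted: the row is gone
    return project.status != ObjectStatus.ACTIVE  # soft deleted
```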
src/sentry/testutils/asserts.py (Outdated)

```python
    # Poll the database to check if the project was deleted;
    # the timeout is set to 10 seconds
    start_time = time.time()
    while time.time() - start_time < 10:
```
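(The quoted diff cuts off mid-loop; for reference, a complete version of such a poll would also need a sleep between queries and an explicit failure after the timeout. A sketch under those assumptions; the constant names and the final AssertionError are mine, not from the PR:)

```python
import time

from django.core.exceptions import ObjectDoesNotExist

from sentry.constants import ObjectStatus
from sentry.models.project import Project

POLL_TIMEOUT = 10  # seconds before giving up
POLL_INTERVAL = 0.5  # seconds between database checks


def wait_for_project_deletion(org, platform: str) -> None:
    """Poll until no ACTIVE project with this slug remains; fail after the timeout."""
    start_time = time.time()
    while time.time() - start_time < POLL_TIMEOUT:
        try:
            Project.objects.get(
                organization=org, slug=platform, status=ObjectStatus.ACTIVE
            )
        except ObjectDoesNotExist:
            return  # no active project left, so it was deleted
        time.sleep(POLL_INTERVAL)  # avoid hammering the database
    raise AssertionError(f"Project {platform!r} was not deleted within {POLL_TIMEOUT}s")
```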
Are we sure this timeout is needed? Asking because it will make our acceptance test runs even slower.
It does take some time for the project to be removed, and we need to wait for that. Do you know of another way to achieve that?
Looking at other acceptance tests, can we do something similar and wait using self.browser.wait_until_not? The only question is what the right thing to wait for is. The current approach will produce a lot of unnecessary DB calls, and even though it is only in the test environment, it will make tests slower than they should be.
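(For reference, the DOM-based pattern the reviewer is pointing at looks roughly like this in the acceptance tests; the selector below is purely hypothetical, since, as discussed next, the deletion leaves no obvious element to wait on:)

```python
# Hypothetical: block until the project's card disappears from the DOM
# instead of polling the database.
self.browser.wait_until_not('[data-test-id="project-card-javascript-nextjs"]')
```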
The issue is that we silently delete the project, and this change isn't reflected in the DOM, so I can't use self.browser.wait_until_not. I added the database query because I noticed it was used frequently in the acceptance tests, but if that's not the preferred approach, I'm happy to remove it. In the tests I've written, we already navigate to the project overview page and check whether the project is rendered using self.browser.wait_until. The database query was just an extra measure to be completely certain.
Unrelated to this PR, but more about the general UX: do we want to let the user know that we have deleted the project? Or do we want to do all of this silently, without the user ever knowing that the project was created and then deleted? Because this can lead to the onboarding task (FIRST_PROJECT) being marked as completed without the user ever being aware that they created a project.
We want to do it silently without users knowing the project has been deleted. They will still see this info in the audit log, but that is ok - we have discussed this internally.
> Because this can lead to the onboarding task (FIRST_PROJECT) being marked as completed without the user ever being aware that they created a project
That's a great point. This scenario could occur if the user:
- Creates a project
- Navigates back in the flow
- Clicks "Skip Onboarding"
While it's a rare edge case, I'll bring this up with the team. Thanks for flagging this!
> but more to the general UX - do we want to let the user know that we have deleted the project?
We used to have a modal that made it transparent that a project had been created and deleted, and it was kind of awkward. Mostly because, to the user, it didn't really feel like they had created a project; they had just picked a platform on the previous screen. It's really just for technical reasons that we need to create a project at that stage, in order to generate a DSN for the code snippets we show on the next screen. So we decided to remove it.
> Because this can lead to the onboarding task (FIRST_PROJECT) being marked as completed without the user ever being aware that they created a project
That is a good observation, and I would definitely say it's an unwanted side effect. My preferred option would be to also revert this if the project is deleted in the onboarding flow, so the task doesn't appear as completed.
If that's very complicated, then I agree with Pri that the scenario is probably rare: if you go back, it's likely because you wanted to choose a different platform, so you'll probably still continue with the flow and create another project anyway. Since it only applies to this "go back and then skip" scenario, it's probably rare enough that we can live with it.
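(A rough sketch of what reverting the task could look like, using the OrganizationOnboardingTask model and OnboardingTask.FIRST_PROJECT constant from Sentry's onboarding-task models to the best of my knowledge; where exactly this hook would run, e.g. in the back-action deletion path, is an open question, and the function name is mine:)

```python
from sentry.constants import ObjectStatus
from sentry.models.organizationonboardingtask import (
    OnboardingTask,
    OrganizationOnboardingTask,
)
from sentry.models.project import Project


def revert_first_project_task(organization) -> None:
    """Drop the FIRST_PROJECT onboarding task if the org has no active projects
    left, so the task no longer shows as completed after the onboarding
    project is silently deleted."""
    has_active_project = Project.objects.filter(
        organization=organization, status=ObjectStatus.ACTIVE
    ).exists()
    if not has_active_project:
        OrganizationOnboardingTask.objects.filter(
            organization=organization, task=OnboardingTask.FIRST_PROJECT
        ).delete()
```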
I am missing the regression test we wanted to have:
Content-wise it looks good to me 👍
tests/acceptance/test_onboarding.py (Outdated)

```python
    def test_project_deletion_on_going_back(self):
        self.start_onboarding()
        self.click_on_platform("javascript-nextjs")
        self.verify_project_creation("javascript-nextjs", "Next.js")
        self.browser.click('[aria-label="Back"]')

        # Assert no deletion confirm dialog is shown
        assert not self.browser.element_exists("[role='dialog']")

        # Platform selection step
        self.browser.wait_until('[data-test-id="onboarding-step-select-platform"]')

        # Select generic platform
        self.click_on_platform("javascript-react")
        self.verify_project_creation("javascript-react", "React")
        self.browser.back()
        self.browser.click(xpath='//a[text()="Skip Onboarding"]')
        self.browser.get("/organizations/%s/projects/" % self.org.slug)
        self.browser.wait_until(xpath='//h1[text()="Remain Calm"]')
```
Where is the project deletion verified here? I only see tests for project creation.
I removed it, and I am now checking only what we render, since you said the function would slow tests down. #88532 (comment)
But then this doesn't test the project deletion at all?
It does. If projects are deleted, the project overview page will show the message "Remain Calm" instead of any projects. I'm testing what users would see in this case without querying the database. I could also add a verify_existing_projects check at the end. What do you think?
Yes, please, that would be great!
If we know that "Remain Calm" is shown after the project is deleted, we can then do one query to the table and make sure that the project is gone.
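(Put together, the agreed-on check would look roughly like the following; the assert_existing_projects_status helper and the project fixtures come from the later diff in this thread:)

```python
# Wait on the DOM first, then confirm with a single database assertion.
self.browser.wait_until(xpath='//h1[text()="Remain Calm"]')
assert_existing_projects_status(
    self.org, active_project_ids=[], deleted_project_ids=[project1.id, project2.id]
)
```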
```python
        self.browser.back()
        self.browser.get("/organizations/%s/projects/" % self.org.slug)
        self.browser.wait_until(xpath='//h1[text()="Remain Calm"]')
        assert_existing_projects_status(self.org, [], [project1.id, project2.id])
```
A nit, but this is hard to read for someone who doesn't know what assert_existing_projects_status does. A nicer way to write this is with keyword arguments; then it is immediately clear what we are testing with this function:

```python
assert_existing_projects_status(
    self.org, active_project_ids=[], deleted_project_ids=[project1.id, project2.id]
)
```
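(One way to enforce that readability at the definition site is to make the parameters keyword-only. A sketch, assuming the helper simply compares the organization's active projects against the expected ID lists; the body here is my guess, not the actual implementation:)

```python
def assert_existing_projects_status(
    org, *, active_project_ids: list[int], deleted_project_ids: list[int]
) -> None:
    """Assert which of the org's projects are still active and which are gone.

    The bare `*` forces call sites to spell out both ID lists by keyword.
    """
    active = set(
        Project.objects.filter(
            organization=org, status=ObjectStatus.ACTIVE
        ).values_list("id", flat=True)
    )
    assert active == set(active_project_ids)
    assert active.isdisjoint(deleted_project_ids)
```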
feat(onboarding): Bulletproof critical back actions with new tests (#88532)
Add more acceptance tests to ensure critical workflows in the onboarding don't break