
feat(onboarding): Bulletproof critical back actions with new tests #88532

Conversation

@priscilawebdev (Member) commented Apr 2, 2025

@priscilawebdev priscilawebdev changed the title Priscila/feat/onboarding/add tests to bulletproof critical actions feat(onboarding): Bulletproof critical actions with new tests Apr 2, 2025
@github-actions github-actions bot added the Scope: Backend Automatically applied to PRs that change backend components label Apr 2, 2025
@@ -45,6 +50,22 @@ def assert_status_code(response, minimum: int, maximum: int | None = None):
)


def verify_project_deletion(org: Organization, platform: str):
Member Author (@priscilawebdev):

this helper is used in two different places, so I aimed to make it reusable. However, I'm unsure whether this is the best location for it

Member:

Small nit: all but one function in that file is called assert_*; for consistency, can we rename this to assert_project_deletion?
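
For reference, a minimal sketch of what the renamed helper could look like, stitched together from the snippets quoted later in this conversation; the 10-second timeout matches the diff, but the sleep interval, import paths, and failure message are assumptions rather than the final implementation:

import time

from django.core.exceptions import ObjectDoesNotExist

from sentry.constants import ObjectStatus  # assumed import path
from sentry.models.organization import Organization  # assumed import path
from sentry.models.project import Project  # assumed import path


def assert_project_deletion(org: Organization, platform: str, timeout: int = 10) -> None:
    """Poll the database until the project is gone or the timeout expires."""
    start_time = time.time()
    while time.time() - start_time < timeout:
        try:
            # The status filter matters because project deletion is a soft delete.
            Project.objects.get(organization=org, slug=platform, status=ObjectStatus.ACTIVE)
        except ObjectDoesNotExist:
            return  # No ACTIVE project with that slug, so it was deleted.
        time.sleep(0.5)  # assumed polling interval
    raise AssertionError(f"Project '{platform}' was not deleted within {timeout} seconds")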

@priscilawebdev priscilawebdev marked this pull request as ready for review April 2, 2025 10:26
@priscilawebdev priscilawebdev requested review from a team and ArthurKnaus April 2, 2025 10:26
@priscilawebdev priscilawebdev changed the title feat(onboarding): Bulletproof critical actions with new tests feat(onboarding): Bulletproof critical back actions with new tests Apr 2, 2025

codecov bot commented Apr 2, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

✅ All tests successful. No failed tests found.

Additional details and impacted files
@@            Coverage Diff             @@
##           master   #88532      +/-   ##
==========================================
- Coverage   87.77%   87.76%   -0.02%     
==========================================
  Files       10078    10042      -36     
  Lines      571566   568962    -2604     
  Branches    22372    22235     -137     
==========================================
- Hits       501705   499329    -2376     
+ Misses      69462    69217     -245     
- Partials      399      416      +17     


self.load_project_creation_page()

# Select PHP Laravel platform
Member:

Can we remove this and the following comments? I don't see the value of having a comment on every line. Thanks!

Comment on lines 58 to 63
try:
    # we need to check for the status here because we do soft deletions
    Project.objects.get(organization=org, slug=platform, status=ObjectStatus.ACTIVE)
except ObjectDoesNotExist:
    # If the project doesn't exist anymore, it's deleted
    return  # The project was successfully deleted
Member:

Do we do soft or hard deletions? Or both?

Member Author (@priscilawebdev):

I noticed that we do soft deletion when deleting a project.

# Poll the database to check if the project was deleted
# the timeout is set to 10 seconds
start_time = time.time()
while time.time() - start_time < 10:
Member:

Are we sure this timeout is needed? Asking because it will make our acceptance tests run even slower.

Member Author (@priscilawebdev):

It does take some time for the project to be removed, and we need to wait for that. Do you know of another way to achieve that?

Member (@vgrozdanic) commented Apr 3, 2025:

Looking at other acceptance tests, can we do something similar and wait using self.browser.wait_until_not? The only question is what the right thing to wait for is.

The current approach will produce a lot of unnecessary DB calls, even if it is only in the test environment, and it will make the tests slower than they should be.

Copy link
Member Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

The issue is that we silently delete the project, and this change isn't reflected in the DOM, so I can't use self.browser.wait_until_not. I added the database query because I noticed it was used frequently in the acceptance tests, but if that's not the preferred approach, I'm happy to remove it. In the tests I've written, we already navigate to the project overview page and check if the project is rendered or not using self.browser.wait_until. The database query was just an extra measure to be completely certain.
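
A rough sketch of the DOM-based check described here, using browser helpers that already appear in this PR (get, wait_until, element_exists); the platform-card selector at the end is an illustrative assumption:

def assert_project_not_rendered(self, platform: str) -> None:
    # Check what the user would actually see instead of polling the database:
    # the project overview shows the "Remain Calm" empty state when the
    # organization has no active projects.
    self.browser.get("/organizations/%s/projects/" % self.org.slug)
    self.browser.wait_until(xpath='//h1[text()="Remain Calm"]')
    # Illustrative selector: the deleted platform's card should no longer exist.
    assert not self.browser.element_exists('[data-test-id="%s"]' % platform)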

Member:

Unrelated to this PR, but more about the general UX: do we want to let the user know that we have deleted the project?

Or do we want to do all of this silently, without the user ever knowing that the project was created and then deleted?

Because this can lead to the onboarding task (FIRST_PROJECT) being marked as completed without the user ever being aware that they have created a project.

Member Author (@priscilawebdev):

We want to do it silently, without users knowing the project has been deleted. They will still see this info in the audit log, but that is OK - we have discussed this internally.

Because this can lead to the onboarding task (FIRST_PROJECT) being marked as completed without the user ever being aware that they have created a project

That's a great point. This scenario could occur if the user:

  • Creates a project
  • Navigates back in the flow
  • Clicks "Skip Onboarding"

While it’s a rare edge case, I’ll bring this up to the team. Thanks for flagging this!

Contributor (@ale-cota) commented Apr 7, 2025:

but more to the general UX - do we want to let the user know that we have deleted the project?

We used to have a modal that made it transparent that a project had been created and deleted, and it was kind of awkward. Mostly because, for the user, it didn't really feel like they had created a project; they had just picked a platform on the previous screen. It's really just for technical reasons that we need to create a project at that stage, in order to generate a DSN for the code snippets we show on the next screen. So we decided to remove it.

Because this can lead to the onboarding task (FIRST_PROJECT) being marked as completed without the user ever being aware that they have created a project

That is a good observation, and I would definitely say it's an unwanted side effect. My preferred option would be to also revert this if the project is deleted in the onboarding flow, so the task doesn't appear as completed.
If that's very complicated, then I agree with Pri that the scenario is probably rare... if you do go back, it's likely because you wanted to choose a different platform, so you'll probably still continue with the flow and create another project anyway. Since it only happens in this "go back and then skip" scenario, it's probably rare enough that we can live with it.

@ArthurKnaus (Member) commented Apr 3, 2025

I am missing the regression test we wanted to have (a rough sketch follows the list):

  1. open onboarding
  2. select platform X
  3. go back
  4. verify project deletion
  5. select platform X again
  6. verify a new project was created (this was the thing that was failing)
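
A rough sketch of that regression test, reusing helpers that already appear in this PR (start_onboarding, click_on_platform, verify_project_creation, assert_existing_projects_status); the direct Project lookups used to compare the old and new project ids are assumptions:

def test_new_project_created_after_going_back(self):
    # 1-2. Open onboarding and select platform X
    self.start_onboarding()
    self.click_on_platform("javascript-nextjs")
    self.verify_project_creation("javascript-nextjs", "Next.js")
    first_project = Project.objects.get(organization=self.org, slug="javascript-nextjs")

    # 3. Go back to the platform selection step
    self.browser.click('[aria-label="Back"]')
    self.browser.wait_until('[data-test-id="onboarding-step-select-platform"]')

    # 4. Verify the first project was (soft) deleted
    assert_existing_projects_status(
        self.org, active_project_ids=[], deleted_project_ids=[first_project.id]
    )

    # 5. Select platform X again
    self.click_on_platform("javascript-nextjs")
    self.verify_project_creation("javascript-nextjs", "Next.js")

    # 6. Verify a new project was created (this was the regression)
    second_project = Project.objects.get(organization=self.org, status=ObjectStatus.ACTIVE)
    assert second_project.id != first_project.id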

@ArthurKnaus (Member):

Content-wise it looks good to me 👍
@obostjancic @vgrozdanic feel free to approve if your comments are all resolved.

Comment on lines 42 to 52
def test_project_deletion_on_going_back(self):
    self.start_onboarding()
    self.click_on_platform("javascript-nextjs")
    self.verify_project_creation("javascript-nextjs", "Next.js")
    self.browser.click('[aria-label="Back"]')

    # Assert no deletion confirm dialog is shown
    assert not self.browser.element_exists("[role='dialog']")

    # Platform selection step
    self.browser.wait_until('[data-test-id="onboarding-step-select-platform"]')

    # Select generic platform
    self.click_on_platform("javascript-react")
    self.verify_project_creation("javascript-react", "React")
    self.browser.back()
    self.browser.click(xpath='//a[text()="Skip Onboarding"]')
    self.browser.get("/organizations/%s/projects/" % self.org.slug)
    self.browser.wait_until(xpath='//h1[text()="Remain Calm"]')
Member:

Where is the project deletion verified here?

I only see that this tests for project creation?

Member Author (@priscilawebdev):

I removed it, and I am now only checking what we are rendering, since you said the function would slow the tests down. #88532 (comment)

Member:

But then this doesn't test the project deletion at all?

Member Author (@priscilawebdev):

It does. If projects are deleted, the project overview page will show the message "Remain Calm" instead of any projects. I'm testing what users would see in this case without querying the database. I could also add a verify_existing_projects check at the end. What do you think?

Member:

Yes, please, that would be great!

If we know that "Remain Calm" is shown after the project is deleted, we can then do one query to the table and make sure that the project is gone.

@priscilawebdev priscilawebdev marked this pull request as ready for review April 7, 2025 06:21
…cal-actions' of github.com:getsentry/sentry into priscila/feat/onboarding/add-tests-to-bulletproof-critical-actions
self.browser.back()
self.browser.get("/organizations/%s/projects/" % self.org.slug)
self.browser.wait_until(xpath='//h1[text()="Remain Calm"]')
assert_existing_projects_status(self.org, [], [project1.id, project2.id])
Member:

A nit, but this is hard to read for someone who isn't aware of what assert_existing_projects_status does. A nicer way to write this is with keyword arguments; then it is immediately clear what we are testing with this function:

assert_existing_projects_status(self.org, active_project_ids=[], deleted_project_ids=[project1.id, project2.id])
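
For context, a plausible shape for that helper with keyword arguments (a sketch only; the real implementation lives in the shared test utilities):

def assert_existing_projects_status(
    org: Organization, active_project_ids: list[int], deleted_project_ids: list[int]
) -> None:
    # Collect the ids of projects that are still ACTIVE in the organization;
    # they must match the expected active ids, and none of the deleted ids
    # may still be among them.
    active_ids = set(
        Project.objects.filter(organization=org, status=ObjectStatus.ACTIVE).values_list(
            "id", flat=True
        )
    )
    assert active_ids == set(active_project_ids)
    assert active_ids.isdisjoint(deleted_project_ids)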

@priscilawebdev priscilawebdev merged commit 90c2889 into master Apr 7, 2025
49 checks passed
@priscilawebdev priscilawebdev deleted the priscila/feat/onboarding/add-tests-to-bulletproof-critical-actions branch April 7, 2025 08:29
andrewshie-sentry pushed a commit that referenced this pull request Apr 8, 2025
…88532)

Add more acceptance tests to ensure critical workflows in the onboarding flow don't break.
Labels: Scope: Backend (automatically applied to PRs that change backend components)
Linked issue (may be closed by this PR): Retro: Add acceptance tests for flow that broke in onboarding