fix: Do not raise an error to check 'same-domain' if there is no hostname in the url by Mantisus · Pull Request #1251 · apify/crawlee-python
Description
- The need for this PR is due to the fact that the enqueue strategy check is now used in `extract_links` and can be applied to invalid URLs.
Pull Request Overview
This PR changes the behavior of the same-domain enqueue strategy to return False instead of raising an error when either URL lacks a hostname, preventing exceptions on invalid URLs.
- Replaces `ValueError` with a silent skip (`return False`) for missing hostnames in same-domain checks.
- Ensures `extract_links` can apply the enqueue strategy without erroring on malformed URLs.
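The changed behavior can be sketched roughly as follows. This is an illustrative stand-in, not the actual crawlee code: the function name `check_same_domain` and the plain hostname comparison are assumptions made for the sketch (the real check lives in `_check_enqueue_strategy` in `_basic_crawler.py`).

```python
from urllib.parse import urlparse


def check_same_domain(origin_url: str, target_url: str) -> bool:
    """Illustrative same-domain check: skip URLs without a hostname."""
    origin = urlparse(origin_url)
    target = urlparse(target_url)
    if origin.hostname is None or target.hostname is None:
        # Previously this situation raised ValueError; after this PR the
        # URL is simply not enqueued under the 'same-domain' strategy.
        return False
    return origin.hostname == target.hostname
```

With this shape, a malformed link such as `mailto:user@example.com` extracted from a page no longer aborts the crawl; it just fails the strategy check and is skipped.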
Comments suppressed due to low confidence (2)
src/crawlee/crawlers/_basic/_basic_crawler.py:970
- Add a unit test for the case where `origin_url.hostname` or `target_url.hostname` is `None` with the `same-domain` strategy to verify that it returns `False` and does not enqueue.
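The suggested test could look like the sketch below. It is self-contained for illustration: `check_same_domain` here is a minimal stand-in for the real strategy check (whose exact signature is not reproduced in this review), and the test name is hypothetical.

```python
from urllib.parse import urlparse


def check_same_domain(origin_url: str, target_url: str) -> bool:
    # Minimal stand-in for the real strategy check: URLs without a
    # hostname return False rather than raising ValueError.
    origin, target = urlparse(origin_url), urlparse(target_url)
    if origin.hostname is None or target.hostname is None:
        return False
    return origin.hostname == target.hostname


def test_same_domain_missing_hostname_returns_false() -> None:
    # A relative path and a mailto: link both lack a hostname.
    assert check_same_domain("https://example.com/page", "relative/path") is False
    assert check_same_domain("mailto:user@example.com", "https://example.com") is False


test_same_domain_missing_hostname_returns_false()
```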
src/crawlee/crawlers/_basic/_basic_crawler.py:967
- Update the method docstring for `_check_enqueue_strategy` to document that URLs without hostnames will simply return `False` for the `same-domain` strategy instead of raising an error.
def _check_enqueue_strategy(
Seems reasonable 👍
vdusek
changed the title
fix: Do not raise an error to check ‘same-domain’ if there is no hostname in the url
fix: Do not raise an error to check 'same-domain' if there is no hostname in the url
LGTM