spekulatius
Repos: 45 · Followers: 387 · Following: 1275

A universal web-util for PHP.
343 stars · 58 forks

Awesome FilamentPHP stuff
21 stars · 2 forks

An awesome list covering PHP scrapers, spiders and crawlers
4 stars · 0 forks

A toolkit for Spatie's Crawler and Laravel.
16 stars · 6 forks

Web stuff I've discovered and liked. My public notes.
4 stars · 0 forks

Adds the web-monetization metatag to your VuePress website
14 stars · 0 forks

Events

spekulatius deleted branch dependabot/npm_and_yarn/ua-parser-js-0.7.33
Created 2 days ago

Bump ua-parser-js from 0.7.28 to 0.7.33

Bumps ua-parser-js from 0.7.28 to 0.7.33.


updated-dependencies:
- dependency-name: ua-parser-js
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

Created 2 days ago
pull request closed
Bump ua-parser-js from 0.7.28 to 0.7.33

Bumps ua-parser-js from 0.7.28 to 0.7.33.

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
  • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
  • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
  • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
  • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

You can disable automated security fix PRs for this repo from the Security Alerts page.

Created 2 days ago
Update Docs of README.md

Hey @priyankarpal

yeah, that's something we are going to fix in the near future :+1:

Cheers

Created 1 week ago
.edu
Created 1 week ago
issue comment
[Request] Add robots.txt parsing

Fair enough, that's definitely another use-case. I'll see how we can get both working.

On Thu, Jan 12, 2023, 15:58 Joshua Dickerson @.***> wrote:

Personally, I am looking for sitemaps declared in robots.txt but I think there's also value in checking for rules for crawling.

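For illustration, a rough sketch of the use-case described above: collecting the Sitemap: entries declared in a robots.txt file. This is not PHPScraper's API; the function name and URL are made up for the example.

```php
<?php

// Sketch only: collect the sitemap URLs declared in a site's robots.txt.
// Not PHPScraper's API; the function name and URL are illustrative.
function sitemapsFromRobotsTxt(string $baseUrl): array
{
    $robotsUrl = rtrim($baseUrl, '/') . '/robots.txt';

    // Plain fetch for brevity; real code would use a proper HTTP client
    // and handle redirects, timeouts and errors.
    $content = @file_get_contents($robotsUrl);
    if ($content === false) {
        return [];
    }

    // "Sitemap:" lines are case-insensitive and may appear anywhere in the file.
    preg_match_all('/^\s*sitemap:\s*(\S+)/mi', $content, $matches);

    return $matches[1];
}

// print_r(sitemapsFromRobotsTxt('https://example.com'));
```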

Created 2 weeks ago
issue comment
[Request] Sitemap Index Files

Hey @joshua-bn,

Can you share the URL you have tried?

Cheers
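For reference, a sitemap index file is an XML document whose <sitemap> entries each point to a further sitemap file. Below is a rough sketch of collecting those child sitemap URLs; it is not PHPScraper's implementation, and the function name and URL are illustrative.

```php
<?php

// Sketch only: read a sitemap index and return the child sitemap URLs it lists.
// Assumes the standard sitemap namespace is the document's default namespace.
function childSitemaps(string $indexUrl): array
{
    $xml = @simplexml_load_file($indexUrl);
    if ($xml === false) {
        return [];
    }

    $children = [];
    foreach ($xml->sitemap as $entry) {
        $children[] = (string) $entry->loc;
    }

    return $children;
}

// print_r(childSitemaps('https://example.com/sitemap_index.xml'));
```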

Created 2 weeks ago
issue comment
[Request] Add robots.txt parsing

Yeah, that's something to consider. I would opt for https://github.com/spatie/robots-txt instead as it's better maintained. What exactly do you want to achieve with the information?
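For reference, a short sketch of checking crawl rules with the spatie/robots-txt package mentioned above. The calls follow that package's README, but treat the snippet as an assumption and verify it against the installed version; the URL, path and user agent are illustrative.

```php
<?php

require 'vendor/autoload.php';

use Spatie\Robots\RobotsTxt;

// Sketch based on the spatie/robots-txt README; verify against the installed version.
// The URL, path and user agent below are illustrative.
$robots = RobotsTxt::readFrom('https://example.com/robots.txt');

// Check whether a given user agent may crawl a given path.
var_dump($robots->allows('/some-page', 'MyCrawler'));
```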

Created 2 weeks ago
spekulatius deleted branch upgrade-node
Created 2 weeks ago
pull request closed
Upgrade node
Created 2 weeks ago

Bump luxon from 1.25.0 to 1.28.1

Bumps luxon from 1.25.0 to 1.28.1.


updated-dependencies:
- dependency-name: luxon
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Created 2 weeks ago
pull request closed
Bump luxon from 1.25.0 to 1.28.1

Bumps luxon from 1.25.0 to 1.28.1.

Created 2 weeks ago
started
Created 2 weeks ago
pull request opened
Upgrade node
Created 2 weeks ago
spekulatius created branch upgrade-node
Created 2 weeks ago