szmarczak 2 minutes ago

Note: I work professionally in web scraping.

> SHA-256 > 500ms CPU time

I wonder if the author has calculated the total wasted CPU power for 1M users and the added latency.
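My own back-of-envelope, assuming one 500 ms challenge per user (the figure quoted above):

```javascript
// Back-of-envelope cost of the proof-of-work challenge at scale.
const users = 1_000_000;
const cpuSecondsPerChallenge = 0.5; // 500 ms, the figure from the article

const totalCpuSeconds = users * cpuSecondsPerChallenge; // 500,000 CPU-seconds
const cpuDays = totalCpuSeconds / 86_400;               // ~5.8 CPU-days

console.log(totalCpuSeconds, cpuDays.toFixed(1)); // 500000 5.8
```

And that's per challenge, not counting retries or the half-second of latency every single legitimate user eats.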

> Mouse trajectory

Can be recorded, and there are users who prefer the keyboard or use remote connections. The trajectory can also be synthesized (I've done that myself), and it's not easily verifiable, since each mouse starts with different accuracy and then catches up.
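A minimal sketch of what I mean by synthesizing (names are mine, not from any real library): an eased path between two points with a bit of per-step jitter already looks plausible.

```javascript
// Synthesize a human-looking path from (x0,y0) to (x1,y1).
// Ease-in-out gives the slow start / fast middle / slow settle shape;
// jitter fakes sensor noise. Endpoints are kept exact.
function synthesizeTrajectory(x0, y0, x1, y1, steps = 50) {
  const points = [];
  for (let i = 0; i <= steps; i++) {
    const t = i / steps;
    const e = t * t * (3 - 2 * t); // smoothstep easing
    const jitter = i === 0 || i === steps ? 0 : (Math.random() - 0.5) * 2;
    points.push([x0 + (x1 - x0) * e + jitter, y0 + (y1 - y0) * e + jitter]);
  }
  return points;
}
```

Feed those points to Input.dispatchMouseEvent at realistic intervals and the "trajectory" check has nothing to catch.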

> Micro-tremor detection

Not detectable with a low-DPI mouse. There are also people using 125 Hz Bluetooth mice.

> Click precision

What if there is no click? There are keyboard users, and you'd anger them by blocking solely on a no-click basis.

> Pre-click

Can be recorded and can be calculated.

> Overshoot

Overshoot only applies to small elements. I've noticed I don't overshoot when clicking the Cloudflare captcha, because the checkbox is big enough to hit every single time, even from half a page away.

> Web driver

Can be spoofed, or you can run a build of Chromium that never sets navigator.webdriver to true.
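The spoof itself is a one-liner. In a real setup you'd inject it via CDP's Page.addScriptToEvaluateOnNewDocument so it runs before any site script; I'm mocking navigator here so the snippet runs outside a browser. (Note the override is itself detectable by a sufficiently paranoid script, which is why patched builds are the cleaner route.)

```javascript
// Stand-in for the browser's navigator object, which normally reports
// webdriver: true under automation.
const navigator = { webdriver: true };

// Shadow the property with a getter that reports nothing.
Object.defineProperty(navigator, 'webdriver', {
  get: () => undefined,
  configurable: true,
});

console.log(navigator.webdriver); // undefined
```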

> Canvas / WebGL / Audio fingerprint > Plugin and browser feature checks

These test whether it's a real browser, not whether it's an automated one.

> Proof of Work timing

What does this solve?

> Interaction timing patterns > Event sequence analysis

Smart in theory, bad in practice, because real-user timing is too random.

> Programmatic form.submit() detection

Events can be made trusted by setting userGesture to true in CDP or running a browser that sets it to true regardless.
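Concretely, it's just one flag on the raw CDP call (sketch below; the WebSocket plumbing to localhost:9222 is omitted):

```javascript
// Raw CDP message: evaluate the submit as if the user triggered it.
// With userGesture: true the execution is treated as user-initiated.
const msg = {
  id: 1,
  method: 'Runtime.evaluate',
  params: {
    expression: 'document.querySelector("form").submit()',
    userGesture: true,
  },
};
// ws.send(JSON.stringify(msg));
```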

> Page load to interaction timing

What does this solve? It's very random.

> Time from page load to submission

Very unreliable due to the randomness.

> Textarea keyboard analysis

Can be recorded.

> Headless browser indicators

I wonder what that is. I hope it's not checking console/storage size.