=== Assistant: Thank you for describing that clearly. ===

Thank you for describing that clearly. I want to be very careful here and keep this grounded in physics and in how phones and screens actually behave, without reinforcing an explanation that would mislead you.

What you just described still does not match a motion detector or anything that is sensing you. There is a much simpler, mechanical explanation that fits all of your observations so far.

==== When you take a step forward while holding the phone, three things change at once, even if you don't consciously notice them: ====

# The phone tilts slightly (pitch/roll)
# Your viewing angle changes relative to the LCD
# The UI content subtly reflows and redraws (even without scrolling)

Because the artifact lives in the display's post-render processing layer, not in the pixels themselves:

* It is anchored to display processing tiles
* Not to the glass
* Not to the content
* Not to you

So when the phone's angle or orientation changes even a little, the controller's tone-mapping regions get recalculated, and the artifact appears to "slide" briefly before settling again. That motion feels responsive, but it is not tracking; it is re-projection. (The toy sketch at the end of this message models exactly this.)

==== Your brain is very good at detecting contingency: ====

* "I move → something moves"
* That feels interactive

But in this case:

* The phone is moving
* The display pipeline is recalculating
* The artifact shifts as a side effect

It is the same reason that:

* glare spots move when you tilt a screen
* contrast patches shift on cheap LCDs
* ghosting appears and disappears when you walk

None of those involve sensing you.

==== Please try this exactly; it cleanly separates "detector" from "display artifact": ====

===== 1. Set the phone down on a table, perfectly still =====

# Keep the screen on
# You take a step forward/back while the phone does not move

What the result means:

* If the dots do not move → display artifact confirmed
* If they still move → then we reassess

A motion detector would respond to you moving. A display artifact only responds to the phone moving.

===== 2. Hold the phone very firmly with both hands, elbows braced, and take a step =====

* If the movement largely disappears → that again confirms it is phone orientation/angle
* If the phone were "detecting motion," bracing wouldn't matter

==== I am not saying: ====

* "you're imagining it"
* "it's in your head"
* "nothing is happening"

Something is happening, just '''not''' something that is sensing you.
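To make "the artifact is a pure function of the phone's orientation" concrete, here is a minimal sketch. It is a toy model, not real display firmware: the tile size, the <code>PX_PER_DEGREE</code> gain, and the <code>artifact_tile</code> function are all invented for illustration, and the numbers are arbitrary.

<syntaxhighlight lang="python">
# Toy model of a tile-based tone-mapping pass, like the ones many mobile
# display controllers run after rendering. Everything here (tile size,
# gain, coordinates) is made up for illustration; it is NOT any phone's
# actual firmware. The point is structural: the tile the artifact lands
# in is a function of the DEVICE's orientation only. The viewer is not
# an input at all.
from dataclasses import dataclass

TILE_PX = 64          # hypothetical tile size in pixels
PX_PER_DEGREE = 10    # hypothetical: pixels of boundary shift per degree of tilt

@dataclass
class DeviceState:
    pitch_deg: float  # device tilt reported by the IMU
    roll_deg: float

def artifact_tile(state: DeviceState) -> tuple[int, int]:
    """Return the (col, row) of the tile whose tone-mapping boundary the
    artifact sits on. A small tilt nudges the computed region boundary by
    a few pixels, so the artifact appears to 'slide' to a new tile."""
    dx = int(state.roll_deg * PX_PER_DEGREE)
    dy = int(state.pitch_deg * PX_PER_DEGREE)
    return ((316 + dx) // TILE_PX, (636 + dy) // TILE_PX)

# Test 1: the phone sits perfectly still while the viewer walks around.
# The viewer's position is not even a parameter, so nothing can change.
still = DeviceState(pitch_deg=0.0, roll_deg=0.0)
for step in range(3):
    print(f"viewer step {step}, phone still -> tile {artifact_tile(still)}")

# Test 2: the phone itself tilts a few degrees, as it does when you take
# a step while holding it. Now the artifact jumps to a different tile.
tilted = DeviceState(pitch_deg=8.0, roll_deg=3.0)
print(f"phone tilted            -> tile {artifact_tile(tilted)}")
</syntaxhighlight>

Note how the two calls line up with the two tests above: in the first loop the viewer moves and the output never changes, because viewer position never enters the model; in the second call the phone tilts and the tile changes. That asymmetry is exactly what the table test and the bracing test are designed to expose.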