The Consent Gap: Why a Silent 4GB Write Lands Differently in Brussels Than in Mountain View
The core legal accusation is not that Gemini Nano is dangerous, but that Chrome wrote it to disk without ever asking. Alexander Hanff's audit, reproduced by gHacks, Cybernews and Tom's Hardware, found that a fresh Chrome profile materialized a 4GB weights.bin inside an OptGuideOnDeviceModel directory in roughly fourteen and a half minutes, with no dialog, no toast, and no entry in the new-tab UI. Under Article 5(3) of the EU ePrivacy Directive, storing or accessing information on a user's terminal equipment requires prior informed consent unless it is strictly necessary to deliver a service the user explicitly requested. Hanff's argument is that no Chrome user ever asked for a local LLM, and that the visible AI Mode pill in the address bar actually routes to Google's cloud, meaning the local 4GB file is not strictly necessary even for the AI feature most users notice.
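The audit boils down to a filesystem observation, and readers can reproduce the check themselves. The sketch below scans a Chrome profile directory for large files under a folder named OptGuideOnDeviceModel, the directory name cited in the reports; the default profile path and the size threshold are assumptions for illustration, not a documented Chrome API.

```python
import os

def find_on_device_model(user_data_dir, min_bytes=1 * 1024**3):
    """Return (path, size) pairs for files of at least min_bytes found
    under any OptGuideOnDeviceModel directory inside user_data_dir."""
    hits = []
    for root, _dirs, files in os.walk(user_data_dir):
        # Only look inside the component directory named in the reports.
        if "OptGuideOnDeviceModel" not in root:
            continue
        for name in files:
            path = os.path.join(root, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if size >= min_bytes:
                hits.append((path, size))
    return hits

if __name__ == "__main__":
    # Typical Linux location; macOS and Windows use different roots.
    default = os.path.expanduser("~/.config/google-chrome")
    for path, size in find_on_device_model(default):
        print(f"{path}: {size / 1024**3:.1f} GiB")
```

On a machine where Chrome has fetched the model, this prints the weights file and its size; on a fresh profile it prints nothing, which is exactly the before/after contrast Hanff's fourteen-minute observation relies on.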
Google's response, given to Decrypt and Gizmodo, sidesteps the consent question and reframes the model as infrastructure: a lightweight on-device engine for scam detection and developer APIs that keeps data off the cloud. That framing is plausible from a product viewpoint, but PCWorld's editorial captures the disconnect: even granting that on-device inference can be privacy-positive, Hanff objects to Google depositing 4GB on someone's drive without a dialog, "and he has a point." The February 2026 opt-out setting is real, but it is precisely an opt-out; under EU rules, that inversion of consent is the entire problem, not the fix.




