According to The Guardian’s March 16 reporting, Google has removed “What People Suggest” from its search results. The feature had crowdsourced amateur medical advice into AI-powered health information panels.
Google’s stated reason, per The Guardian, is “broader simplification” of its search page. The company did not attribute the removal to safety concerns or regulatory pressure. That characterization is Google’s own, and it is worth noting on its own terms.
The timing is harder to dismiss. The removal follows a previous Guardian investigation into false and potentially harmful outputs from Google’s AI Overviews product. That investigation documented specific cases where AI-generated health information was inaccurate. The feature removed today fed crowdsourced suggestions into those same AI-generated outputs, a design choice that compounded the risk of inaccuracy.
No regulatory enforcement action has been confirmed in connection with this removal. The story belongs in the regulation pillar not because a regulator acted, but because it reflects the kind of self-governance response that tends to precede regulatory action. Platforms that move before enforcement generally do so either because internal signals showed a problem or because the reputational risk of waiting became unacceptable.
Which of those drove this decision isn’t confirmed. What is confirmed, per a single Guardian source, is that the feature is gone.
[SINGLE-SOURCE NOTE: This item is sourced to The Guardian only. Independent corroboration was not available at publication. Core claims should be treated accordingly.]