Google Search Central Live NYC 2025 gathered industry leaders, publishers and digital enthusiasts to explore the evolving landscape of search. The event was exceptionally well organised, with over 300 attendees, and was clearly a carefully planned conference rather than a side project. Despite a strong security presence, which was understandable given the turnout, the conference offered excellent networking opportunities.
Attendees ranged from small publishers to major names such as DotDash Meredith, NY Post, Daily Mail, Forbes, and Nexstar, along with independent consultants, agencies, developers, regional sites, product managers and more.
Following the Zurich event in December 2024, the New York conference provided a deep dive into how artificial intelligence, spam policies, Google News and search practices are reshaping the digital ecosystem.

Agenda Highlights
The day’s schedule spanned topics such as “How we work on Search & what’s new,” “How Google fights spam,” and “Making sense of Search data,” with a particular focus on AI’s growing influence.
However, the real standout was the Q&A session, which offered a rare chance to speak directly with key Googlers who answered both pre-submitted and live questions. Those on stage answering these questions included Danny Sullivan, Eric Barbera, John Mueller, Ryan Levering and Daniel Waisberg.
Conversations during the breaks provided an excellent opportunity for more in-depth discussions with Google representatives. Colleagues from my company had the chance to ask deeper questions on various topics, particularly of Danny Sullivan and John Mueller. These interactions added a valuable, personalised dimension to the event.

Q&A
Independent Content Creators

Danny Sullivan was particularly passionate and vocal about the challenges facing independent content creators.
“Is Google aiming for a smaller web that favours major publishers over diverse content sources?”
Audience question, read out by John Mueller
When John Mueller read this out, Danny responded with a decisive “No.”
Yet, he acknowledged that there has long been a perception that only big brands rank well on Google, leaving smaller, independent sites at a disadvantage.
Danny emphasised that many independent creators – from open web bloggers to niche publishers – feel they do great quality work but are not being properly represented in search results. He summed up their sentiment as:
“We’re doing great quality work, and we don’t feel like we are being properly shown in the search results. And if we’re not a big brand then we won’t be successful.”
Danny Sullivan
Google are ‘trying’ and actively having conversations
He went on to explain that there is validity to these concerns and that he had just spent the week before in Zurich reviewing real queries from small creators. Reflecting on that experience, he recalled,
“We were just in Zurich last week, out there looking at real queries from small creators and independent sites, sitting with the search ranking team and going through them. What more could we do here?”
Danny Sullivan
Incremental improvements, rather than a specific “independent sites update”
While Danny made it clear that we should not expect a sudden, dedicated update for independent sites, he noted that incremental improvements are on the way through broader algorithm updates.
He stressed ‘accountability’, saying that if Google claims to have done something to help, it must be prepared to be held accountable.
Danny also challenged the notion that Google’s systems simply reward big brands. He stated,
“Our systems do not sit out there and say, ‘big brand, rank it well.’ It is about recognition. If you are recognised as a brand in your field – big or small – that matters because people then know what your site is about.”
Danny Sullivan
Brand Identity Matters: The Fried Chicken Recipe Example
He urged independent publishers to ensure their brand identity is clear, adding that even among hundreds of thousands of similar pages, clarity helps users understand what they are getting into.
“If you have the 155,000 fried chicken recipes available out there, it’s very difficult to know which ones are truly superior. But if your brand is recognised, that becomes a strong signal in search.”
Danny Sullivan
His passionate soapbox moment underscored the need for greater transparency and support for independent voices in search.
During the event, the conversation spilled over onto Twitter, where I live-tweeted the highlights. One of my posts sparked significant discussion, reaching 26.8K views and 19 responses.
Notably, it included replies from Danny Sullivan to several small creators who had been invited to Google’s Web Creator Summit at the end of October last year.
Danny also gave fairly extensive responses here, along with two further follow-ups:
Morgan, I was explaining that sometime people come to a site from search that they’ve never been to before, so it’s good if a site can make it easy for them to understand what a site is about, to recognize it as a brand — whether that’s a big or a small site. Good first…
— Google SearchLiaison (@searchliaison) March 21, 2025
A more in-depth take from Nate Hake:
Danny – I have 2 questions for you as follow up to our visit to Google’s HQ in October:
— Nate Hake (@natejhake) March 21, 2025
1) By what *specific date* can we expect Google to do better at surfacing independent sites?
2) When will Google talk with publishers about how we fit into Google’s “AI-first” future?…
Follow up response from Danny:
The first part we already answered last week. It’s what I pointed out from what was shared on LinkedIn:
— Google SearchLiaison (@searchliaison) March 21, 2025
“We also continue our work to surface more content from creators through a series of improvements throughout this year. Some have already happened; additional ones will come…
Danny’s message was unmistakable: while independent creators should not expect a dramatic overhaul, Google is committed to making incremental improvements that aim to level the playing field and offer greater support for smaller voices.
Site Reputation Abuse & Freelancers

Not a Freelancer Problem, according to Google
Danny Sullivan spent considerable time clarifying what Google means by “site reputation abuse” and emphasised that having freelance or affiliate content is not inherently problematic. The focus is on preventing an unfair advantage.
“In fact, we even have a thing that explicitly says it is not a problem to have freelance content on its own. So spread the word to your freelance people. ‘No, we don’t hate freelance content.’ I have been a freelance journalist. I have employed freelance journalists. Freelance content is not necessarily a bad thing and it could be useful. It has all sorts of circumstances where it makes sense.”
Danny Sullivan
Freelancers and Affiliate Content
Danny stressed that Google does not penalise publishers simply because they hire freelancers or include affiliate links:
“We don’t go, ‘That’s affiliate.’ Pages have affiliate links all over the place. We would be like banning so much of the web. It’s not an issue, it’s not a thing.”
Danny Sullivan
Reputation Signals
The real concern arises when third party content exploits a site’s existing reputation signals, which were earned by first party work:
“SRA is not about the quality of the content. Just because the content that the third party created is good, it doesn’t change the practice of getting an advantage. That same piece is getting an ‘unfair boost’ because it was put on a site for its first party work, and now you have brought in a third party person, sometimes to broaden it out into a whole new area that you didn’t even have a reputation in before. And that’s the whole issue.”
Danny Sullivan
Sullivan also clarified that Google does not maintain any “list of freelancers” who trigger manual actions:
“We didn’t issue manual actions because Site A hired Freelancer B, and we don’t like Freelancer B. There is no list of freelancers that cause a site to get a manual action.”
This means that if a publication employs freelance work within the general policy framework, there is no need to worry about past affiliations or individual reputations.
This was confirmed when we spoke to Danny in a small group about re-hiring freelancers as full-time staff rather than relying solely on freelance arrangements.
Danny further pointed out that even if third-party content is of high quality, it should not receive an “unfair boost” simply by being placed on a site that has built a strong first-party reputation. He reiterated:
“Just because the content that the third party created is good, it doesn’t change the practice of getting an advantage.”
Danny Sullivan
Site Reputation Explained in Plain Language
I’ve tried to simplify this explanation below, as the discussion of “first party” content above can sometimes come across as a bit cryptic:
A website that has built a strong reputation in one area may accidentally benefit when it starts covering new topics. Because search engines trust the site based on its previous work, any content published in a different category automatically inherits some of that established authority. In effect, the site does not have to start from scratch to be seen as a reliable source on a new subject; its past success carries over into new areas.
My own interpretation to simplify this
It was made clear that the focus of site reputation abuse is on preventing publishers from exploiting their hard-earned reputation to boost third-party content. Freelance work, affiliate links, and similar contributions are acceptable as long as they do not distort the established ranking signals of a site.

However, the focus on affiliate content raises concerns: only the affiliate sections of sites appear to be targeted under site reputation abuse, while non-affiliate sections with third-party content remain unaffected. In our experience, the removal of penalties has not always followed the stated guidelines, indicating that further scrutiny may be required to ensure consistency. It is also possible that the policy and its underlying rationale have evolved since its enactment on 19 November 2024 (it has since been updated twice, on 6 December 2024 and 21 January 2025).

One thing is clear: Google never intended this policy to negatively impact freelancers. Yet critics claim it has done exactly that, owing to the initially ambiguous language in its policy documents and, I would argue, to how the policy has since been enforced across a number of websites I have been tracking.
Artificial Intelligence

The AI discussion offered deep insights into how generative AI is being integrated into search to make it more effortless and personalised.
Here’s a summary of the key points, with some notable pull quotes:
AI Overviews and AI Mode
John Mueller explained that AI Overviews act like enhanced featured snippets. Rather than just returning the top 10 blue links, the system “fans out” to capture a range of variations (this is called “query fan-out”), delivering a predictive summary of the query. As he put it:
“When you issue a query in AI Mode, the system does not simply return the top 10 blue links; it fans out to capture a range of variations, delivering a predictive summary of what your query is really about.”
John Mueller
AI Mode, which is still in Labs and requires explicit opt-in, aims to provide more detailed, context-rich answers by leveraging advanced query fan-out. While innovative, this approach builds on years of experience with AI systems such as RankBrain, MUM and BERT. I managed to try it whilst in New York, as AI Mode is currently only available in the US via the Labs opt-in.
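To make the fan-out idea a little more concrete, here is a minimal Python sketch of the general pattern as I understand it: one query is expanded into several variations, each variation is searched, and the pooled results feed a synthesised answer. Every function here is a hypothetical placeholder of my own, not a Google API, and the real systems are obviously far more sophisticated.

```python
from typing import Dict, List


def generate_subqueries(query: str) -> List[str]:
    """Hypothetical expansion step: one query becomes several related variations."""
    return [
        query,
        f"{query} explained",
        f"best {query}",
        f"{query} vs alternatives",
    ]


def search(subquery: str) -> List[Dict[str, str]]:
    """Hypothetical retrieval step: return ranked documents for one variation."""
    return [{"title": f"Result for '{subquery}'", "url": "https://example.com"}]


def fan_out(query: str) -> Dict[str, List[Dict[str, str]]]:
    """Issue every variation and pool the results, ready for a synthesised summary."""
    return {sub: search(sub) for sub in generate_subqueries(query)}


if __name__ == "__main__":
    for subquery, results in fan_out("air fryer fried chicken").items():
        print(subquery, "->", [r["url"] for r in results])
```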
Retrieval Augmented Generation and Grounding

The discussion also covered retrieval augmented generation (RAG). In this process, a query is sent to the search engine, and the resulting authoritative search data is combined with the output from a language model. Grounding then attaches links to actual web pages within the AI-generated response, ensuring credibility.
“Retrieval augmented generation combines search results with language model output to deliver a comprehensive, context-rich answer.”
John Mueller
This process ensures that AI-generated responses include links to actual pages, giving users verifiable sources. Mueller also stressed that if you optimise your site correctly for traditional search, you are automatically setting yourself up for success with these new AI features. He remarked:
“If you’re doing the right thing for search, then you’re automatically doing the right things for AI.”
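As a rough illustration of the retrieve-then-generate and grounding flow described above, here is a simplified Python sketch. The names (`retrieve`, `draft_answer`, `grounded_response`) are placeholders of my own, not a real Google or model-vendor API; the point is simply that search results are fetched first, the model’s answer is conditioned on them, and links back to those sources are attached to the response.

```python
from typing import Dict, List


def retrieve(query: str) -> List[Dict[str, str]]:
    """Hypothetical search step: return authoritative documents for the query."""
    return [
        {"title": "Example article", "url": "https://example.com/article"},
        {"title": "Example guide", "url": "https://example.com/guide"},
    ]


def draft_answer(query: str, documents: List[Dict[str, str]]) -> str:
    """Hypothetical language-model step: draft an answer conditioned on the retrieved context."""
    titles = ", ".join(doc["title"] for doc in documents)
    return f"Summary for '{query}', drawing on: {titles}"


def grounded_response(query: str) -> Dict[str, object]:
    """Combine the model output with links to the pages it was grounded on."""
    documents = retrieve(query)
    return {
        "answer": draft_answer(query, documents),
        "sources": [doc["url"] for doc in documents],  # grounding: verifiable links
    }


if __name__ == "__main__":
    print(grounded_response("what is retrieval augmented generation"))
```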
Evolution, Not Revolution
A key theme was that these AI innovations are evolutionary rather than revolutionary. A Search Product representative noted:
“Our goal is to make search effortless by synthesising multiple related queries into one clear, personalised response.”
This means that while AI Overviews and AI Mode introduce new ways to present information, they rely on the same fundamental processes of crawling and indexing that have long underpinned search. The integration of AI is intended to enhance rather than replace these core systems.
Challenges in Measurement
There was also discussion about the challenges publishers face in measuring the impact of these AI features. I won’t go into too much detail here, as I covered this at the Zurich edition of this event in December.
Because the data remains aggregated and dynamic, it is challenging to determine the exact benefit for individual sites.
Discussion on Twitter/X over here.
Fair Value Exchange
A key concern raised at the conference was the impact of these new features on the traditional value exchange between publishers and Google. The longstanding, unofficial deal has been that publishers provide their content to Google in exchange for traffic. However, by delivering answers directly in the search results, AI Overviews and AI Mode may reduce the traffic that sites receive. Google says otherwise but has yet to publish data to support this, while publishers report seeing the opposite.
This shift could distort the value exchange that publishers have relied on for years. As John Mueller explains, if you optimise your site for traditional search, you naturally benefit from these AI features. Yet, if users get the information they need directly in the SERPs, publishers may lose out on valuable click-throughs. This issue was discussed in depth, echoing concerns previously raised about featured snippets and answer boxes.
“If you’re doing the right thing for search, then you’re automatically doing the right things for AI.”
John Mueller
In other words, while these AI enhancements promise a more “effortless search experience”, they also challenge the existing model by potentially reducing the direct traffic that publishers depend on, with knock-on effects for the wider web ecosystem.
Clear, measurable attribution and fair compensation will be crucial to ensure that the evolving AI-driven landscape remains a win-win for both users and content creators.
Related: Shaping the AI Content Frontier: Deals, Data, and Value Exchange
Google News & Top Stories

Google explained that news rankings rely on seven factors: relevance, location, prominence, authoritativeness, freshness, usability, and user interest.
In my view, these factors align with Google’s long-standing priorities, with a strong emphasis on user experience, credibility, and personalised relevance.
Location
Notably, in the US, the “location” factor is often emphasised even for non-local searches. Based on my recent experience, we have seen this emphasis increase significantly throughout 2023, with what were once national queries becoming more localised, down to the state level.
I suspect this is part of a broader effort to make search results simpler, personalised, and more effortless. AI is likely to play a key role in this shift.
Formats
Google News content now appears in various formats on Google Search, including Top Stories, the News Tab, and the Perspectives carousel that showcases articles, videos, and social media posts.

To help Google better understand page content, publishers are advised to use structured data types such as Article, NewsArticle, BlogPosting, Paywall, and VideoObject.
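By way of example only, here is what NewsArticle structured data might look like, built as a Python dictionary and serialised to JSON-LD for embedding in a page’s <script type="application/ld+json"> tag. The property names are standard schema.org vocabulary; the values are invented placeholders, so check Google’s structured data documentation for the fields relevant to your content.

```python
import json

# Placeholder NewsArticle markup using standard schema.org properties;
# all values below are invented for illustration.
news_article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline for a news story",
    "datePublished": "2025-03-21T09:00:00-04:00",
    "dateModified": "2025-03-21T12:30:00-04:00",
    "author": [{"@type": "Person", "name": "Example Reporter"}],
    "publisher": {"@type": "Organization", "name": "Example News"},
    "image": ["https://example.com/lead-image.jpg"],
}

# This JSON would sit inside a <script type="application/ld+json"> element on the page.
print(json.dumps(news_article, indent=2))
```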

Key Statistics: 15% New Queries, 7,000 Tests & 4,700 Updates
John Mueller noted,
“Approximately 15% of all search queries are new every day.”
This statistic highlights how dynamic user interests are and underscores the need for continuous innovation. John was surprised that this percentage is still growing. Roger Montti at Search Engine Journal covers this in an article titled “Google Revisits 15% Unseen Queries Statistic In Context Of AI Search”.
Google also revealed that over 7,000 tests were run and around 4,700 update launches were documented over the past year.
These figures reflect the immense scale of work behind refining search algorithms and integrating new AI enhancements.
Final Thoughts
Google Search Central Live NYC 2025 confirmed that while the integration of AI promises a more intuitive and personalised search experience, the core priorities of quality, relevance and user experience remain unchanged.
From AI Overviews and AI Mode to discussions on site reputation abuse and independent content, the conference underscored that Google’s approach is evolutionary, not revolutionary. Incremental improvements are on the way, and the emphasis is on maintaining fairness while adapting to new technologies and regulatory pressures.
Having attended the Zurich events twice (the first ever one I believe in 2019 and more recently in 2024), I felt that the New York edition delivered an even more advanced experience. It covered more ground with detailed responses and deeper insights. The ongoing dialogue between Google and the community ensures that the future of search will continue to be shaped by both innovation and accountability.
Around the Web
- Testing Google’s AI Mode whilst in New York on Twitter/X
- What I learned at the 2025 Google Search Central Event in New York by Lily Ray
- My Takeaways From Google Search Central Live NYC 2025 by Barry Schwartz
- Google Provides Timeline To Improve Publishers’ Search Visibility by Matt G. Southern
Other notable Googlers at the event