
ZDNET’s key takeaways
- A Microsoft study aimed to understand the when and how of AI use.
- It assessed 37.5 million anonymized Copilot conversations.
- An influx of personal conversations may not be a good thing.
For many people, AI is more than a means of quickly retrieving information; it’s also become a personal health coach, tutor, confidant, companion — even a therapist. But what are the factors that determine which role the technology plays from one moment to the next?
Also: Gemini vs. Copilot: I tested the AI tools on 7 everyday tasks, and it wasn’t even close
This is the question Microsoft set out to answer in a recent study, which analyzed 37.5 million anonymized user conversations with Copilot, the company’s flagship AI chatbot. The results, published on Wednesday, reveal that people’s use of AI fluctuates widely over time, both across the hours of the day and across the months of the year, with stark differences in the types of questions asked on desktop versus mobile; most notably, mobile users are asking for more personal advice.
The study sheds new light on some of the more intimate uses of AI chatbots at a time of fierce debate over how closely these tools should be integrated into our day-to-day lives, and over the risks that individual use of the technology carries.
The findings
Previous research has shown that as AI chatbots become more advanced, they’re responding to an increasingly wide variety of queries. A study conducted by OpenAI in September, for example, found that 70% of all ChatGPT messages are non-work-related (up from 53% last year), with “practical guidance” being among the most common uses (along with “seeking information” and “writing”). An article published in Harvard Business Review in April claimed that therapy and companionship had become the most common use of AI.
Also: Using AI for therapy? Don’t – it’s bad for your mental health, APA warns
The new Microsoft study digs deeper: “While we have an understanding of ‘what’ people do with AI, we know less about when and how they do it,” the company wrote in its full report. Microsoft drew on conversations collected between January and September, excluding any chats from enterprise or commercial Copilot accounts.
One of the most notable findings was the prevalence of conversations related to health and fitness, particularly on mobile: It was the third most common topic after “technology” and “work and career,” highlighting “a growing user trust in Copilot, as individuals increasingly view it not only as a source of information but as a reliable source of advice,” the researchers wrote in the report.
Conversations were also found to vary over time. On desktop, “work and career” — as you could probably guess — was the most common subject during the workday (8 a.m. to 5 p.m.), while users on both desktop and mobile seemed to grow more introspective late at night: the researchers reported a spike in “religion and philosophy” during the wee hours. Conversations regarding “personal growth and wellness” and “relationships” also surged in February in the days leading up to Valentine’s Day, and on the holiday itself.
Also: Microsoft gives Copilot a ‘real talk’ upgrade – and an (optional) cartoon face
The biggest difference between the two modalities, according to the report, is that desktop users were more focused on career-related queries, while mobile users asked more personal questions.
The Microsoft researchers note in their report that this could cause a split in the future development of AI products: desktop agents that are built to “optimize for information density and workflow execution,” on the one hand, and mobile agents that “prioritize empathy, brevity, and personal guidance” on the other.
Zooming out
The study reveals a relationship between humans and AI that’s multifaceted and nuanced, according to Microsoft. “By disentangling seasonality, daily rhythms, and device-level differences, we move beyond the monolithic view of ‘AI usage’ to reveal a technology that has integrated into the full texture of human life,” the company wrote in its report.
The company, of course, has an obvious incentive to frame this as a positive development for individuals and society at large; the more personal and work-related conversations people have with Microsoft’s chatbot, the more effectively it can tailor its outputs to keep users engaged, sharpening its edge against other AI industry giants like Google, Anthropic, and Amazon.
Also: FTC scrutinizes OpenAI, Meta, and others on AI companion safety for kids
It’s by no means clear, however, that a growing reliance on fallible chatbots for personal matters, such as health and relationships, is in our best interest. Some companies, like xAI and Meta, have actively leaned into so-called AI “companions” — virtual avatars attached to large language models which can build fine-grained profiles of users over time — as a means of commercializing AI, which could pose especially big risks for underage users.
Still, while we should all be cautious about the personal information we disclose to AI tools, and about the accuracy of the advice they give in return, Microsoft’s new study underscores that these systems are playing an increasingly central and influential role in the world, for better or worse.