Clippy, the animated paper clip that annoyed Microsoft Office users nearly three decades ago, might have just been ahead of its time.
Microsoft introduced a new artificial intelligence character called Mico (pronounced MEE'koh) this past week. The floating cartoon face, shaped like a blob or flame, will embody the software giant's Copilot virtual assistant and marks the latest attempt by tech companies to imbue their AI chatbots with more of a personality.
Copilot's cute new emoji-like exterior comes as AI developers face a crossroads in how they present their increasingly capable chatbots to consumers without causing harm or backlash. Some have opted for faceless symbols; others, like Elon Musk's xAI, are selling flirtatious, human-like avatars; and Microsoft is looking for a middle ground that's friendly without being obsequious.
"When you talk about something sad, you can see Mico's face change. You can see it dance around and move as it gets excited with you," said Jacob Andreou, corporate vice president of product and growth for Microsoft AI, in an interview with The Associated Press. "It's in this effort of really landing this AI companion that you can really feel."
In the U.S. only so far, Copilot users on laptops and phone apps can speak to Mico, which changes colors, spins around and wears glasses when in "study" mode. It's also easy to shut off, which is a big difference from Microsoft's Clippit, better known as Clippy and infamous for its persistence in offering advice on word processing tools when it first appeared on desktop screens in 1997.
"It was not well-attuned to user needs at the time," said Bryan Reimer, a research scientist at the Massachusetts Institute of Technology. "Microsoft pushed it, we resisted it and they got rid of it. I think we're much more ready for things like that today."
Reimer, co-author of a new book called "How to Make AI Useful," said AI developers are balancing how much personality to give AI assistants based on who their expected users are.
Tech-savvy adopters of advanced AI coding tools may want it to "act much more like a machine because at the back end they know it's a machine," Reimer said. "But individuals who are not as trustful in a machine are going to be best supported, not replaced, by technology that feels a little more like a human."
Microsoft, a provider of work productivity tools that is far less reliant on digital advertising revenue than its Big Tech competitors, also has less incentive to make its AI companion overly engaging in a way that's been tied to social isolation, harmful misinformation and, in some cases, suicides.
Andreou said Microsoft has watched as some AI developers veered away from "giving AI any sort of embodiment," while others are moving in the opposite direction in enabling AI girlfriends.
"Those two paths don't really resonate with us that much," he said.
Andreou said the companion's design is meant to be "genuinely useful" and not so validating that it would "tell us exactly what we want to hear, confirm biases we already have, or even suck you in from a time-spent perspective and just try to kind of monopolize and deepen the session and increase the time you're spending with these systems."
"Being sycophantic (short-term, maybe) has a user respond more favorably," Andreou said. "But long term, it's actually not moving that person closer to their goals."
Microsoft's product releases Thursday included a new option to invite Copilot into a group chat, an idea that resembles how AI has been integrated into social media platforms like Snapchat, where Andreou used to work, or Meta's WhatsApp and Instagram. But Andreou said those interactions have often involved bringing in AI as a joke to "troll your friends," in contrast to Microsoft's designs for an "intensely collaborative" AI-assisted workplace.
Microsoft's audience includes kids, as part of its longtime competition with Google and other tech companies to supply its technology to classrooms. Microsoft also Thursday added a feature to turn Copilot into a "voice-enabled, Socratic tutor" that guides students through concepts they're studying.
A growing number of kids use AI chatbots for everything: homework help, personal advice, emotional support and everyday decision-making.
The Federal Trade Commission launched an inquiry last month into several social media and AI companies (Microsoft wasn't one of them) about the potential harms to children and teenagers who use their AI chatbots as companions.
That's after some chatbots have been shown to give kids dangerous advice about topics such as drugs, alcohol and eating disorders, or to engage in sexual conversations with them. Families of teen boys who died by suicide after lengthy chatbot interactions have filed wrongful death lawsuits against Character.AI and ChatGPT maker OpenAI.
OpenAI CEO Sam Altman recently promised "a new version of ChatGPT" coming this fall that restores some of the personality lost when it introduced a new version in August. He said the company temporarily halted some behaviors because "we were being careful with mental health issues" that he suggested have now been fixed.
"If you want your ChatGPT to respond in a very human-like way, or use a ton of emoji, or act like a friend, ChatGPT should do it," Altman said on X. (In the same post, he also said OpenAI will later enable ChatGPT to engage in "erotica for verified adults," which got more attention.)



