Kilo launches AI-powered Slack bot that ships code from a chat message
Jan 16, 2026
Kilo Code, the open-source AI coding startup backed by GitLab cofounder Sid Sijbrandij, is launching a Slack integration that allows software engineering teams to execute code changes, debug issues, and push pull requests directly from their team chat — without opening an IDE or switching applications.
The product, called Kilo for Slack, arrives as the AI-assisted coding market heats up with multibillion-dollar acquisitions and funding rounds. But rather than building another siloed coding assistant, Kilo is making a calculated bet: that the future of AI development tools lies not in locking engineers into a single interface, but in embedding AI capabilities into the fragmented workflows where decisions actually happen.

"Engineering teams don't make decisions in IDE sidebars. They make them in Slack," Scott Breitenother, Kilo Code's co-founder and CEO, said in an interview with VentureBeat. "The Slackbot allows you to do all this — and more — without leaving Slack."

The launch also marks a partnership with MiniMax, the Shanghai-based AI company that recently completed a successful initial public offering in Hong Kong. MiniMax's M2.1 model will serve as the default model powering Kilo for Slack — a decision the company frames as a statement about the closing gap between open-weight and proprietary frontier models.

How Kilo for Slack turns team conversations into pull requests without leaving the chat

The integration operates on a simple premise: Slack threads often contain the context needed to fix a bug or implement a feature, but that context gets lost the moment a developer switches to their code editor.

With Kilo for Slack, users mention @Kilo in a Slack thread, and the bot reads the entire conversation, accesses connected GitHub repositories, and either answers questions about the codebase or creates a branch and submits a pull request.

A typical interaction might look like this: A product manager reports a bug in a Slack channel. Engineers discuss potential causes. Instead of someone copying the conversation into their IDE and re-explaining the problem to an AI assistant, a developer simply types: "@Kilo based on this thread, can you implement the fix for the null pointer exception in the Authentication service?"

The bot then spins up a cloud agent, reads the thread context, implements the fix, and pushes a pull request — all visible in Slack.

The company says the entire process eliminates the need to copy information between apps or jump between windows — developers can trigger complex code changes with nothing more than a single message in Slack.
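Kilo has not published the bot's internals, but the general shape of this kind of mention-to-agent flow can be sketched with Slack's Bolt SDK. In the hypothetical example below, runCloudAgent stands in for the service that would actually read the connected repository and open the pull request, and the URL it returns is a placeholder; only the Slack calls are real APIs.

```typescript
import { App } from "@slack/bolt";

// Hypothetical stand-in for the cloud agent that clones the connected repo,
// implements the change, and opens a pull request. Kilo's actual service is
// not public; this placeholder just keeps the sketch self-contained.
async function runCloudAgent(input: {
  threadContext: string;
  request: string;
}): Promise<{ prUrl: string }> {
  return { prUrl: "https://github.com/acme/auth-service/pull/123" }; // dummy result
}

const app = new App({
  token: process.env.SLACK_BOT_TOKEN,
  signingSecret: process.env.SLACK_SIGNING_SECRET,
});

// Fires whenever someone writes "@Kilo ..." in a channel the bot has been added to.
app.event("app_mention", async ({ event, client, say }) => {
  // Use the parent thread if the mention is a reply, otherwise start from the message itself.
  const threadTs = event.thread_ts ?? event.ts;

  // Read the surrounding conversation: the context that would otherwise be
  // copied by hand into an IDE and re-explained to an assistant.
  const replies = await client.conversations.replies({
    channel: event.channel,
    ts: threadTs,
  });
  const threadContext = (replies.messages ?? [])
    .map((m) => m.text ?? "")
    .join("\n");

  // Hand the thread plus the explicit request to the agent, then report back in-thread.
  const result = await runCloudAgent({ threadContext, request: event.text });
  await say({ thread_ts: threadTs, text: `Opened a pull request: ${result.prUrl}` });
});

(async () => {
  await app.start(Number(process.env.PORT) || 3000);
})();
```

A real implementation would also decide, based on the request, whether to answer a question in-thread or open a pull request; that branching is omitted here.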
Why Kilo says Cursor and Claude Code fall short when developers need multi-repo context

Kilo's launch explicitly positions the product against two leading AI coding tools: Cursor, which raised $2.3 billion at a $29.3 billion valuation in November, and Claude Code, Anthropic's agentic coding tool.

Breitenother outlined specific limitations he sees in both products' Slack capabilities.

"The Cursor Slack integration is configured on a single-repository basis per workspace or channel," he said. "As a result, if a Slack thread references multiple repositories, users need to manually switch or reconfigure the integration to pull in that additional context."

On Anthropic's offering, he added: "Claude Code documentation for Slack shows how Claude can be added to a workspace and respond to mentions using the surrounding conversation context. However, it does not describe persistent, multi-turn thread state or task-level continuity across longer workflows. Each interaction is handled based on the context included at the time of the prompt, rather than maintaining an evolving execution state over time."

Kilo claims its integration works across multiple repositories simultaneously, maintains conversational context across extended Slack threads, and enables handoffs between Slack, IDEs, cloud agents, and the command-line interface.

Kilo picks a Chinese AI company's model as its default — and addresses enterprise security concerns head-on

Perhaps the most provocative element of the announcement is Kilo's choice of default model. MiniMax is headquartered in Shanghai and recently went public in Hong Kong — a lineage that may raise eyebrows among enterprise customers wary of sending proprietary code through Chinese infrastructure.

Breitenother addressed the concern directly: "MiniMax's recent Hong Kong IPO drew backing from major global institutional investors, including Baillie Gifford, ADIA, GIC, Mirae Asset, Aspex, and EastSpring. This speaks to strong global confidence in models built for global users."

He emphasized that MiniMax models are hosted by major U.S.-compliant cloud providers. "MiniMax M2-series are global leading open-source models, and are hosted by many U.S. compliant cloud providers such as AWS Bedrock, Google Vertex and Microsoft AI Foundry," he said. "In fact, MiniMax models were featured by Matt Garman, the AWS CEO, during this year's re:Invent keynote, showing they're ready for enterprise use at scale."

The company stresses that Kilo for Slack is fundamentally model-agnostic. "Kilo doesn't force customers into any single model," Breitenother said. "Enterprise customers choose which models they use, where they're hosted, and what fits their security, compliance, and risk requirements. Kilo offers access to more than 500 models, so teams can always choose the right model for the job."

The decision to default to M2.1 reflects Kilo's broader thesis about the AI market. According to the company, the performance gap between open-weight and proprietary models has narrowed from 8 percent to 1.7 percent on several key benchmarks. Breitenother clarified that this figure "refers to convergence between open and closed models as measured by the Stanford AI Index using major general benchmarks like HumanEval, MATH, and MMLU, not to any specific agentic coding evaluation."

In third-party evaluations, M2.1 has performed competitively. "In LMArena, an open platform for community-driven AI benchmarking, M2.1 achieved a number-four ranking, right after OpenAI, Anthropic, and Google," Breitenother noted. "What this shows is that M2.1 competes with frontier models in real-world coding workflows, as judged directly by developers."

What happens to your code when you @mention an AI bot in Slack

For engineering teams evaluating the tool, a critical question is what happens to sensitive code and conversations when routed through the integration.

Breitenother walked through the data flow: "When someone mentions @Kilo in Slack, Kilo reads only the content of the Slack thread where it's mentioned, along with basic metadata needed to understand context. It does not have blanket access to a workspace. Access is governed by Slack's standard permission model and the scopes the customer approves during installation."

For repository access, he added: "If the request requires code context, Kilo accesses only the GitHub repositories the customer has explicitly connected. It does not index unrelated repos. Permissions mirror the access level granted through GitHub, and Kilo can't see anything the user or workspace hasn't authorized."

The company states that data is not used to train models and that output visibility follows existing Slack and GitHub permissions.
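Kilo has not described how that scoping is enforced internally, but the behavior Breitenother describes (code context only from explicitly connected repositories, with nothing indexed beyond them) can be pictured as an allow-list check in front of every repository read. The sketch below uses GitHub's Octokit client with hypothetical repository names.

```typescript
import { Octokit } from "@octokit/rest";

// Hypothetical allow-list: the repositories a customer explicitly connected
// through the integrations dashboard. Anything not listed is never touched.
const connectedRepos = new Set(["acme/auth-service", "acme/web-app"]);

async function readConnectedFile(
  octokit: Octokit,
  fullName: string, // e.g. "acme/auth-service"
  path: string
): Promise<string> {
  if (!connectedRepos.has(fullName)) {
    // Refuse outright rather than silently widening access.
    throw new Error(`${fullName} has not been connected to the integration`);
  }

  const [owner, repo] = fullName.split("/");
  const res = await octokit.rest.repos.getContent({ owner, repo, path });

  if (Array.isArray(res.data) || !("content" in res.data)) {
    throw new Error(`${path} is a directory or has no readable content`);
  }
  // File contents come back base64-encoded from the GitHub API.
  return Buffer.from(res.data.content, "base64").toString("utf8");
}

// The token carries only the permissions the customer granted, so the API
// itself rejects anything beyond the authorized access level.
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });
readConnectedFile(octokit, "acme/auth-service", "src/auth.ts").then(console.log);
```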
A particularly thorny question for any AI system that can push code directly to repositories is security. What prevents an AI-generated vulnerability from being merged into production?

"Nothing gets merged automatically," Breitenother said. "When the Kilo Slackbot opens a pull request from a Slack thread, it follows the same guardrails teams already rely on today. The PR goes through existing review workflows and approval processes before anything reaches production."

He added that Kilo can automatically run its built-in code review feature on AI-generated pull requests, "flagging potential issues or security concerns before it ever reaches a developer for review."
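Those guardrails are ordinary GitHub controls rather than anything Kilo-specific, so a team can enforce them independently of the bot. For example, branch protection can require at least one human approval and passing checks before any pull request, bot-opened or not, can be merged; the sketch below sets that up through GitHub's REST API against a hypothetical repository.

```typescript
import { Octokit } from "@octokit/rest";

const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

// Require human review and green CI on the default branch of a hypothetical repo,
// so an agent-opened PR cannot reach production without the usual approvals.
async function protectMainBranch(): Promise<void> {
  await octokit.rest.repos.updateBranchProtection({
    owner: "acme",
    repo: "auth-service",
    branch: "main",
    required_status_checks: { strict: true, contexts: ["ci/tests"] },
    enforce_admins: true,
    required_pull_request_reviews: {
      required_approving_review_count: 1,
      dismiss_stale_reviews: true,
    },
    restrictions: null, // no push restrictions beyond the review requirement
  });
}

protectMainBranch().catch((err) => {
  console.error("Failed to update branch protection:", err);
  process.exit(1);
});
```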
The open-source paradox: why Kilo believes giving away its code won't kill the business

Kilo Code sits in an increasingly common but still tricky position: the open-source company charging for hosted services. The complete IDE extension is open-source under an Apache 2.0 license, but Kilo for Slack is a paid, hosted product.

The obvious question: What stops a well-funded competitor — or even a customer — from forking the code and building their own version?

"Forking the code isn't what worries us, because the code itself isn't the hardest part," Breitenother said. "A competitor could fork the repository tomorrow. What they wouldn't get is the infrastructure that safely executes agentic workflows across Slack, GitHub, IDEs, and cloud agents. The experience we've built operating this at scale across many teams and repositories. The trust, integrations, and enterprise-ready controls customers expect out of the box."

He drew parallels to other successful open-source companies: "Open core drives adoption and trust, while the hosted product delivers convenience, reliability, and ongoing innovation. Customers aren't paying for access to code. They're paying for a system that works every day, securely, at scale."

Inside the $29 billion "vibe coding" market that Kilo wants to disrupt

Kilo enters a market that has attracted extraordinary attention and capital over the past year. The practice of using large language models to write and modify code — popularly known as "vibe coding," a term coined by OpenAI co-founder Andrej Karpathy in February 2025 — has become a central focus of enterprise AI investment.

Microsoft CEO Satya Nadella disclosed in April that AI-generated code now accounts for 30 percent of Microsoft's codebase. Google acquired senior employees from AI coding startup Windsurf in a $2.4 billion transaction in July. Cursor's November funding round valued the company at $29.3 billion.

Kilo raised $8 million in seed funding in December 2025 from Breakers, Cota Capital, General Catalyst, Quiet Capital, and Tokyo Black. Sijbrandij, who stepped down as GitLab CEO in 2024 to focus on cancer treatment but remains board chair, contributed early capital and remains involved in day-to-day strategy.

Asked about non-compete considerations given GitLab's own AI investments, Breitenother was brief: "There are no non-compete issues. Kilo is building a fundamentally different approach to AI coding."

Notably, GitLab disclosed in a recent SEC filing that it paid Kilo $1,000 in exchange for a right of first refusal for 10 business days should the startup receive an acquisition proposal before August 2026.

When asked to name an enterprise customer using the Slack integration in production, Breitenother declined: "That's not something we can disclose."

How a 34-person startup plans to outmaneuver OpenAI and Anthropic in AI coding

The most significant threat to Kilo's position may come not from other startups but from the frontier AI labs themselves. OpenAI and Anthropic are both building deeper integrations for coding workflows, and both have vastly greater resources.

Breitenother argued that Kilo's advantage lies in its architecture, not its model performance.

"We don't think the long-term moat in AI coding is raw compute or who ships a Slack agent first," he said. "OpenAI and Anthropic are world-class model companies, and they'll continue to build impressive capabilities. But Kilo is built around a different thesis: the hard problem isn't generating code, it's integrating AI into real engineering workflows across tools, repos, and environments."

He outlined three areas where he believes Kilo can differentiate:

"Workflow depth: Kilo is designed to operate across Slack, IDEs, cloud agents, GitHub, and the CLI, with persistent context and execution. Even with OpenAI or Anthropic Slack-native agents, those agents are still fundamentally model-centric. Kilo is workflow-centric."

"Model flexibility: We're model-agnostic by design. Teams don't have to bet on one frontier model or vendor roadmap. That's difficult for companies like OpenAI or Anthropic, whose incentives are naturally aligned with driving usage toward their own models first."

"Platform neutrality: Kilo isn't trying to pull developers into a closed ecosystem. It fits into the tools teams already use."

The future of AI-assisted software development may belong to whoever solves the integration problem first

Kilo's launch reflects a maturing phase in the AI coding market. The initial wave of tools focused on proving that large language models could generate useful code. The current wave is about integration — fitting AI capabilities into the messy reality of how software actually gets built.

That reality involves context fragmented across Slack threads, GitHub issues, IDE windows, and command-line sessions. It involves teams that use different models for different tasks and organizations with complex compliance requirements around data residency and model providers.

Kilo is betting that the winners in this market will not be the companies with the best models, but those that best solve the integration problem — meeting developers in the tools they already use rather than forcing them into new ones.

Kilo for Slack is available now for teams with Kilo Code accounts. Users connect their GitHub repositories through Kilo's integrations dashboard, add the Slack integration, and can then mention @Kilo in any channel where the bot has been added. Usage-based pricing matches the rates of whatever model the team selects.

Whether a 34-person startup can execute on that vision against competitors with billions in capital remains an open question. But if Breitenother is right that the hard problem in AI coding isn't generating code but integrating into workflows, Kilo may have picked the right fight.
After all, the best AI in the world doesn't matter much if developers have to leave the conversation to use it.