Serdar Yegulalp
Senior Writer

GitHub takes aim at downtime with DGit

news analysis
Apr 5, 2016 | 3 mins

DGit uses the sync mechanisms of Git to replicate GitHub's repositories across multiple servers to make the code hosting platform far less prone to downtime


GitHub has rolled out a new feature that it claims will make the widely used code hosting platform far less prone to downtime.

Distributed Git (DGit) uses the sync mechanisms of the Git protocol to replicate GitHub’s repositories among three servers. Should one server go offline because of a mishap or for maintenance, traffic can be redirected to the other two.

Using Git as the replication mechanism provides companies with a little more flexibility than simply mirroring blocks between disks, according to GitHub. “Think of the replicas as three loosely coupled Git repositories kept in sync via Git protocols, rather than identical disk images full of repositories,” says the blog post describing the new system. Read operations can be directed to a specific replica if needed, and new replicas of a given repository can be created if a file server has to be taken offline.
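GitHub hasn't published DGit's internals, but the idea of replicas as loosely coupled repositories synced via Git protocols, rather than identical disk images, can be sketched with ordinary git commands. The repository paths and branch name below are illustrative, not anything from GitHub's actual setup:

```shell
set -e
tmp=$(mktemp -d)

# A "primary" bare repository, plus a working clone to write commits through.
git init -q --bare "$tmp/primary.git"
git clone -q "$tmp/primary.git" "$tmp/work"
git -C "$tmp/work" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "first"
git -C "$tmp/work" push -q origin HEAD:master

# A replica is a mirror clone: a full, independent copy of all refs,
# kept in sync over the git protocol rather than by mirroring disk blocks.
git clone -q --mirror "$tmp/primary.git" "$tmp/replica.git"

# A later write to the primary propagates to the replica with a fetch,
# which transfers only the missing objects.
git -C "$tmp/work" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "second"
git -C "$tmp/work" push -q origin HEAD:master
git -C "$tmp/replica.git" fetch -q origin

# The replica now holds the same history and could serve read traffic.
count=$(git -C "$tmp/replica.git" rev-list --count master)
echo "$count"
```

Because each replica is a real repository, reads can be answered from any copy, and a lost replica can be rebuilt by cloning from a surviving one, which is the flexibility the blog post contrasts with block-level disk mirroring.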

Another advantage of using Git is that its protocol is heavily optimized for synchronizing repositories between systems. “Why reinvent the wheel when there is a Formula One racing car already available?” says GitHub. Using Git also means the failover process requires less manual intervention, and failover servers are not simply sitting idle; they’re actively used for serving read operations and receiving writes.


GitHub’s new Git-driven synchronization architecture makes it easier to keep multiple, redundant copies of repositories available. All the replicas are live and can serve read requests and accept writes as needed.

The rollout of DGit has been a gradual process. GitHub ported its own repositories first, testing to make sure they still worked correctly, then started moving third-party and public repositories. Next came the busiest and most heavily trafficked repos, “to get as much traffic and as many different usage patterns into DGit as we could.” Currently, 58 percent of all repositories have been moved. The rest are slated to follow “as quickly as we can,” GitHub says, since moving to DGit is “a key foundation that will enable more upcoming innovations.”

The biggest advantage of DGit is less downtime. Even a small amount of GitHub downtime — whether because of disaster or attacks — leaves many projects and organizations temporarily crippled.

Third parties have addressed GitHub downtime with both complementary products, like Anam.io’s repository backup services, and competing products, like the GitLab open source alternative. But for many organizations, it could be easier to turn to GitHub and its increasingly ambitious enterprise solutions to do the heavy lifting.

Serdar Yegulalp

Serdar Yegulalp is a senior writer at InfoWorld. A veteran technology journalist, Serdar has been writing about computers, operating systems, databases, programming, and other information technology topics for 30 years. Before joining InfoWorld in 2013, Serdar wrote for Windows Magazine, InformationWeek, Byte, and a slew of other publications. At InfoWorld, Serdar has covered software development, devops, containerization, machine learning, and artificial intelligence, winning several B2B journalism awards including a 2024 Neal Award and a 2025 Azbee Award for best instructional content and best how-to article, respectively. He currently focuses on software development tools and technologies and major programming languages including Python, Rust, Go, Zig, and Wasm. Tune into his weekly Dev with Serdar videos for programming tips and techniques and close looks at programming libraries and tools.
