Linux Foundation’s Open Source R&D Worth $5B

The Linux Foundation on Wednesday released a white paper that puts the estimated development cost of its Collaborative Projects at US$5 billion.

The Linux Foundation has provided independent funding for its collaborative software projects since 2008 to fuel innovation across industries and ecosystems. More than 500 companies and thousands of developers from around the world contribute to these open source software projects.

The monetary value of the foundation’s R&D efforts did not come into focus until earlier this year, according to Amanda McPherson, chief marketing officer for the Linux Foundation.

“We have been focusing more on hosting the projects, I would say, instead of analyzing them,” she told LinuxInsider.

World-Changing Reach

The white paper, “A $5 Billion Value: Estimating the Total Development Cost of Linux Foundation’s Collaborative Projects,” offers a state-of-the-industry assessment that open source is changing the world in which we live, according to coauthors McPherson and Jeff Licquia, a software engineer at the Linux Foundation.

“Over the last few years every major technology category has been taken over by open source,” said McPherson, “and so much opinion has been shared about the proliferation of open source projects — but not about the value.”

The model for building the world’s most important technologies evolved from the old build-vs.-buy dichotomy, she noted, so it is important to understand the economic value of the current development model. That is one of the primary goals of the report.

Research 101

The report’s findings are based on the Constructive Cost Model, or COCOMO, an algorithmic software cost estimation model that David A. Wheeler applied in a 2002 study. It uses a basic regression formula with parameters derived from historical project data, as well as current and future project characteristics.

Wheeler’s initial study became a well-regarded assessment of the value of a Linux distribution. The Linux Foundation performed a similar assessment in 2008.

The model evaluates the source lines of code (SLOC) in a project and derives the estimated person-years of effort and the associated development cost.
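To make the method concrete, here is a minimal sketch in Python of how a SLOC count is turned into an effort estimate, assuming the standard Basic COCOMO coefficients for an "organic" project (2.4 and 1.05). Run against the report's total line count, it lands very close to the report's 41,192 person-year figure, though the exact settings the report's authors used are an assumption here.

    # Minimal sketch of a Basic COCOMO effort estimate from a SLOC count.
    # The coefficients a=2.4, b=1.05 are the standard Basic COCOMO "organic"
    # values; whether the report used exactly these settings is an assumption.

    def basic_cocomo_effort(sloc, a=2.4, b=1.05):
        """Return the estimated effort in person-years for a given line count."""
        ksloc = sloc / 1000.0
        person_months = a * (ksloc ** b)   # Basic COCOMO effort equation
        return person_months / 12.0

    if __name__ == "__main__":
        # Total SLOC reported for the Collaborative Projects.
        effort = basic_cocomo_effort(115_013_302)
        print(f"Estimated effort: {effort:,.0f} person-years")  # roughly 41,200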

This report is the first attempt to estimate not only the cost of developing the technologies, but also the value they collectively deliver to the industry.

Big Impact

The report lends credence to the undeniable growth and importance of open source in today’s world. A side question is how big open source is now compared to years ago, noted Al Hilwa, program director for software development research at IDC.

“It is absolutely bigger. We are seeing some of the big companies like Microsoft and VMware pivot towards open source. It is a force to be reckoned with,” he told LinuxInsider.

Not a day passes that Hilwa does not see some new DevOps startup building a model out of open source code. “There is enormous activity around open source code today.”

R&D Findings

Using Wheeler’s approach, the report authors reached some key findings:

  • The total number of source code lines present today in the Linux Foundation’s Collaborative Projects is 115,013,302.
  • The estimated total effort required to retrace the steps of collaborative development for these projects is 41,192.25 person-years.
  • In other words, it would take 1,356 developers 30 years to recreate the code bases present in the Linux Foundation’s current Collaborative Projects (a quick arithmetic check follows this list).
  • The total economic value of this work is estimated to be more than $5 billion.
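For readers who want to see how those figures relate to one another, here is a quick back-of-the-envelope check; the per-person-year cost is simply what the report's headline totals imply, not a number quoted by the Foundation.

    # Rough arithmetic check of the findings above (illustrative only).
    person_years = 41_192.25          # estimated effort from the report
    developers = 1_356                # hypothetical team size cited in the report
    total_value = 5_000_000_000       # "more than $5 billion"

    years_needed = person_years / developers       # ~30.4 calendar years
    implied_rate = total_value / person_years      # ~$121,000 per person-year

    print(f"{developers} developers would need about {years_needed:.1f} years")
    print(f"Implied cost per person-year: ${implied_rate:,.0f}")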

“The results meet the Linux Foundation’s expectations. We knew it would be a big number. I should note we were much more conservative in our assumptions than previous analyses by us and others of Linux,” said McPherson.

What’s Included

The current Linux Foundation Collaborative Projects include AllSeen Alliance, Automotive Grade Linux, Cloud Foundry Foundation, Cloud Native Computing Foundation, Code Aurora Forum, Core Infrastructure Initiative and Dronecode.

The list continues: IO Visor, IoTivity, Kinetic Open Storage Project, Let’s Encrypt, Node.js Foundation, Open Container Project and Open Mainframe Project.

More projects: OPNFV, Open Virtualization Alliance, OpenDaylight, openMAMA, R Consortium, Tizen, Xen Project and Yocto Project.

New Collaborative Projects announced this week include ODPi, ON.Labs and the Open API Initiative. Not all projects were included in the analysis, since a number of them only recently became LF Collaborative Projects.

Study’s Impact

The white paper emphasizes how long it would take to recreate the code with a large development team rather than by using open source methods. Open source has passed the point of no return, said McPherson.

“You just won’t see single organizations trying to shoulder the burden of developing complex infrastructure on their own,” she said.

Technical executives may benefit more from the report than developers, as it gives executives who are evaluating their open source strategies fuel for decision-making.

“It shows how valuable the projects are to use as part of a technology strategy,” said McPherson. “It also shows a really interesting progression of open source adoption from people using open source, to open sourcing key technology, to now collaborating together with their peers and competitors on these large scale projects.”

Winning Factors

Software is “transforming industries like transportation and healthcare,” said McPherson. At the same time, the software industry itself is undergoing a massive shift.

Services and speed to market are key. So is managing the complexities inherent in deploying all those billions of code lines.

“Open source is the keystone of both of these shifts. The value shown by our report and the commercial adoption of this code paints a clear and compelling vision of the future for open source,” McPherson said.

Scruffy Days Are Over

“Showing the total economic value of free/libre and open source software helps move from the perception of free software being community theater, and clearly shows it is professional,” said Todd Weaver, CEO of Purism Computer.

“The benefit for software developers is that they can point to cash value for their software released under free licenses,” he told LinuxInsider, “but the largest benefit is to those on the fence about free/libre and open source software, because average users realize that the quality is on par with, or in some cases superior to, the proprietary counterparts.”

Jack M. Germain has been writing about computer technology since the early days of the Apple II and the PC. He still has his original IBM PC-Jr and a few other legacy DOS and Windows boxes. He left shareware programs behind for the open source world of the Linux desktop. He runs several versions of Windows and Linux OSes and often cannot decide whether to grab his tablet, netbook or Android smartphone instead of using his desktop or laptop gear. You can connect with him on Google+.


Why the Real Estate Industry Should Embrace the Cloud

The increased adoption of cloud computing over the past decade has enabled businesses across industries to meet their growing technology needs while efficiently gaining access to exciting new tools.

However, not every industry has kept up with the evolution of cloud technologies brought forth by digital transformation. A prime example is the real estate industry. Overall, the real estate sector has been slow to digitize operations and move to the cloud, leaving agents, brokers and their clients underserved.

Cloud computing can cover a lot of ground, with both infrastructure-as-a-service and software-as-a-service availability. There is great potential for the real estate industry’s future in both areas.

When properly implemented, cloud computing accelerates the innovation and digitization of real estate services, bringing new apps and tools to the market more quickly. This also adds even more value to the buying and selling experience for agents, brokers and consumers alike.

While the cloud offers much potential for the real estate industry, it is important for companies to have an informed idea of what they want to accomplish before moving some or all their IT functions to the cloud. Don’t just jump on the cloud bandwagon; instead, determine what goals you want to achieve by moving to the cloud and develop a plan for an orderly transition.

If a company’s cloud infrastructure ends up looking exactly like its previous on-premises setup, it’s probably not taking advantage of all the benefits the cloud can offer. Real estate companies moving to the cloud need to think strategically about adding value through the transition.

With that caveat, there are tremendous benefits for real estate companies that move to the cloud.

More Data, More Power

A seemingly immense obstacle real estate companies face is the daunting task of implementing cloud-supportive infrastructure. But the truth is that real estate companies don’t have to plan, build, or operate their own data centers.

Instead, the cloud infrastructure providers can set up and maintain the infrastructure while real estate companies focus on what they do best: selling properties, serving customers, and equipping agents and brokers with the best tools to help them do their jobs.

Cloud infrastructure also offers real estate companies the computing power to run modern tools like data analytics and artificial intelligence. These technologies can help real estate companies find new customers, identify people likely to be interested in buying or selling their homes, and match customers to the best real estate agents to service their needs.

Real estate organizations often have access to huge amounts of market and customer data. However, the sheer volume of data makes it difficult to capitalize on. With cloud computing, real estate companies can gain access to the massive computing power needed to crunch the data, while paying only for the time they use that infrastructure.

Mobility and Disaster Recovery Solutions

Another benefit of storing data in the cloud is that it’s accessible from various devices, which is a boon for the growing mobile workforce. Agents, brokers, and home buyers and sellers are increasingly using smartphones and tablets to get work done remotely. The cloud is much more flexible, accessible, and secure than being tethered to a physical hard drive or on-premises server.

Furthermore, companies that transition to the cloud don’t have to build and maintain a remote disaster recovery site, which can be labor-intensive and time-consuming. Instead, critical data in the cloud automatically fails over to a secondary site in the event of a disaster. All that is required to access data in the cloud — anytime, from multiple devices, anywhere — is a solid internet connection.

Budget-Conscious Security

Major cloud infrastructure providers have a security track record that most real estate companies can’t match. They have huge teams of security professionals and the best available security technologies, policies, procedures, and controls to protect the information on their servers and data centers 24/7 with little or no human intervention.

Cloud security measures also support regulatory compliance and establish authentication rules for users and devices. This high level of data security is particularly important in the real estate industry, with customers sharing banking and other personal data during what’s often the largest financial transaction of their lives.

Customers want their real estate transactions to be as secure as possible, and cloud infrastructure providers offer that higher level of protection.

Creating an Open Ecosystem

On the software-as-a-service side, the cloud is the perfect way to host multiple apps and software tools that improve agent and broker productivity. One way to approach this is a real estate app store that offers a range of software, such as CRM tools, lead generation software, open house apps and productivity tools, with everything hosted in the cloud.

This creates an open ecosystem in which agents and brokers have a choice of software tools, including some apps developed in-house and others from third-party partners. Agents and brokers simply decide which apps they want to use from a menu of available options. This provides flexibility while empowering personal choice and customized solutions for home buying, selling and beyond.

Convenience Is the New Normal

The COVID-19 pandemic has forced real estate companies to conduct more business remotely, with documents shared online. Some firms have been moving a greater number of transaction steps to the virtual realm, using cloud-based services to host and gather documents and collect signatures.

While some customers will continue to demand face-to-face contact with agents and brokers, a significant number will embrace the convenience of a mostly online, cloud-based approach.

The industry is already seeing great benefits from cloud computing. Expect many more advantages to reveal themselves as the industry continues to digitize its operations.

Too often, we see that the failure to innovate today equates to playing catch-up tomorrow. The benefits of cloud technologies for real estate services professionals are clear, and the obstacles of price and infrastructure are entirely surmountable.

Business and information technology leaders in this industry must look beyond outdated legacy systems and begin embracing the cloud — now.

Rizwan Akhtar is executive vice president and chief technology officer of business technology at Realogy. Akhtar holds an M.S. in Computer Science from the University of South Asia and an MBA from the University of Phoenix.
