Just out of curiosity: there are so many Linux distributions which are updated yearly, or even every 6 months, which only means the OS is still evolving. When will Linux be finalised, and what are Linux's objectives as an end result?
-
Secondly, the objectives of Linux, like those of all other operating systems, are to improve usability, performance, stability, security, and compatibility, and to remain a viable platform for emerging applications. Asking when Linux will be finalized is like asking when technology will be finalized. Perhaps the kernel will become obsolete for whatever reason in the future, but that is an unlikely outcome. As new technologies and programs emerge, Linux will just keep plodding on as a good operating system, just like its competition. -
The Fire Snake Notebook Virtuoso
No OS (Windows or otherwise) is finalized until it is completely out of support, and that's not something you want. There are always new devices coming out that need drivers, mistakes that need to be corrected, and bugs that need to be fixed. If you are comparing Linux to Windows, think about all the updates and service packs that are constantly being pushed out. It's the same concept in Linux...
-
Linux no longer uses the old even/odd version-numbering scheme. Previously, even minor numbers (2.2, 2.4, 2.6) marked stable series and odd minor numbers (2.1, 2.3, 2.5) marked development series.
Starting with 2.6, Linus decided to simply keep making 2.6.* releases indefinitely, with each release adding one or two significant features. He has no plans to go to 2.7/2.8 any time soon. -
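Not from the thread, just a throwaway illustration of the old convention: a few lines of shell that classify a version string by whether its minor number is even or odd.

```sh
# Purely illustrative: classify kernel versions under the pre-2.6
# even/odd convention described above.
for v in 2.2.26 2.3.51 2.4.37 2.5.75 2.6.27; do
  minor=$(echo "$v" | cut -d. -f2)   # second dotted field: 2.4.37 -> 4
  if [ $((minor % 2)) -eq 0 ]; then
    echo "$v: stable series"
  else
    echo "$v: development series"
  fi
done
```

Under the post-2.6 scheme this distinction no longer means anything; every 2.6.* release is meant to be stable.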
It is finished; the journey is the destination, in this case (at least, that's my impression of it).
-
Here's a dumb question that I've never really understood.
When a new release of a Linux distro comes out, does one have to reinstall, or is it possible to keep upgrading? -
Well, it depends.
And even when it's possible, it's messy and not recommended. -
It depends on the distribution. Rolling-release distros like Arch, Gentoo, Debian, and some others have no problem simply upgrading periodically and always staying up to date. Debian does require a special dist-upgrade option with apt, but it's not much beyond a simple upgrade (a minimal example is sketched after this post).
Then there are distributions that follow a more traditional release model, like Slackware, Fedora, and Mandriva. Even when upgrade mechanisms do exist for them, it's typically not a good idea to use them.
I've gotten the impression that Ubuntu falls somewhere in between. Upgrades typically go well, but things do go wrong at times, especially on heavily customized systems and on those which have been upgraded previously. -
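For anyone curious what the Debian route actually looks like, here's a minimal sketch, assuming a Debian-style system with apt (drop sudo and run as root if your box has no sudo); treat it as illustrative, not a full upgrade recipe:

```sh
sudo apt-get update          # refresh the package lists
sudo apt-get upgrade         # upgrade packages within the current release
sudo apt-get dist-upgrade    # additionally allow the dependency changes a release jump needs
```

In practice you'd read the release notes first, since big version jumps occasionally need manual intervention.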
Debian Testing, that is; I don't think Debian Stable would work too well for that.
-
That's kinda like asking what the final form of "car" is.
-
Or so that was the impression I got when Nick was teaching young Dan(iel-son) aka "Jack."
Wait, is that what you said? "The journey IS the destination?"
Even further off topic - "What's your dirt doing in my hole, boy?" -
-
It just appears a bit strange that Linux is updated so often, which seems to suggest it is unfinished; otherwise, why would they need to update so regularly?
I am not a computer expert, so sorry if my views are a bit simplistic, but it appears that Linux is trying to catch up with modern hardware and software standards. Is that a fair statement?
Also, I don't really understand why there are KDE, GNOME, and XFCE differences. Going by the websites' descriptions, it appears that XFCE is the one that prides itself on being efficient.
Looking forward to your opinions, as I want to learn more about Linux. -
TL;DR: Choice is always good, and there can never be too much of it. -
ALLurGroceries Vegan Vermin Super Moderator
Linux is an operating system kernel, not an entire operating system. It is developed 24/7 by people all over the world.
Then there is operating system software which runs on top of Linux, such as the GNU utilities and toolchain. These tools get bundled with a version of the kernel and are made into distributions.
Distributions are what most people are really (if incorrectly) referring to when they talk about 'Linux': packages of the Linux kernel together with the GNU utilities and other free/non-free software specific to whatever function that particular distribution fulfills.
The reason that everything is always being released in new versions is that each piece of the pie, from the Linux kernel, to the GNU utilities and toolchain, to the distributions, is being constantly updated and revised. This is just the nature of GNU/Linux in particular, and of free software ("open source") in general.
What you seem to be doing is confusing Free Software, which is developed in an open and collaborative effort, with shrink-wrapped products which are entirely proprietary and thus conform to a product release cycle based on planned obsolescence. -
Just my two cents' worth - sorry for butting in if that offends anyone. -
ALLurGroceries Vegan Vermin Super Moderator
Also, distributions put out security patch releases far more often than there are Patch Tuesdays.
So either I'm not getting your point, or you're wrong. -
Also, a distinction should be made between release candidates, which MS generally does not make available to the general public willy-nilly, and the production release of the ready-for-prime-time version of an OS. As such, the frequent Linux RC releases should not be counted against the RTM releases of the various Windows OSes. -
ALLurGroceries Vegan Vermin Super Moderator
Thanks for clarifying your statement, but I'm afraid you are mistaken; the release cycle for Windows 7 does not even approach that of Linux, or of free software in general. A corporation cannot hope to attain the economic scale that actively developed software produced under open source licenses does. -
-
ALLurGroceries Vegan Vermin Super Moderator
Here are two quotes which prove my point:
http://en.wikipedia.org/wiki/Source_lines_of_code -
-
ALLurGroceries Vegan Vermin Super Moderator
What we are talking about is why Linux gets developed faster than Windows. It's because there are more people working on it, since it is decentralized and not under the control of one corporation, and thus has more 'economic scale'. You can look up economic scale and read about it if you are confused about the term... I am trying to (RESPECTFULLY... not trying to flame here) show you why your original point was wrong, because I don't want you to mislead others.
Edit: In the proprietary software industry the scarcity is concentrated in man-hours, which is unfortunately the metric used to gauge the size and importance of software projects... and what people are paid for as employees (their time, or man-hours). In open source, the scarcity is in the quality and functionality of the end product. Thus man-hours are not the metric we look to for quality, as is well documented in the book 'The Mythical Man-Month'. Open source removes the limits of man-hours to achieve better scale and allows a more concentrated development effort based on quality (the only competition is other open source projects) rather than sales volume. That is why it is developed faster, and why I care to make this point... it's an important reason to use free (as in freedom) software. -
Regrettably, notwithstanding the inferred number of people working on the linux project that you draw from the studies you cite, there is no way to adequately determine what the relative economies of scale are for the linux project as a whole and Microsoft, in large part because there is no way to adequately quantify either the economic costs associated with the inputs (time and labor) that go into the linux project, or the economic returns realized by the linux project as a whole. Arguing largely from the a priori definition of "economies of scale" gets you nowhere, because that definition does not provide a deterministic relationship between the number of bodies working on a project and the economies of scale achieved thereby.
To give a simple example, one steam shovel (actually, the modern descendants thereof), such as Big Brutus, with a single operator, a small team of maintainers, and a limited design and construction team, can move more earth than an army of men with shovels could ever hope to move. In that case, the economics of the situation favor the smaller team rather than the larger team because the smaller team has managed to leverage its skills and abilities to a greater degree than the army of men with shovels has.
If all you're trying to point out is that there are a lot more people working on Linux than there are working on, say, Vista, fine, no argument there. However, your implicit assumption that a corporation could never round up the same, or a greater, number of people to do the same thing is neither here nor there.
In the first instance, the legal form of organization of an enterprise, whether it be the development of Windows within a corporate entity, or the development of Linux within a non-legal aggregation of individuals and institutions, has no bearing on the degree to which any particular organization can round up a particular number of people to do a particular activity in a meaningful, useful way.
In point of fact, that neither Microsoft, nor any other software developer that has to sing for its supper (i.e., make a profit on the products it creates), employs as many people as the studies you cite infer to be contributing to the Linux kernel is more an indication that a development project on the scale (note, not the "economic scale" - a different kettle of fish) of the Linux project is not an economically efficient use of resources. If it were, then making the investment needed to round up that many bodies would be economical - i.e., the present value of the total return on the investment would equal or exceed the current cost of the investment - and some profit-seeking organization would have made that investment.
Thus, while the Linux project may have a greater size than, say, the Windows project, and thus, to a certain extent, a greater scale than the Windows project, that fact alone does not justify the conclusion that it has a greater economic scale than the Windows project.
At any rate, this is all in good fun, but you're clearly in a mood for a fight - thank you for being so restrained so far - so I think I'll leave it at that, and abandon my arguments to their ill-deserved fate. -
ALLurGroceries Vegan Vermin Super Moderator
I am not in the mood for a fight, but you do have me thinking. I value your opinion and viewpoint and I'm interested in what we are discussing... Again, I'm not trying to flame.
You have a good point: if a corporation the size of all the developers of an open source project focused on a single product, there is a good chance that they'd be able to make a product of similar economic scale (and economic scale is just the underlying concept of 'economies of scale', so yes, that is what I do mean). The problem is that there is no way to pay that many people and turn a profit, which is at the heart of my argument. At least we're down to a discussion that makes sense, because when you said earlier that Windows had a comparable development cycle, that made no sense to me.
Edit: On your point about the metrics being unquantifiable, see my edit to post #26, and also realize that this is the reason I pasted in the two quotes from the studies earlier, to give context... man-hours should not be used as an input, but doing so is industry-standard practice, so it gets really dirty arguing with the way things are currently measured and then trying to get relevant data... -
ALLurGroceries Vegan Vermin Super Moderator
Also, I can illustrate it this way. You have X many people at Microsoft working on one product, in one division of the corporation, that can be called a somewhat self-encompassing unit. You have X many people working on an open source product in the same market segment, directly competing on functionality. After the initial period of heavy development, during which the costs could theoretically be equal under the right circumstances (as in your point, which I noted above), the cost per unit of software goes down for open source versus proprietary because of the need for maintenance, patching, and distribution of new binaries. The cost goes down not only because there are more people working on it after the release (scale in the general sense), since the source code is available and bugs can be found by anyone, but also because there is a lower aggregate cost per unit (economies of scale) due to reduced development overhead (man-hours are not a constraint) and the nature of open source competition (which I explained in my edit to post #26).
Edit: Also, in the case where the initial development costs were equal among an equal number of equally capable software engineers, the Microserfs** would truly be paid like serfs and the open source developers would need to be independently rich and frugal, so neither is ideal or even reasonably realistic; but for the sake of argument, you have pointed out a very important situation I had not yet considered! Though I think I have proven my point logically now, somewhat with your help. Cheers.
**that is a book reference, not a derisive statement about Microsoft developers... for the record I do my fair share of Win32 coding and feel their pain
Edit #2: Also on the subject of Microserfs, there is the emotional attachment that people feel to their own work, which is somewhat absent from proprietary software/employer relationships; in the open source world there are rarities like OpenBSD***, but in general people try to get along and keep coding. This attachment plays heavily into 'quality', as the whole reward of open source has to do with identification with your code.
*** http://en.wikipedia.org/wiki/OpenBSD#History_and_popularity -
ALLurGroceries Vegan Vermin Super Moderator
Also, I'd like to make a more general on-topic point: the 'final form of Linux' idea can maybe be equated with the yet-to-materialize 'year of the Linux desktop'... any slashdotters in here???
I hate to draw parallels like this because I don't sit on any side of the OS fence (I operate at a much lower level, so to speak, lol), but what seems to be needed is the Windows 95 release for Linux... and that seems like an insurmountable challenge. Rolling Stones royalties for advertising notwithstanding.**
** See: http://en.wikipedia.org/wiki/Windows_95#Final -
The only economic argument I can see going for open source is that it eliminates the need to constantly reinvent the wheel. I find all other economic arguments unconvincing.
-
ALLurGroceries Vegan Vermin Super Moderator
** http://en.wikipedia.org/wiki/Transmeta
Notice their transition from a physical product-based revenue model to an intellectual property licensing one... and their lawsuit with Intel. They faced massive restructuring when their business plan failed, Linus walked, and they had to generate revenue under the load of all their VC debt. This is the nature of capitalism; while it is neither right nor wrong, there are ways to escape its clutches, and open source goes about as far in that regard as you can. -
-
ALLurGroceries Vegan Vermin Super Moderator
Yeah, but I am not debating that point; man-hours are just one measure of cost, and while they are the industry standard, they are not ideal. The point I am making is that there is less aggregate cost per unit for open source versus proprietary, whether we factor in man-hours as the main input or not. I quoted a few studies on the last page.
http://en.wikipedia.org/wiki/The_Mythical_Man-Month
Edit: There are two separate reasons that man-hours are not ideal: the technical one described in the article above, and also that true cost cannot be calculated without other factors, such as advertising, distribution, etc... the former being a disadvantage and the latter an advantage of open source.
DoublEdit: Open source also limits marginal cost... this is why distribution is one of its advantages... it is cheaper to legally borrow your buddy's Linux CD, for example, than to go to the store and buy another copy of Windows for your almost-worth-it old PC. This is not the same argument I am trying to make above, but it's similar. There is a feedback loop between the marginal cost (cost of another unit) and the economic scale (price per unit) of the codebase, since more users get turned on to the software and ideally end up contributing good code. Obviously, as you pointed out earlier, it is easier to hack than to write solid, efficient code, so this is highly idealized.
This paper really says it better than I can:
http://www.tcllab.org/virach/paper/virach/JCSSE2008-oss-virach.pdf#page=3 -
This debate has turned into something complicated...
I am surprised that Lemur made the point about developers abandoning projects after finding 'real jobs'. I thought the Linux community consisted of experts with 'real jobs' who want to contribute to a worthy project and view it as a hobby.
I don't know much about the economic theory of economies of scale, but I believe there are limitations to the concept in this scenario.
Firstly, only one person can write a particular piece of code at any one time. I understand that a project can be split between many people, but it is not usual to have two developers writing the same code at the same time.
Secondly, developers need to find time in their ordinary lives to make contributions. They may be able to rotate within their teams, but the same argument applies: someone needs to have free time in order to contribute.
Thirdly, in order to be part of the team they require the necessary expertise to write the particular code, as already mentioned by someone.
In my opinion these factors limit the theory of economies of scale, as more developers do not always translate into faster or better coding. Everything comes down to the time contributed to the code and the availability of experts. Therefore, I believe that large corporations which have experts working on their projects for many hours a day are hardly disadvantaged, if at all, when compared with the Linux community, even if it is huge.
I do believe that something like Linux is an amazing achievement and will one day find equal status with Windows. -
ALLurGroceries Vegan Vermin Super Moderator
Hey, please check out this paper (published May 9, 2008); it pretty much sums up my posts with regard to economies of scale in open source. The author, Virach Sornlertlamvanich, was awarded "The Most Outstanding Researcher of the Year 2003 (Information Technology and Communication)" by The National Research Council of Thailand.
http://www.tcllab.org/virach/paper/virach/JCSSE2008-oss-virach.pdf#page=3
Edit: here's the google cache if that doesn't work
http://64.233.169.132/search?q=cache:Obu_xEipLYUJ:www.tcllab.org/virach/paper/virach/JCSSE2008-oss-virach.pdf+JCSSE2008-oss-virach.pdf&hl=en&ct=clnk&cd=1&gl=us&client=firefox-a
-
I'm just gonna inject a bit here: it seems that lots of people are talking about the economics of software without mentioning that the marginal cost of software is essentially zero. That is why Linux and open source succeed, and why all other shrink-wrap software sales depend on artificial scarcity in this age of the internet. It also changes the fundamental nature of open-source development, because rapid communication and cost-free duplication of work make it easy, and even a good idea, to push changes out quickly. It has nothing to do with economies of scale; it has everything to do with the fact that the economy is working at zero marginal cost, so it doesn't NEED scale to make things cheap to duplicate.
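To put the zero-marginal-cost point in textbook terms (my own back-of-the-envelope sketch, not anything the posters above wrote): with a fixed development cost F, a per-copy marginal cost c, and n copies distributed, the average cost per copy works out as follows.

```latex
% Average cost per copy: fixed cost spread over n copies, plus marginal cost c.
% For internet-distributed software c is essentially 0, so AC(n) = F/n,
% which falls toward 0 as n grows -- no economies of scale required.
\[
  AC(n) = \frac{F}{n} + c
  \quad\xrightarrow{\;c \approx 0\;}\quad
  AC(n) = \frac{F}{n} \longrightarrow 0
  \quad \text{as } n \to \infty
\]
```

Physical goods keep a floor under AC(n) because c stays positive; software copied over the network doesn't.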
-
ALLurGroceries Vegan Vermin Super Moderator
-
I did read the paper. But the author made no mention of the distribution costs that affect software in general. He's still writing about it in "classical" economic terms. I'm sure he's well respected, but even Alan Greenspan makes mistakes. The economics of software change completely because of a divide-by-zero error, basically. His arguments on the supply side are correct, but he's only got the mc part of E=mc².
-
ALLurGroceries Vegan Vermin Super Moderator
Windows is still distributed as DVDs in a retail box, so unless you're talking about pirated software, "classical economics" still holds until they change their primary method of distribution. I hear your point, but it's way ahead of the curve. That's not to say it's wrong; I just can't argue a point on those terms, because it's uncharted territory... Proprietary software in general is still distributed on physical media with a COA and packaging.
-
It is. But more and more people are moving to services like Steam, or downloading Linux. And internet copying is driving out the disc-copying businesses, the old-school pirates of both media and software. It's not an overnight change, but people are realizing that they can get things like Linux and OpenOffice, then look at how much they pay for Windows and Office, and ask "why?". It's actually happening a lot in business. Oracle, us, a lot of big vendors recommend using Linux on the server simply because it runs faster and works better than Windows.
-
ALLurGroceries Vegan Vermin Super Moderator
Yeah, that is very true. I am very interested to see how the economics play out with distribution, and I take your point 100%, but it's not yet happening on an industry-wide scale where we can debate the supply-side benefits of open source vs. not on the same kind of terms.
-
BTW, the discussion at that point was about the virtues of open source in general, not just Linux. The Linux kernel itself does receive a good deal of contributions from people whose job it is to improve the kernel, no doubt about that. So it is a stellar example of how 'real jobs' and open source (or preferably free software) are not, in an absolute sense, mutually exclusive. Still, for the majority it is hard to combine the two successfully, even as a hobby. The group of people who can do it are just the tip of the iceberg. -
ALLurGroceries Vegan Vermin Super Moderator
Yeah, they are a constraint, but man-hours are not the issue... you're missing the point. They are one part of the equation, which also includes IP/licensing costs, advertising, distribution, etc...
-
ALLurGroceries Vegan Vermin Super Moderator
Read that paper though, it's full of good info.
Edit: for your convenience, here is a link to the post with the article and a tasty quote:
http://forum.notebookreview.com/showpost.php?p=4230939&postcount=36 -
ah hell, let's just go commercial and get away from MS... I'll pony up 39 bux for my distro of choice... LOL