Hi All...
I have a question...
Is it possible to have an operational computer WITHOUT an OS?
If yes, are there any examples? Even 'concept' machines would be helpful as examples. Any suggestions to read up on this subject ('bare computing') would be very helpful.
Thanks,
-
lineS of flight Notebook Virtuoso
-
There was actually something like that. It was a concept. I don't remember the name though. Sorry
-
I don't think so.. Why are you interested anyways?
-
AKAJohnDoe Mime with Tourette's
Electrically, yes; Operationally, no.
-
lineS of flight Notebook Virtuoso
-
lineS of flight Notebook Virtuoso
Edit: Mods if this is off topic, please do the needful and accept my apologies! -
As far as I know, the hardware can interface directly with some applications, but those applications would have to be developed specifically for that hardware (embedded programming), and if you change, upgrade, or swap out some part of the hardware, the interface might suffer. So the OS interprets the information the application wants to send or get, and translates it into hardware signals.
http://computer.howstuffworks.com/operating-system.htm
this is a very quick guide to their specific functions.
sorry if you already know this information, but I think this is the main reason.
if I'm wrong please feel free to correct me. -
What, precisely, is the aim of your enquiry? Any digital computer needs some minimal set of instructions that tell it what to do when it's turned on; otherwise, it's just an intricate lump of silicon that's eating up juice and needlessly enlarging your carbon footprint (to use the current "cutesy" p.c. terminology). The BIOS that every current computer comes with is, in fact, a form of rudimentary operating system that "bootstraps" the computer and gets it off and running. The BIOS is basically a set of instructions hardcoded into chips on the motherboard that are persistent - nonvolatile, unlike RAM - and generally permanent, subject only to being "flashed," i.e., reprogrammed using specialized techniques.
The contender for replacement of the BIOS, which is truly ancient on the IT timescale, is the so-called Extensible Firmware Interface, which is designed to be a software interface between the actual hardware on the system and whatever comes next in the hierarchy - to date, that's the operating system. At bottom, however, EFI doesn't completely replace the venerable BIOS; it just moves most of its non-POST and initial-configuration work into more amenable (and amendable) software. POST and initial config would still be left to something like the current BIOS, as described in this UEFI_FAQs_Q&A_7.
Beyond that, the OS is basically an ur-application that creates an environment within which every other application is executed - the OS, for example, manages the interactions between an application and the underlying physical hardware, albeit indirectly as the OS' interaction with the physical hardware is itself moderated by other firmware (e.g., MS' HAL, or hardware abstraction layer) or the BIOS itself (by means of interrupts that the OS can call to get certain things done, like having the onboard speaker issue a beep).
On that level, most current OSes are fairly monolithic (I use the term more in its vernacular sense and not within the term-of-art meaning it has in the OS development world) inasmuch as the OS presents a fairly unified face to the user. In principle, however, there is no reason why an OS needs to appear as a single, monolithic entity to the user, and could be designed as a core nucleus that picked up or dropped off various modules as needed/wanted by the user. -
To the extent that one of these devices is sophisticated enough to warrant having applications that are not integrated into the underlying OS, then yes, just like in a normal computer, these "applications" would be buffered from the underlying hardware through the mediation of the OS; if the system is sophisticated enough, the OS itself might be further mediated from the physical hardware by some sort of more primitive firmware such as MS' HAL.
The "why" for it is that each individual physical component has its own idiosyncracies, and if one is coding a piece of software that is intended to function on several different types of systems, without any guarantee of the sort of physical hardware that will embody the system, there has to be some way of allowing the coder to take advantage of economies of scale by writing one piece of code that uses standardized means of interacting with the hardware, by relying on the existence of another piece of software that has been coded specifically for the actual set of hardware on which it operates, and which is capable of translating the standardized interactions of the top-level program into the individual idiosyncratic methods used by the particular bits of physical hardware on which the firmware operates. -
AKAJohnDoe Mime with Tourette's
And very few applications these days interface directly with hardware. I last wrote an EXCP (channel program) in the 1970's. -
lineS of flight Notebook Virtuoso
Shyster1...
Thanks! That was excellent!
So, let me see if I understood you correctly.
1. We are basically 'locked' into the UTM paradigm...is this a limitation of the hardware or of the concept, or, cynically speaking, of the production-line system that has been laid down thus far?
2. If the BIOS is a rudimentary form of OS, then it is theoretically possible that we could think of a modular machine comprising a number of hardware units, each with their specific functions, that we can plug together (quite like Lego) and use as per specific applications. Yes? Here I am thinking of the car-model. If this holds then it is possible - again theoretically - to have a 'computer' or a machine that simply connects to the internet? The only non-machine-based application would be the browser? (btw...this is NOT about Chrome!)
3. If the model you described, which is based on the UTM model, is near universal, then how do mission-critical systems work - where a failure of the OS and the inability to correct the problem onsite or even remotely would be disastrous? Do these have, as AKAJohnDoe puts it, 'embedded' firmware?
Thanks again for your detailed answers! -
lineS of flight Notebook Virtuoso
Thanks -
AKAJohnDoe Mime with Tourette's
Take the following examples:
- IF OPEN
- CLI 12(R4),X'80'
- 9580400C
Which would you rather write and read? -
Second, the universal Turing machine is not a "paradigm," it's a theoretical construct for a machine that can be programmed to do any task that can be described in a discrete set of steps or instructions (if I recall my Turing correctly) - the term "universal" refers to the fact that the machine, at least in Turing's theoretical construct, is not limited to a proper subset of programmable activities, but can perform any activity that is programmable - i.e., set forth in a finite set of discrete instructions - not to the fact that the implementation of that conceptual construct is extremely widespread.
We are "locked into" that conceptual construct so long as we continue to use it as the basic blueprint for designing programmable machines - find a different construct, and you'll find a different set of constraints. The fact that the universal Turing machine, as concretized by Von Neumann, forms the basis for all digital computers today is more a matter of historical accident than either inevitability or any of the soft-science isms of which I'm beginning to hear echoes in your posts.
It's a little like why the US uses a 60Hz electrical power grid - some number had to be chosen, and the folks who were there at the time that the choice had to be made, simply made a choice based, largely, on what was easiest for them to do. It was probably also due to the fact that, in terms of viable theoretical constructs for use as a blueprint for developing programmable machines, there weren't many other contenders to Turing's theoretical construct - if basketball's the only game in town, is it any wonder that the only sports stars around play basketball?
Without researching the issue, I am quite confident that, by now, the legions of earnest post-docs in mathematics and computer sciences have developed a number of alternative conceptual constructs to the Turing construct, so if you want an alternate construct, I'd start with the Turing Wiki, and see where the links lead. Whether any of those alternatives could be implemented in physical hardware is anyone's guess. One faint possibility I can think of off the top of my head is the so-called field of quantum computing; assuming the Copenhagen interpretation of quantum theory is correct, technologically feasible quantum computing may make non-Turing constructs technologically possible.
Of course, one could create a device composed of various modules with hardwired code that could be connected together in various ways, such that one could re-create a device capable of supporting an internet browser, but that would be a step back, not a step forward, because that's about all that device could do, so you would lose the universality (or, more properly speaking, quasi-universality) inherent in the current general-purpose digital computer.
You'd also end up with a more expensive device, given that hardware is almost always more expensive than software, because you'd have to devise a new device for every different use, even for different/updated browsers, instead of just running a different set of programmed instructions (i.e., software) on the exact same set of hardware.
The area of mission-critical systems is an entirely different area of development from that of the ordinary consumer-level OS. You might try googling real-time operating systems to see what pops up. Needless to say, the OSes used have real-time access to the data they need and the hardware they need to control, and are rigorously tested to shake out any possible bugs (it being theoretically almost impossible to guarantee that you've gotten every possible bug out). That is one reason why the US Space Shuttle, for example, still flies with guidance computers that were designed in the 1970s, and that are, by modern standards, almost irretrievably backward - the computing power in the average cell-phone probably exceeds the computing power in one of the Shuttle's flight computers.
In addition, for a mission-critical implementation, you don't typically rely on an off-the-shelf, general-purpose, retail solution - you hire a group of very experienced coders and engineers to custom-design a hardware/software setup that is one of a kind, so many of the issues relating to fungible general-purpose computers, such as the consumer models being sold nowadays, never come up. In that case, there's no need to interpose a hardware abstraction layer between the OS and the physical hardware that is designed to handle many different applications, written by many different programmers, using a standardized set of hardware interactions; all you need is an OS that directly interacts with the hardware. That, BTW, is what Microsoft's consumer operating systems did until the advent of, I believe, Win95; prior to that, the OS did, in fact, have pretty direct access to the hardware. That is one reason why many older applications - 16-bit and 8-bit DOS applications, for example - cannot run on current machines: they are coded with the expectation that the application can communicate directly with the hardware (e.g., by using interrupts, which access BIOS functions directly), an expectation that modern versions of Windows no longer support, in part for security reasons, but also in order to allow simultaneous execution of a number of different applications, which requires that the OS essentially act as a sort of ersatz virtual machine. -
lineS of flight Notebook Virtuoso
Hmmm...I will follow your advice on the reading!
Actually, my interest is in defence-related hardware and in that sense is, as you put it, soft-science-istic.
Your lead on 'real-time' operating systems is also very helpful and I will follow up on that.
Again thanks for yet another detailed and patient response. -
AKAJohnDoe Mime with Tourette's
Defense software is unique. For example, the AWACS ran JOVIAL code last I knew. A bit esoteric. Most folks who know much more cannot tell you any more.
Where is your IP address? Where do you live anyway? -
It is esoteric - even some mission-critical civilian systems get esoteric, from what little I've gleaned in my wanderings.
For the bored and the curious, the Space Shuttle flight systems consist of 5 IBM 32-bit general purpose computers based on the Model AP-101; however, in the 1990s that model was upgraded to the AP-101S with the introduction of semiconductor-based memory, which replaced some of the old TTL hardware (check out the pix on that last link).
In terms of software, I'll just quote the Wiki article on the AP-101:
JOVIAL, aka "Jules' Own Version of the International Algebraic Language," was originally written by Jules Schwartz in 1959 to write software for the electronics of military aircraft; it was standardized in 1973 and revised in 1984.
By way of contrast, one of the modern workhorse civilian programming languages is C++, which was originally developed in 1979, formally named "C++" in 1983, and was last revised in 2003, and the next version is currently under active development.
That should give a little more flavor to some of the differences between civilian/consumer systems and mission-critical/real-time/military systems.
For the OP, a draft paper surveying realtime operating systems, written in 1994, can be downloaded in pdf here. -
lineS of flight Notebook Virtuoso
@Shyster1: My specific interest in defence-related hardware is in the domain of sensors, sensor-densification prospects, and swarm-related developments, which, taken together, reflect both the need and the general orientation in recent times to collapse the sensor-to-shooter-to-sensor (ad infinitum) links. Additionally, in the world of sensor tech., remote handling and operation of units is key - as you may already know. But this remoteness is unlike the UAV model. In the context of sensors and swarms, 'recall' systems are (ideally and theoretically) totally dispensed with.
Thanks for the link. It was enlightening.
@AKAJohnDoe: I split my time between the UK and India. Why do you ask?
Thanks to all for your comments and lead. They have been very helpful.
OS-related Question - maybe off-topic!
Discussion in 'Windows OS and Software' started by lineS of flight, Sep 10, 2008.