Binary vs. source code

Programming, for all ages and all languages.
Kevin
Member
Posts: 1071
Joined: Sun Feb 01, 2009 6:11 am
Location: Germany
Contact:

Re: Binary vs. source code

Post by Kevin »

h0bby1 wrote:Well, I'm just exploring the reasons why one would want to release the sources of a product. If it isn't intended for a developer audience but just for users, the interest in having the source is pretty nil. Still, most open source projects are released as source code even when they don't target potential developers, which provides no clear benefit to the user.
[Citation needed]

One of the main points of open source development is to enable other developers to contribute, and I'm pretty sure that's an important goal for most projects.
it depend also kind of open source software you use, but i see people getting into linux often, and you never get something that work 100% with all multimedia devices, printers, scanners, webcams, all functions, and soft who works like a charm like that straight out of the box, it's not rare that it can take a week or more to have everything well set up, and it often involves still having to tweak some things, due to bad compiling environment, bad dependency, because of so many version of the binaries can be potentially present, on closed source software who only release binaries, there is not this problem as all the systems will have a well defined version of binary files, so you know exactly what to expect from the binaries present on the system
Parse error. I'm not sure what this is, but definitely not a single, well-readable sentence.

I'm not sure what you wanted to say, so I'm not sure if this is a good answer to it, but anyway: I'm running Linux on this laptop, mostly just a standard installation with defaults. My printer works fine, the built-in camera too, and all the other hardware I ever tried to use just worked as well. And getting it installed didn't take a whole week, but more like half an hour (plus copying stuff over and configuring software for VPN, email accounts etc.; make it a day). I'm certainly not faster with a certain closed-source OS.

Ten years ago, it was often necessary to edit some configuration files in order to make things work. But still, those were configuration files, not source code. Nobody ever expected me, as a user, to fix the Linux kernel or anything like that.
But yeah, it can be quicker from a developer's perspective to just give away sources with a doxygen doc, instead of really making manuals, testing, error checking and reports, and to rely on third parties to fix bugs or make it work on their system for their particular use from the sources, instead of anticipating every case for the end user; and the sysadmin has less work to do to install and use the software.
There is absolutely no connection between "give away sources with a doxygen doc" and "really making manuals, testing, error checking and report". You can do either of them, you can do both, and you can do neither. Completely independent choices. Some open source projects have great manuals and large automated test suites, and some closed source projects have barely any testing (there was a deadline, sadly...) and crappy documentation.

Your second point, "rely on third party to fix bugs or make it work on their system for their particular use from the sources, instead of anticipating all case for that end user", is a bit different. The problem here is that it's not possible to anticipate the needs of every single potential user in the world. So instead of giving the user the perfect piece of software, you end up telling him: Sorry, doesn't work on your platform. Maybe in the next version.
Developer of tyndur - community OS of Lowlevel (German)
h0bby1
Member
Posts: 240
Joined: Wed Aug 21, 2013 7:08 am

Re: Binary vs. source code

Post by h0bby1 »

Combuster wrote:
h0bby1 wrote:Well, I'm just exploring the reasons why one would want to release the sources of a product. If it isn't intended for a developer audience but just for users, the interest in having the source is pretty nil. Still, most open source projects are released as source code even when they don't target potential developers, which provides no clear benefit to the user.
And yet, the vast majority of all Linux software is downloaded in binary form, rather than source form. By end users. Even if the original developer only released source code. Something somewhere must be going horribly wrong... :-k

In other words, you're arguing based on a problem that doesn't even exist.
In this case, you also need to take into consideration that there are several developer teams involved: for example, the Apache team and the Linux distribution team. The binary package of Apache downloaded from the distribution is not compiled by the Apache developers, but by the developers of the Linux distribution. So if you consider that Apache releases mostly source code, the users of the Apache software are the Linux maintainers who compile it for the final end user, who still couldn't have the binary without a third party outside of the Apache development team. And it adds more work and more testing for the distribution maintainer or a third party, while bringing close to nothing to the end user.
h0bby1
Member
Posts: 240
Joined: Wed Aug 21, 2013 7:08 am

Re: Binary vs. source code

Post by h0bby1 »

Kevin wrote:I'm not sure what you wanted to say, so I'm not sure if this is a good answer to it, but anyway: I'm running Linux on this laptop, mostly just a standard installation with defaults. My printer works fine, the built-in camera too, and all the other hardware I ever tried to use just worked as well. And getting it installed didn't take a whole week, but more like half an hour (plus copying stuff over and configuring software for VPN, email accounts etc.; make it a day). I'm certainly not faster with a certain closed-source OS.

Ten years ago, it was often necessary to edit some configuration files in order to make things work. But still, those were configuration files, not source code. Nobody ever expected me, as a user, to fix the Linux kernel or anything like that.
Ten years ago it was close to hell to run Linux, but even in fairly recent distros there are still many bugs, like copy-pasting not working well or even crashing apps. It happened to me not so long ago: copy-pasting something from a code editor into XChat = crash. And then one would say "thank god I have the source, this way I can fix it and post a 1000th patch to GTK", but it would still be better if there were no need for source to be distributed anywhere at all, because everything would work with what is released directly by the developers, without any third party being involved at all.
Kevin wrote: There is absolutely no connection between "give away sources with a doxygen doc" and "really making manuals, testing, error checking and report". You can do either of them, you can do both, and you can do neither. Completely independent choices. Some open source projects have great manuals and large automated test suites, and some closed source projects have barely any testing (there was a deadline, sadly...) and crappy documentation.
If you don't want to write manuals and documentation for the software in a proper way, it can still be usable with a doxygen doc + source code. And the need for source code is limited if the software is well documented and well tested, with good error reporting; even for debugging or beta testing, or if you want to use the software unmodified as part of a development project while still having debugging/error-tracking capacity.
Kevin wrote:Your second point, "rely on third party to fix bugs or make it work on their system for their particular use from the sources, instead of anticipating all case for that end user", is a bit different. The problem here is that it's not possible to anticipate the needs of every single potential user in the world. So instead of giving the user the perfect piece of software, you end up telling him: Sorry, doesn't work on your platform. Maybe in the next version.
Well, if you can't anticipate users' needs and be more efficient than the users themselves at making software that matches those needs, you'd be better off doing something other than development.

But yeah, in the case where you really release sources to speed up development and increase the speed of bug fixes, it can be useful, though you then also need to keep track of all submitted patches. But then it's software that is a "work in progress", "in development", and in that case there is no point releasing binaries or anything for the end user. With open source software, though, things are always a bit in the middle: the software is almost constantly a "work in progress" and needs a third-party middleman to make it really usable by an end user, like the distribution maintainers who have to compile a certain version of Apache in the right system context, instead of the Apache dev team releasing ready-to-use packages directly to the end user, so that end users would not systematically rely on distribution maintainers to get their software compiled to a usable state in their distro.
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance
Contact:

Re: Binary vs. source code

Post by Combuster »

h0bby1 wrote:ten years ago it was close to hell to run linux
What was the name and version of the first linux distro you installed yourself?
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]
iansjack
Member
Posts: 4706
Joined: Sat Mar 31, 2012 3:07 am
Location: Chichester, UK

Re: Binary vs. source code

Post by iansjack »

ten years ago it was close to hell to run linux
That is absolutely untrue. Ten years ago Linux was already a mature product. I was using it over 15 years ago to run an internal DNS server for an organization of about 1,000 employees. That was a version (I forget the version number) of SuSE Linux, and the servers ran 24/7, 365 days a year.
Thomas
Member
Posts: 281
Joined: Thu Jun 04, 2009 11:12 pm

Re: Binary vs. source code

Post by Thomas »

bluemoon wrote:
Thomas wrote:you can pretty much figure out what is going on from the crash dump. I knew engineers who could reconstruct the source code almost instantly just by looking at the dump.
With a decent crash dump, a developer with the same version of the binary and debug info can relaunch the application in the debugger right at the fault point and stare at the break point from the source window.

However, sometimes the breakpoint / crash point may be very far away from the real bug.
That is what someone like me would do; I am not as great as some of the original project designers and engineers. They probably know the source too well. But you can infer a lot more if the current stack is not corrupted and if you know the compiler-generated code well enough. (It greatly helps to know the application's data structures, at least at a high level.)
--Thomas
h0bby1
Member
Posts: 240
Joined: Wed Aug 21, 2013 7:08 am

Re: Binary vs. source code

Post by h0bby1 »

iansjack wrote:
ten years ago it was close to hell to run linux
That is absolutely untrue. Ten years ago Linux was already a mature product. I was using it over 15 years ago to run an internal DNS server for an organization of about 1,000 employees. That was a version (I forget the version number) of SuSE Linux, and the servers ran 24/7, 365 days a year.
When I started to use Linux in '98, there was no FAT32 support, many things were not really good, and there were close to zero applications. For some simple server application it was probably fine if you were a sysadmin who could install and configure it, but Windows was not that great at that time either.

Ten years ago I guess it was rather OK, but not for everyday home use with multimedia applications, like video editing/playing, a good equivalent to Photoshop and 3ds Max, good support for recent graphics hardware, support for commercial games, and that sort of thing. Ten years ago, getting all of this working on a Linux station was not trivial.
h0bby1
Member
Posts: 240
Joined: Wed Aug 21, 2013 7:08 am

Re: Binary vs. source code

Post by h0bby1 »

Combuster wrote:
h0bby1 wrote:ten years ago it was close to hell to run linux
What was the name and version of the first linux distro you installed yourself?
I got a Red Hat and a Slackware CD somewhere in '98; I don't remember the versions. =) The guys who gave them to me had laptops running Debian, but I didn't know anything about Linux at that time; it was even before I had ADSL. I installed them to test OpenGL and C, because Windows was not really great either before Win2K, so I used Linux for a period, and also for web services, but I don't use it as a desktop anymore.
h0bby1
Member
Posts: 240
Joined: Wed Aug 21, 2013 7:08 am

Re: Binary vs. source code

Post by h0bby1 »

What I consider problematic in the way open source software is distributed is especially the step involving the "configure" script. It's a step that is required for the application to run on the target system, it involves a complex process in itself, and it makes it harder to know exactly what is being installed on the client system, because the compilation process depends on local system parameters, including the gcc/libc version. It's mostly due to this configure/build process needed to make the software usable on the target system that they release the software as source, more than purely for the sake of developers.

And this configure step is mostly about figuring out which version of which library and/or header file is installed on the system, so that the program is compiled against the same version of the API used on the system, because it's otherwise hard to figure out exactly the version of the API at runtime under Linux. Under Windows, applications can figure out at runtime which versions of which interfaces are present on the system with the COM system, and they ship their own version of the libc. But an open source system could also be developed with a COM-like system; it would just need everyone to implement the interfaces using the same standard definitions, and then a binary could run directly on any Unix system exposing the standard set of interfaces, without needing any kind of configure/build step.
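The kind of check a generated configure script runs can be sketched in a few lines of shell. This is a hypothetical, stripped-down probe (real autoconf-generated scripts chain hundreds of these and cache the results); it tests whether a header is usable by attempting a compile with it:

```shell
#!/bin/sh
# One configure-style probe: write a tiny test program that uses the
# header in question, then see whether the system compiler accepts it.
cat > conftest.c <<'EOF'
#include <stdio.h>
int main(void) { return 0; }
EOF

# $CC may be set by the user; fall back to the system's cc.
if ${CC:-cc} conftest.c -o conftest 2>/dev/null; then
    result=yes
else
    result=no
fi
echo "checking for stdio.h... $result"
rm -f conftest conftest.c
```

Each such probe bakes a fact about the build machine into the resulting build, which is exactly why the same source tree can produce subtly different binaries on different systems.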

If you take, let's say, Apache and Debian: Apache is distributed in binary form in Debian, but are you really sure that Debian didn't modify anything at all in the Apache source code or the build/configure files? What guarantee or support can Apache really give about those binaries? Who can give any support for those binaries at all? The distribution maintainer? Not even really. That's what is a bit confusing about open source releases: the final user who uses Apache under Debian is not using an "Apache product" but a "Debian product", and so what he has is a Debian version of Apache, a product of the Debian developer team.

Compare that to the Apache team at some point focusing on a particular version of the source, totally freezing it, making that exact version the standard with the set of features it supports, and then producing binaries of this version for every configuration, distributing a set of "official binaries" for each platform. It would be much easier to figure out exactly what is on the system at runtime if all the binaries were distributed with a unique or clearly identifiable version, including the configuration used to compile them. With open source software, the configuration used to compile it is generated by the target system, and the purpose of releasing source is that it's not easy to figure out the installed versions of the system APIs at runtime.
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance
Contact:

Re: Binary vs. source code

Post by Combuster »

h0bby1 wrote:ten years ago it was close to hell to run linux
h0bby1 wrote:windows was not really great either before win2K, so i used linux for a period
There goes your own argument down the drain :wink:
h0bby1 wrote:i don't use it anymore
So you missed a good decade of Linux while you did see the progress of Windows. That's... good to know.
h0bby1 wrote:configure (...) because it's otherwise hard to figure out exactly the version of the api at runtime under linux
And the only reason you do that under Windows is to simply include the Windows version and service pack in your log files. You still build your app against one preselected version of the API. There are no magic upgrades from DirectX 7 to DirectX 10 simply because you run the latest version, and neither do you get such updates on Linux. Therefore, what you claim can't be the real purpose of configure.
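The "built against one preselected version of the API" point can be demonstrated with a small shell sketch (hypothetical example; it assumes a C compiler is installed and uses the glibc-specific `__GLIBC__` macros, which name the libc headers present at build time, not whatever libc the binary later runs against):

```shell
#!/bin/sh
# The libc API version a program sees is fixed at compile time via
# preprocessor macros; running on a newer system does not change it.
cat > vers.c <<'EOF'
#include <stdio.h>
int main(void) {
#ifdef __GLIBC__
    /* the version of the glibc headers used at *build* time */
    printf("built against glibc %d.%d\n", __GLIBC__, __GLIBC_MINOR__);
#else
    printf("not built against glibc\n");
#endif
    return 0;
}
EOF
if ${CC:-cc} vers.c -o vers 2>/dev/null; then
    out=$(./vers)
else
    out="no C compiler found"
fi
echo "$out"
rm -f vers vers.c
```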
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]
iansjack
Member
Posts: 4706
Joined: Sat Mar 31, 2012 3:07 am
Location: Chichester, UK

Re: Binary vs. source code

Post by iansjack »

when i started to use linux in 98, there was no fat 32
So you're saying that, two years after it was developed, FAT32 still wasn't supported in Linux (although I'm not sure of the relevance of FAT32 to Linux) and that made Linux "close to hell"? I guess that means that NT 4.51 still wasn't ready for the big time. :shock:

Edit: It doesn't really matter, but I thought that I'd check. FAT32 support was introduced in Linux 2.0.34 - in June 1998. It wasn't introduced in Windows NT until December 1999.