From: Rocky Bernstein
Subject: Re: [Libcdio-devel] Pete Batard's Changes (MinGW, MSVC, UDF, Joliet, header ...) merged in. Leon's CD-Text changes coming
Date: Sun, 11 Mar 2012 22:08:31 -0400

This is getting long. I'll try to be brief.

A lot of the concerns you have seem to be for people writing applications
(as opposed to end users), specifically those that use libcdio via MSVC and
MinGW. So although we want to address this concern, it strikes me as a
concern of very few people.

Linking using -llibcdio alone versus -lwinmm -llibcdio is apparently important
to you. On platforms other than MSVC and MinGW, one typically uses pkg-config,
which tells you what libraries and options to use. Although I think it is
pretty rare that someone would have to issue a link command by hand rather
than through some sort of build system, since this is of importance to you and
you are willing to do the work, I'd say let's go for it. But yes, I'd still
like the boilerplate stuff split out separately in the
lib/drivers/MSWindows directory. We'll make sure to add comments explaining
why the added complexity is there.

I think it is okay (if it doesn't complicate things further) to put the
platform information in another header, so that version.h can be constant
across all platforms and distributed in the tarball. But so as to reduce
changes to source code, version.h should #include platform.h. On MSVC you
should create platform.h, as that will not be distributed in the tarball.

Thanks for trying the test with the #include of cdio/disc.h and cdio/mswin.h.
Given the problems, I guess adding LIBCDIO_DLL_DECLSPEC is the best of the
bad possibilities. I'd ask, however, that it be put on a separate line; to my
aesthetics, this is better. For example:

    LIBCDIO_DLL_DECLSPEC
    extern const char *discmode2str[];

instead of:

    LIBCDIO_DLL_DECLSPEC extern const char *discmode2str[];
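
For what it's worth, here is a minimal sketch of how I imagine such a macro
being defined (the guard and macro names here are illustrative, not a proposal
for the final naming):

    /* hypothetical definition; names and conditions are illustrative */
    #if defined(_WIN32) && defined(LIBCDIO_DLL)
    #  ifdef LIBCDIO_EXPORTS              /* defined while building the DLL */
    #    define LIBCDIO_DLL_DECLSPEC __declspec(dllexport)
    #  else                               /* consumers importing from the DLL */
    #    define LIBCDIO_DLL_DECLSPEC __declspec(dllimport)
    #  endif
    #else
    #  define LIBCDIO_DLL_DECLSPEC        /* expands to nothing elsewhere */
    #endif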

As for getters and setters, it looks like there is already a fair bit to do
for another release, so it is best to put that off for now.

If I've missed addressing any concerns, just ask.


On Sun, Mar 11, 2012 at 7:40 PM, Pete Batard <address@hidden> wrote:

> On 2012.03.11 01:14, Rocky Bernstein wrote:
>
>> In all cases, considering that we only use 2 calls for winmm, we might as
>>> well remove the need to specify that library during compilation.
>>>
>>
>> I still don't understand the importance of this.
>>
>
> Very simple. A good library is one that saves time for its users.
> In fact, this is what I see as the prime rule of good software.
>
> No matter how much you try to document it, and how red you flag it, you
> can count on people trying to use libcdio in their app by simply linking
> with -llibcdio alone and missing the winmm requirement. And when they do,
> they will get unresolved symbols that they have to dig out, and it may or
> may not be easy to trace those back to winmm...
>
> Thus, by not including a tried-and-tested code add-on that takes 15 minutes,
> you are going to cause a lot more than 15 minutes of time loss for the
> collective of developers. And as time goes by, the odds are certainly skewed
> towards the developer collective losing much more time than what you think
> you saved for yourself and your own developers by keeping the winmm
> requirement. Therefore, I see spending a bit of time there as a way both to
> save time for libcdio users and, because the library has been made simpler
> for them to use, to make them see it in a more positive light.
>
>
>  Don't MinGW and cygwin
>> normally have winmm.lib around?  Is it unlikely they will *not* be around?
>> Same question for MSVC.
>>
>
> They have it, but this is not the issue. The issue is that linking a
> non-multimedia app to the Windows multimedia lib, because libcdio physical
> disc access requires it for 2 calls, is not obvious at all and is going to
> cost people time that we can very easily spare them. If it were more than
> 2 calls, that would be a different story, but really, with this change, we
> can make any libcdio-based app on Windows need only -llibcdio to compile.
> That's pretty neat.
>
>
>  Here is a perhaps imperfect analogy that expresses my thought here.
>> Suppose it were found that the paranoia library uses the mathematical sine
>> function in a couple of places. One could write a sine() function and
>> avoid the math library.
>>
>
> But we're not writing our own sine() equivalent here.
>
> We're writing our equivalent of the code, found in the import lib, that
> allows us to use the sine() from math.dll. And this code is actually very
> simple: get a handle to the DLL, look up the entry point, and use that
> entry point whenever we need to call sine(). That's really all there is
> to it.
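>
> To make the analogy concrete, here is roughly what that boilerplate looks
> like (purely illustrative, using the imaginary math.dll and sine() from your
> example):
>
>     #include <windows.h>
>
>     /* illustrative only: bind to sine() from an imaginary math.dll at
>      * runtime, instead of linking against its import lib */
>     typedef double (__cdecl *sine_fn)(double);
>
>     static double call_sine(double x)
>     {
>         static HMODULE dll = NULL;
>         static sine_fn sine = NULL;
>
>         if (sine == NULL) {
>             if (dll == NULL)
>                 dll = LoadLibraryA("math.dll");
>             if (dll != NULL)
>                 sine = (sine_fn)GetProcAddress(dll, "sine");
>         }
>         return (sine != NULL) ? sine(x) : 0.0;
>     }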
>
> A more accurate analogy would be: suppose libcdio uses the boost library
> for 2 function calls that themselves leverage calls that exist in a native
> (non boost) OS library. Instead of requiring everybody to link against
> boost, because of these 2 calls, we know we can reuse common prior art for
> that task. Shouldn't we then be looking into ways to remove the boost
> dependency?
>
>
>  However, I don't see avoiding use of the math
>> library as that compelling because it is around virtually everywhere.
>>
>
> Except the issue is that people who are not aware of the sine() dependency
> would first try to link their app with -llibcdio only and get unresolved
> symbols. Sure if the unresolved mention "sine", it's easy to figure out
> -lmath is needed. But for "SendCommand" and "GetErrorString", it may not be
> so explicit...
>
>
>  And
>> the sine function in our library should have to have a test for it just as
>> I assume there is a test of sine in the math library distribution.
>>
>
> Except we are not duplicating the function call at all, just accessing the
> one from the DLL directly instead of going through an import lib.
>
>
>  I could factorize it if you think that's needed.
>>>
>>
>> I'd appreciate it. Putting as much of the mostly MSWindows-specific code as
>> possible inside the MSWindows directory would be great. Once in that
>> directory, I am less concerned with how complicated or ugly it is.
>>
>
> Well, the thing is, the code we are talking about is _already_ to be found
> in MSWindows-specific parts of the libcdio source: win32.c and aspi32.c.
>
> Are you saying that we should remove Windows-specific code from
> Windows-specific sources and move it into a different Windows-specific
> directory, in case someone looking at the original Windows-specific source
> is not familiar with another Windows-specific section of code we added?
>
> Trying to factorize makes sense, so I'll look into it, but moving the
> factorized code outside of the Windows driver directory doesn't, especially
> as it has little to do with MSVC: MinGW and cygwin developers will benefit
> from not having to link their apps against winmm.lib too, so it's not an
> MSVC only improvement.
>
>
>  I think the benefit of having libcdio apps compiled with just -llibcdio
>>> rather than -llibcdio -lwinmm is worth the trouble.
>>>
>>
>> Aren't windows apps, especially those compiled with MSVC generally
>> available as downloadable binaries? If so, I don't understand why you
>> think
>> it is worth the trouble. But it's okay that I don't understand. I accept
>> that different people will have different views and aesthetics, and
>> tolerance is a good thing.
>>
>
> See my first paragraph. A good library is one that saves its users from
> wasting time (even if said users are supposed to read the documentation to
> figure out that -lwinmm is needed besides -llibcdio when compiling with
> MinGW, cygwin or MSVC). I've used this code over and over and it really was
> a 15-minute job to add it to libcdio. Therefore I thought I might as well
> spare developers of libcdio-based apps from wasting time on additional
> library dependencies.
>
>
>  Where I was coming from is someone later on who wants to contribute or
>> understand or modify the code and sees what apparently is boilerplate code
>> that normally is in a library. (But they might not be aware of that).
>>
>
> If they are modifying the win32.c or aspi32.c sources, which are
> exceedingly Windows-specific and require some knowledge of Windows
> programming, I beg to differ. In fact, I'm going to be worried if they try
> to modify the code we have there yet can't understand elementary code that
> avoids having to go through an import lib to use a DLL, as this is fairly
> common.
>
>
>  Would you force all Linux developers wanting to recompile libcdio to go and
>>> download an extra utility if the situation was reversed?
>>>
>>
>> First, my understanding of how Windows users work is they expect to
>> download executables and not build software from source -- that's why
>> Windows generally doesn't have things like "sed" or "grep", and why the
>> "bat" command language is a bit lacking.
>>
>
> Except, we are talking about app developers who, in the absence of binaries
> being provided (which becomes a non-trivial matter if you're going to
> provide a DLL that can be used with both MSVC and MinGW), will attempt to
> recompile the library. So they already have a development environment
> installed.
>
>
>  Second, to some extent this *is*
>> the situation in building libcdio. To build the code from source, you have
>> to have "gcc" and "make" installed; these are generally listed under a
>> "developers tools" category.
>>
>
> Yes, and to build libcdio with MSVC you need to have Visual Studio
> installed. The difference being that once you're past this stage, on a
> POSIX environment, tools like sed, grep, etc. are expected to be available
> by default. For Windows/Visual Studio, they still wouldn't.
>
>
>  One can perfectly get by with Linux systems
>> that don't have "make" or "gcc" installed. Instead using package managers
>> to install code. Many live CD's and hardened servers work this way.  I
>> imagine most Linux users install libcdio from a distribution that provides
>> pre-compiled binaries in rpm or deb format.
>>
>
> The same line has been advocated by the libusb maintainers for pretty much
> the same reason: "All the non-autotools integrable modifications MSVC
> requires for compilation are a bother, whereas Windows developers are
> expected to use mostly pre-built binaries and not recompile from source, so
> we should be able to afford to request them to download sed, grep, or
> something else".
>
> Of course, little did the libusb maintainers know that trying to dismiss
> recompilation from source on Windows would turn into an even worse
> nightmare. By promoting the binary-deliverable route, and unlike what is
> the case with recompilation from source, you are now faced with calling
> convention and DLL interchangeability matters, and these have a very
> visible and dramatic impact on all existing headers.
>
> You may not realize it, but I'm actually trying to do you a favour by
> avoiding a repeat of what we've seen on libusb, and the way to do that is
> by not placing the onus on binary deliverables. This is because I can
> guarantee that, if you don't like what you see so far, you sure aren't
> gonna like what would happen if we go with binary deliverables as the
> preferred delivery method for Windows. That is because then you cannot
> ignore the requirement to set a calling convention for ALL public libcdio
> calls (=> all calls will have to be prefixed by a macro, rather than just
> the global variables) and other matters.
>
> Therefore, unless you actually want to provide binary libcdio deliverables
> for Windows, recompilation from source has to be considered as the approach
> we want to promote on Windows, and with MSVC being the major development
> environment there, we may want to tread carefully with extra requirements
> besides it.
>
>
>  The only acceptable way I see, outside of providing a sed.exe in the
>>> libcdio repo, is if we recompile and run our own simple search/replace
>>> parser in MSVC as a pre-build process, which can be done, but this adds
>>> to
>>> complexity and maintenance.
>>>
>>
>> That's actually along the lines I was thinking.  And this search/replacer
>> in MSVC could be used across many projects of an autotools ilk. (I write
>> "autotools ilk" because I am sure there are projects which could also use
>> this kind of search replace that don't use "autoconf" or "automake".)
>>
>
> Then, it looks like this will be the way to go, as forcing all MSVC users
> to download sed just to create a version.h is not something that I consider
> acceptable.
>
> I'll see what I can do, but this will add extra complexity and
> maintenance...
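>
> For an idea of the scale involved, a bare-bones replacer (one token replaced
> per line, minimal error handling, names illustrative) could look something
> like this:
>
>     /* sketch: copy infile to outfile, replacing the first occurrence of
>      * token (e.g. "@VERSION@") on each line with value */
>     #include <stdio.h>
>     #include <string.h>
>
>     int main(int argc, char *argv[])
>     {
>         char line[1024];
>         FILE *in, *out;
>
>         if (argc != 5) {
>             fprintf(stderr, "usage: %s infile outfile token value\n", argv[0]);
>             return 1;
>         }
>         in = fopen(argv[1], "r");
>         out = fopen(argv[2], "w");
>         if (in == NULL || out == NULL)
>             return 1;
>
>         while (fgets(line, sizeof(line), in)) {
>             char *p = strstr(line, argv[3]);
>             if (p != NULL) {
>                 fwrite(line, 1, (size_t)(p - line), out); /* text before token */
>                 fputs(argv[4], out);                      /* replacement value */
>                 fputs(p + strlen(argv[3]), out);          /* rest of the line  */
>             } else {
>                 fputs(line, out);
>             }
>         }
>         fclose(in);
>         fclose(out);
>         return 0;
>     }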
>
>  The way I do it is, instead of having the maintainer manually edit the
>>> version in configure.ac, I have a chver.sh that I run from a shell
>>> script
>>>
>>
>> Hmmm. In theory I'm not against this approach, but it means deviating from
>> the already arcane autotools process, so it is likely not to be followed.
>>
>> However, what we could do is add a "make" target to create the MSVC
>> version.h file in the MSWindows directory. Also, we can arrange that making
>> a distribution tarball includes the MSWindows version.h.
>>
>
> And now I realize that the current libcdio tarballs don't have a version.h,
> which I can't help but find strange.
> Most of the library tarballs I know tend to have a ready-generated version.h
> in their headers. Otherwise I think you are asking for trouble (and, as you
> highlight below, trouble is apparently what you got because of it).
>
> Now it seems to me that the only reason libcdio doesn't generate its
> version.h from configure.ac and autoconf is because it wants to have the
> platform info in there as well. IMO, the platform and the version should
> really be broken down into separate entities, which may actually solve some
> of the problems we are facing.
>
> Having a version.h that contains only the version numbers, and then a
> platform.h that would be generated by running configure (from a
> platform.h.in) or picked directly from a preexisting platform.h in the MSVC
> dir, and used along with the version number from version.h to concatenate a
> string with both version and platform, sounds like a much better approach
> to me. The tarball would then include a ready-made version.h, an
> MSVC/platform.h and an include/platform.h.in.
> Actually, I could use some improvements to the version.h file for the DLL
> versioning, as the major, minor and micro are not provided independently in
> the current version.h.
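>
> To illustrate what I mean (macro names and values below are purely
> illustrative, not a naming proposal):
>
>     /* include/cdio/version.h -- identical on all platforms, shipped in
>      * the tarball, updated by a release script */
>     #define LIBCDIO_VERSION_MAJOR   0
>     #define LIBCDIO_VERSION_MINOR   90
>     #define LIBCDIO_VERSION_MICRO   0
>     #define LIBCDIO_VERSION_STRING  "0.90.0"
>
>     /* include/cdio/platform.h -- generated from platform.h.in by
>      * configure, or taken from a pre-made MSVC/platform.h on Windows */
>     #define LIBCDIO_PLATFORM "i686-pc-mingw32"   /* illustrative value */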
>
>
>  But a problem with all of these approaches where something is done on a
>> system other than the one you build on (i.e. not done as a MSVC pre-build
>> process) is that we can't record the architecture (e.g. Intel x86) and
>> 32/64-bitness of the OS.
>>
>
> There may be ways around that, depending on where exactly you need the
> platform specifics, but let us focus on the current issue for the time
> being.
>
>
>  Do you have evidence that there exists cases where autotools will fail to
>>> overwrite an existing file from a .in?
>>>
>>>
>> Actually, yes. I have gotten so many incredibly stupid questions about
>> software I've written that I do not want to entice people to do something
>> stupid. Having a wrong version.h file around is such an enticement. People
>> who don't have autotools working, or working properly, have on more than one
>> occasion started modifying Makefile.in or copied files from elsewhere.
>>
>
> Well, sorry, but this is still not evidence that the presence of a
> version.h prevented autotools from generating one, which is the question I
> asked.
>
> Whether or not a version.h exists, if autotools is broken, people will
> have a problem regardless, and, as I stated above, your choice of not
> providing a version.h in the tarballs, because you also want it to contain
> the platform info, seems to be the root of your problem.
>
> If we go with a breakdown of platform.h/version.h, unless there is
> evidence that a working autotools is not going to overwrite version.h from
> version.h.in, I still fail to see an issue.
>
> Or are you saying that we shouldn't spend 15 minutes to support people who
> may ignore the requirement for -lwinmm.lib for one reason or another, but
> go out of our way to support people with a broken autotools (which is
> something that I expect to take a lot more than 15 minutes to properly
> implement, especially as we don't have tried-and-trusted code to reuse
> there)?
>
>
>  I
>> don't want to encourage this, so if it means they have to copy  one more
>> file it might keep them from doing it. Having a MSWindows/version.h file
>> around is already a temptation.
>>
>
> I still don't see a good argument against having both a version.h and a
> version.h.in in include, especially if we move the platform data into
> another header. In fact, given your experience, the logical thing to do
> would be to drop version.h.in altogether and always provide a single
> version.h, which would be updated, along with configure.ac, from a script,
> with the platform data moved into a separate header. This way, you can at
> least be confident that even if someone has a broken autotools, their
> version will always be the right one and you will get accurate version
> reports.
>
>
>  That can be done (with the caveat that there's no guarantee that Visual
>>> Studio 2012 will not break the move process we would use then),
>>>
>>
>> I don't understand. Visual Studio 2012 might not have a pre-build hook?
>>
>
> The problem with VS is that Microsoft has proven, release after release,
> that it doesn't give a damn about preserving compatibility: every version
> of Visual Studio's project files has been incompatible with the last.
> Sure, you are generally able to upgrade, but there is no guarantee that the
> upgrade will work, and Microsoft certainly sees it as a one-way process:
> upgrade only - no downgrade ever.
>
> As a matter of fact, the MSVC project files from the -pbatard branch will
> only work with VS2010 or later. If you use VS2008 or VS2005, you'll have to
> figure out a way to convert them or create your own project files from
> scratch.
>
> Given this, you'd understand that I'm a bit nervous about relying on
> processes that Microsoft may break from one release of VS to the next
> without warning. I've seen a few older project files that don't convert too
> well with newer versions of VS, which is why I am cautious.
>
>
>  but I really only see that as necessary if we have actual reasons to
>>> believe that an existing version.h would not be overwritten in some
>>> cases.
>>> The simpler we can keep our versioning the better, and making it more
>>> complex to tackle an issue we have no evidence of being an actual problem
>>> doesn't strike me as a good investment.
>>>
>>
>> Again, it is a problem that I've encountered.
>>
>
> Not really, unless you failed to disclose additional details. The problem
> I mentioned was about autotools failing to overwrite a version.h if one
> already exists, not about autotools failing to generate a version.h when
> one doesn't already exist. This is a completely different issue, and
> unrelated to the solution I advocated of having both a version.h for MSVC
> and a version.h.in for the other platforms. Also, even in your case, the
> worst we would get is people reporting the _proper_ version but the wrong
> platform, which would be exceedingly obvious, since, if they tried to use
> autotools, they wouldn't be using MSVC, and if they used MSVC, the version.h
> would be fine. So I actually see doing that as a major improvement in
> alleviating the problems you faced in the past...
>
>
>  So in these cases, where updating version.h isn't automatic but a manual,
>> conscious thing for a human to remember, it just strikes me as more
>> reasonable to have the platform maintainer be the one to make the change.
>>
>
> Except, platform maintainers don't release tarballs. You do. And the
> version should very much be part of the source tarball, with a single
> version.h, as is the case with most libraries.
>
>
>  As a by-product, it means that that version has been looked at by
>> someone who knows that platform. As happens, platform maintainers
>> disappear, and so the version number would be the last known good version
>> that works on that platform.
>>
>
> I fail to see how you can make that work, unless you are going to ask each
> platform maintainer to release their own tarball, which I would see as
> asking for more trouble.
>
>
>  That said, sure, a file with a matrix of libcdio versions and the OSes they
>> work on would be better and more pervasive. But I don't want to spend the
>> time and effort to maintain that. If there are volunteers -- go for it!
>>
>
> I think the better approach is to assume that, in the absence of recent
> testing, changes that have been introduced haven't broken anything, and
> rely on users for that platform to report if they did.
>
>
>  The declspec has to be part of the public libcdio interface if you want to
>>> use the library as a DLL. So, unless we use a separate include directory
>>> for MSVC,
>>>
>>
>> This is the direction I'd like to explore.
>>
>
> In all this duplication, I see more potential for breakage, which is exactly
> what you should be trying to avoid. And again, the root of the matter is
> _really_ to keep POSIX people from seeing an MS-related macro that doesn't
> do a thing on their platform... How far do we really need to go to keep
> POSIX people from seeing something that may remind them that a
> cross-platform library may have to concern itself with something other than
> POSIX matters?
>
>
>  I want to make sure I understand this correctly. Suppose this were done:
>>
>> In file include/cdio/disc.h:
>>
>>   extern const char *discmode2str[]; /* Actually, no change to this file.
>> */
>>
>> and in file include/cdio/mswin.h:
>>
>>   LIBCDIO_DLL_DECLSPEC extern const char *discmode2str[];
>>
>> And in cdio/cdio.h a conditional include of cdio/mswin.h
>>
>
> Well, unless you place #ifndef _MSC_VER around your disc.h extern const,
> you will get massive amounts of warnings in MSVC. And to avoid these
> warnings while still avoiding the macro, it'll cost 2 added lines of
> #ifdefs for each extern.
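>
> That is, each affected declaration in disc.h would end up looking something
> like:
>
>     #ifndef _MSC_VER
>     extern const char *discmode2str[];
>     #endif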
>
> And of course, since the code IS going to be modified by people who aren't
> Windows-aware, someday they're going to apply what they see as an innocuous
> change in disc.h, such as switching char to wchar_t or, more likely, add
> another variable that they won't have any clue needs to have a declspec
> qualifier (why would they, when everything has been done to hide the fact
> from them), and lo and behold, Windows is broken. Whereas, if we go with
> the DECLSPEC macro approach, libcdio developers with their eyes open will
> probably reckon that if most of the externs they see have such a macro,
> they probably should use one when introducing a new global variable.
>
>
>  Will this cause a compilation error?
>>
>
> Not an error, but about 30 warnings just for the discmode2str.
>
> 2>D:\libcdio\include\cdio/disc.h(59): error C2370: 'discmode2str' :
> redefinition; different storage class
> 2>          D:\libcdio\include\cdio/disc.h(58) : see declaration of
> 'discmode2str'
>
> At the very least, you will get more than 100 warnings if we go this route
> when recompiling in MSVC.
>
> Also, you have to make sure you know which of the first or second
> declaration the compiler will pick (I'd expect the first, but it's
> Microsoft, so I wouldn't be surprised if they don't pick the logical
> choice).
>
> The only safe option here is to add #ifndef _MSC_VER around each extern...
> all for the sake of not seeing a blank macro.
>
> Please realize that I didn't introduce the changes I would like you to
> apply simply because they were the easiest/fastest way for me. This
> genuinely is the approach that I see as least likely to cause issues in the
> long run, and to reduce the amount of maintenance required for Windows.
> Otherwise, I'd probably be pushing for the introduction of a calling
> convention as well as support for MinGW/MSVC DLL interchangeability, since
> these are problems we solved for libusb. But those changes would be a lot
> more intrusive.
>
>
>  There are already going to be incompatible changes. One could add the
>> getters and setters, and for Windows that would help. So this is a direction
>> I am not against. Over time we could deprecate access to global variables.
>>
>
> If we introduce getters and setters, I'd say we need to either remove or
> deprecate direct access immediately. Else we'll incur portability issues. I
> actually wouldn't mind seeing a 1.0 version that removes direct access to
> all global variables.
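>
> As a rough sketch of what that could look like for discmode2str (the getter
> name is only an illustration):
>
>     /* library side: the table itself is no longer part of the public ABI */
>     static const char *discmode2str[] = {
>         /* ... existing entries, unchanged ... */
>         "placeholder entry"
>     };
>
>     /* public getter, exported like any ordinary function, so consumers no
>      * longer need __declspec(dllimport) the way they do for exported data;
>      * the name is illustrative */
>     const char *cdio_get_discmode_str(int discmode)
>     {
>         return discmode2str[discmode];
>     }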
>
>
>  I'd like to hear how others feel on this.
>>
>
> Same here. And I too would like to make sure we explored all
> possibilities. We faced similar problems with libusb, so we explored quite
> a few solutions there (and as far as I'm aware, libusb is the only library
> out there that offers DLL interchangeability for MinGW and MSVC), but there
> may be more. One option that was often suggested to us to solve the issue
> of having to maintain two sets of project files for autotools and MSVC was
> to switch to CMake, since CMake can automate the generation of MSVC
> projects.
>
> Regards,
>
> /Pete
>
>

