
From: Pete Batard
Subject: Re: [Libcdio-devel] Pete Batard's Changes (MinGW, MSVC, UDF, Joliet, header ...) merged in. Leon's CD-Text changes coming
Date: Thu, 15 Mar 2012 15:40:40 +0000
User-agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:10.0.2) Gecko/20120216 Thunderbird/10.0.2

On 2012.03.15 12:17, Rocky Bernstein wrote:
There is an assertion that if it were just that for one little file,
version.h, life would be so simple on MSVC in building from git. This may
be true, but only from a very limited point of view. And one I don't want
to promote.

Then you are probably going to alienate MSVC users.

First, there's the fact that git *itself* is an added tool. But if you want
to build manual pages for programs like cd-info, you will need "help2man";
for the documentation that can be read in Emacs, "makeinfo" is needed.  At
one point I considered redoing CUE parsing using "Bison"; there is a
project already that does this that we might use.

Which is fine when you run POSIX-like environments, because they integrate very well with your existing development toolchain.
But MSVC != POSIX.

It is also possible in
the future we will handle internationalization using "gettext".

For what it's worth, we tried that in libusb, and had to give it up after finding out that, like many other POSIX tools, gettext didn't play too well with MSVC.

So although if one looks at things only from the narrow view of developing
from git sources on MSVC as libcdio is today -- excluding documentation and
other niceties (or I would argue necessities) such as the ability to run
the built-in checks or PDF documentation -- then, yes, having to deal with
version.h is currently a problem. In addition to the automated ways
suggested previously in this thread to solve this specific problem for MSVC, I do not
see a great need to use the git repository.

A bug gets fixed in the git repo, or a new feature is added, and the next tarball release is whenever => the developer wants to use git.

Also, if a developer finds a bug, they may want to feed it back => better be in a position to do so right from the start. If I start picking up issues with software that I reuse, even if I originally picked a tarball, then I'm probably going to switch to using the git version so that I can:
1. ensure that I'm not missing something that is only fixed in git
2. feed back patches if I address problems.

This is another discussion we've had in libusb. There again, the maintainers attempted a similar justification: they saw something that was expected to be an annoyance in the git repo as fair game not to include, especially if it was Windows related, and considered it OK to assume that none of the people the git annoyance was targeted at helping would be using git in the first place.

Personally, I would say that if you provide two public methods of accessing the source, you must expect that people will use these methods indiscriminately. But you're the maintainer of the project, so if you want to assert your right to decide how people should access the source, so be it.

As a side note, as far as libusb is concerned, the only choice for MSVC users who want to build from source is still git, because our last tarball was released about 2 years ago and doesn't include any Windows support.

Compare working from the git repository with working with a tarball or a
zip file. (We should provide a zip file to make it easier for MSVC users).

Nah, 7-zip is well established and free, so a tar.gz should do fine. I'd only provide a .zip if we got actual requests.

So the autotools portion is only a small portion of handling a larger and
more complete problem which already isn't handled very well. Again, how
exactly on MSVC do you expect end users to run the current test
suite if they aren't going to install additional tools?

I don't expect MSVC users to run any libcdio tests. Ever.

Why? Because it'll be way too time-consuming to implement, and I doubt you will ever find someone who wants to do it. I know I won't.

Let's look at our solutions here:

1. Create a myriad of manually-maintained MSVC solution files (one for each test executable), make sure they are generated in a location where autotools will expect them, and somehow modify the very gcc/make-specific testing scripts to accept that executables they know nothing about are fine to use. I'm not sure that installing MSYS and related dependencies for just that scenario is going to be straightforward. Oh, and there's a good chance that we'll run into the issues that made the MinGW guys encapsulate the binary executable along with a shell script in their .exe's, to address things like translation from the shell and other matters. Add shared library testing (DLL) and I anticipate more potential for breakage. And of course, anything we add for MSVC will have to be cross-tested to make sure it doesn't impact the existing POSIX tests. At best, I see a one man-month project, pretty much full time. One may be able to reduce the time investment by adding a requirement of first running the tests in MinGW/msys, since you'll be sure that the dependencies are met, and then replacing the exes with the MSVC ones, but it's not going to be a walk in the park for anyone.

2. Use the WDK and batch scripts (since the WDK provides the same compilation tools as MSVC, it should generate very similar executables). At least with this method we could get both compilation and testing from the same script. But we would have to duplicate all the test logic into batch files, and ensure that these are kept in sync. Not gonna happen, especially as the next release of the WDK will not include compilation tools, but will require Visual Studio, so anything we do today would be useless for upcoming WDK versions.

3. Pure MSVC tests, with some MSVC/Visual Studio testing logic (or some external testing tool). Same caveats as above with regards to having to maintain the test logic in two separate locations that must manually be kept in sync.

Unless you can think of something else to easily duplicate testing for MSVC, testing of libcdio for MSVC is not going to happen. The closest we will have is MinGW testing, with the hope that the binaries will not deviate too much.

And hand hacking the MSVC project files for the libcdio code, as it
currently exists, is already prone to future breakage. I don't know to what
extent you consulted the MSVC project files that were provided by XBOX, but
the commit message where these old MSVC project files were removed seemed
to give me the impression that they were largely useless.

Yes they were. Because they failed to build anything at all (well, actually, the project files wouldn't convert for VS2010, but many files referenced in the project either no longer existed or had been moved, so it'd never have built anyway).

As long as we have project files that can build the executables or libraries they were designed for, so that these executables can be used and tested with applications, that's fine with me. Unless you want to spend large amounts of time duplicating items that are automated on POSIX, a build that works, plus reliance on MSVC users to report issues, is probably as far as you'll get.

If we were to
split out CD-Text to a separate library, the MSVC project files would need
hand modification. Right? My point here isn't about how to make work for
MSVC developers, but rather that a more automatic solution is ultimately
needed, especially if you want to make it easy to work from git sources on
MSVC.

Then the only choice is switching libcdio to CMake and dropping autotools. Not just for Windows, but for all platforms. If you consider that the project shouldn't maintain two sets of files for MSVC, that's the only option I see that can automate project file generation and prevent manual duplication.

And again, these are the same discussions we've had on libusb, with the outcome that making everybody ditch autotools because of MSVC was not acceptable. Oh, and for the record, even CMake requires the end user to have MSVC installed to generate the project files, presumably because crafting a full-fledged MSVC project file from a list of sources is not as straightforward as one would think.
I also have no idea how testing works in CMake.

Another area of disagreement is how important and pervasive it will be to
build from git as opposed to from a tarball.

Yes, and I've already answered that. Outside of polling MSVC developers directly with regards to their intention, I can only challenge the idea that we can predict with any degree of certainty that MSVC users should not need git access.

If binaries were distributed, that's what most would prefer

Likely, but I'm not sure. That depends on how often the binaries are generated; if it's once every release, that may not be enough. And you may find that MinGW people will want to use binaries as well, which means that they need to be interchangeable. There's also the matter of setting the calling convention, which libcdio would now have to do (it will require one macro before each public call in the headers), and of providing both 32- and 64-bit builds.

And, once again, I've been doing that for libusb (since Windows users cannot access a source that's relevant for them outside of git), and after having seen the outcome, I see it as much preferable to encourage Windows developers to build from source, because it will be far less troublesome to support, as well as avoiding additional header modifications.

Failing that, I think most would prefer a tarball or zip file. But
someone serious about developing from the bleeding-edge git should be
prepared to install additional tools as necessary. Again, this is not
limited to a single platform or libcdio.

Provided the tools integrate well, and that we don't have to write a bunch of additional scripts to make said tools work for MSVC.

If you were to require MSVC people to install sed when using git, in order to generate a version.h from version.h.in and configure.ac, you cannot avoid having to maintain extra logic to duplicate what autotools does.
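For the record, here's roughly the kind of extra logic that such a sed-based step would have to maintain. This is a rough sketch with made-up stand-ins: the AC_INIT line and the @VERSION@ placeholder below are assumptions for illustration, not necessarily what libcdio's actual configure.ac and version.h.in look like.

```shell
# Sketch only: stand-in files, not libcdio's actual layout.
cd "$(mktemp -d)"

# Stand-in for the relevant line of configure.ac:
cat > configure.ac <<'EOF'
AC_INIT([libcdio], [0.83])
EOF

# Stand-in for version.h.in:
cat > version.h.in <<'EOF'
#define CDIO_VERSION "@VERSION@"
EOF

# Duplicate by hand what autotools already does: extract the version
# from AC_INIT, then substitute it into the template.
VERSION=$(sed -n 's/^AC_INIT(\[[^]]*\], *\[\([^]]*\)\].*/\1/p' configure.ac)
sed "s/@VERSION@/$VERSION/" version.h.in > version.h

cat version.h   # -> #define CDIO_VERSION "0.83"
```

And that's the point: this script is a second copy of knowledge autotools already has, so every change to how the version is declared in configure.ac risks silently breaking it.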

That is unless you want to make it a requirement for MSVC people to install autotools. Unlike what is the case for POSIX, I'm not aware of a standalone autotools package that you can just extract and get running. For Windows, you'll probably need MSYS, which also requires a bit of configuration (though that has improved a bit). And of course you have to document the whole thing, test it, deal with users who fail to get it working...

Avoidable complexity

Finally we come to the issue of version.h. It is bad practice to add into
version control any derived or automatically-generated file.

Yes, I agree. I'd like to avoid it. But there also exist situations where following rules blindly creates more problems, and this is one of them.

No one checks
into version control the autotools-derived files "configure" or
"Makefile.in".

I do [1] (and I've been doing it for some time).
Granted, I'm doing it because I only target one platform, and because github does provide ready-made tarballs for each tagged release, which avoids having to upload a separate tar, but I'm sure you'll find other projects that do this. I think I have actually seen a few.

But these are universally distributed in autotools-derived
tarballs.

Yes, and therefore I fail to see why it shouldn't be done in git as well.

Worse is to check in a derived file that is sometimes wrong. That
is why "Makefile" is never in version control, even though one can argue
that all Makefiles should be reliably overwritten when configure is run.

No, what one will argue is that the de facto build sequence for most configure-based projects (at least from official tarballs) is configure then make, with configure filling in options that can only be determined on the user's platform. That is completely different from a version.h, which is not platform dependent, and whose exact content we know precisely at any one time, even before configure or autoconf is run.

You are comparing apples and oranges.

When you check in something that is derived, a couple of things happen. First,
there are extra commits just to keep the derived thing up to date. Second,
and worse, is that it is possible to inadvertently check in an erroneous
file.

Yes, and this is what I asked you about, as this is the only question that matters in our case. The question I asked was whether you thought that, if we went with version.h + version.h.in in git, after changing the version in configure.ac, you would get into situations where the files would be out of sync. If you say you will, that's enough justification for me to try to work out something else. But that is _still_ the only reason I see for not going this way.

I have had problems from users precisely when I had derived files checked
into version control that were incorrect for the user's environment.

As I stated above, there are differences with autogenerated files. Given that you are talking about environmental matters, whereas versioning has nothing to do with the environment, I cannot help but remain not entirely convinced that what you experienced applies to the current situation, even more so as you haven't clarified which files were involved then.

So just take that as a fact and please don't ask about it again.

OK. I'm still not convinced, but sure, if you want to consider the matter closed, let's consider it closed. But since I still want to help MSVC users build libcdio from git, without incurring extra requirements, I'll add a pre-build step to generate version.h in MSVC and we'll take it from there.

Regards,

/Pete

[1] https://github.com/pbatard/rufus


